var/home/core/zuul-output/
var/home/core/zuul-output/logs/
var/home/core/zuul-output/logs/kubelet.log.gz
[gzip-compressed kubelet.log; binary archive contents not recoverable as text]
?.`!AF[HĊu(%axT-]E:#- gRICui,>38aW\v9,a`\G\|УKlQFWfBl)؀JKI'g"y^i@8q1pqgC͎MC ®20/IyuzQ?kJo^lrFɒYEF\яdzqt}T}4txBY-89''uZě Q=* Q{S졉ka_=4iwCp, =ha@Py^T[(u} 脺`U :BjZh`r@1| ڝ2ON.K7{^}e u?D/6m>hNZvzz\/SIO귏mC|t@AвltUwUб^.Y2I/XYz.CX_\"+tIRE5!(І )c-sL/q/ tۧyt;s9%NY_fm%bݍ뻒z>lXǽ9=rsJٽq7qh"I9xsE\!W!iy'x8Ds9xo;*u)&)*A-xN47^ʻ{?~;K.:u^9AKieH~HŇ'"AzgÓ FC )F&a9lv)䍅[R҂|Ҡ$HuʪhacH :"kq4%PKc;94 L$H0!3 orLm9+p>xO ;?=5N;X!ѣ?*|ܜ2ђ@ GA!,z(I6QyQ9{{ {{o-1͎泩%ꜽS[gmT)DSްsu+Χ;6[n [Wo=\k!mz?֋&]wz!?[!;if tM3т#}wQ|q2:/Q)lU$9wn88|s@LS*q41K|RJM^CmݩG?&L/(>ȍM7Z%ݻrz }=he:Aŏ 4`٦cB!Um [>wU&f$Cn3'? }Ujb^MĬKʮ6岡Ծho4{0ݲzH֯??v_Bo:orӗo`>!Y嗓lF]cua⇘&jvR~-^_p>ܜ׆HHޜ,Q.}{sTew9Ӌ+1:"X#,Ҭ&ctֽw{ 浙SLNϟ?/ް4rq '{TO~h9/]h>^/?V #G,a+=#hzP"iOh  ar|P,Ok1R|.;tD OgtJQ&旎YTI(%'IW  Gv<,;|ͮ_1DϯbT]_ "O2m7^ EZβ۴|*?AsXHxq?-b(~6CI`bYy^\, Xqqּs"U?Q$ !tU|-vWTD&}9=:l>-`[g.|qO[OSru0ic̶pmTAAp2hܸQ~ӚMw DWܙwaCX۰2Q]z|³=VKw&CM1rL 1s W"$R )Kb5"IN,I3Td  ݓ3tr;l/cc! Gq g3A&) r< *ˢxZNì GefKq_qwZ:&Nw=eFpuW qQJa0 F'e}}!3 >DK-Y ~AS%rx%0;@ GЏ$cVގրlyь8I$"UF.2qk  # *=N()ya`"ZdРmPV# $Ri9I1d6Bstyލ6\[ ҏ@Xz"JXS6[zң04.ҭ_$rҢؽhOOvQŴu&hŐڧ+"֒&Vmf=c$9;FflbH,њ,XaƎŃH<_gS7LcP4TJFD%cHdDԈ HTLG#*!lhYβUr}-Cm FM_@6By@B@3#8@BW LLS%I0)*( hFJ[AE*@H4K)RPȹrO:Ig{)V:DgbPJIk5yf a@8'(>(,cĖ[{8F7o7Za& NrHPNTjƩ^q%H!i8>H[62DOEȶ‡3 _; gjq{I3T8ϩd" \0."1a ģ3ƺA:qȶA2KiZX?m U.K+H@iK&TDJ;i&l"`]IS,6 z;eHNG74*nk>xQGu~IjQL_# E{8M.Eh:z֟{J-6/8/_<pP 9CaeQ”Z/e=Tel<,7p:.$+Si=JBJַL w-އ{*^+֊v TCʾ(Z?l?NT{W|2ʷ(b擪 h{uI\FwT7.Z''EcףOqmh]f}|Lr>u%NJt#-Lѷ|9GG%3|ȗ[sRl2Z -΂/BAj5HFA+¥& jS&aJ;f2Rʃp @X/5^#r;ѹDNӎfnFk;usDqqVhc/ KM)ifE!HI mbÅAw($-Hw$JitL*"Q* )贒K`@f\>|| GI{gO`4[uOϣŬO5'ㄫg7U5%{>Xg~aņe$k OarKaiIns쬹tٶ[9bx1cX3 ͱk6ݷ, }N]Y` /%fe^TR|||KQ,#Iw^2N7t^ܚ`s3~L.!Aیa z1r0aV/L&>mƹJzKP+eN=ݺ!}E+/hd!iIVPbT{І|<fz8M[-gk&S#Ҫhڅ ,)RPbXW@kqEt+QBnF v^3^vn$AJ[ީ֘߮džRZbi^X{u ;RZ`G!=x'1JA&}GXۿ{X*=0p6YJct 1;O, f\ cb,;L{U3'"53}W:2WULJy\=v?zVv%秳!JcF1K)`)$d}Via~u2y?$ܘaʆ#6Md!K,[%@m{ݻgF~8)pBI3aRn#Ai<:4jN &pR`^iGjC6^ wE%KRU6 i,< $pg.\E.]#*aҠ>~-bvvfu0bJ,뛅H>HԔ1}o FрG!eLDz{b vjuMŒ5f*[wppM]^V>e,=phŸQ,_9Cieo^=hO?~1̐Y`.6^'t̪cˏ^ &uD%h EeZL0&>}jWݝM=E/,6׺KU{ur_bݰ8܆"!gaX̠]Z4q>T;GEqidd__71|7}Oa`5͎f7ݿIc:%[K X`i=%E8`Y Iɡf2U?N2TQ8% [Yjja<6&U0ϲUn3C{tʄr #\I R(∨th)BIÐg,MƑr!`1|G EmՑaYqGkZ јl  ]^t=~M/m uyJnw-| SRW Ú {5Ȟ@"jߑ=LDLu%oV2U:\α3GeEe@LҐy"s;4-fӽK.jvpJ^FE= E* 6F()6`Yb Ql=N;[B7zf!% 3Bs4fvAeB2ƅ\)W]}sپwi]HuG~=kG< 5%p[#^?{-(|Jk J0ڞC5DIXʒ1*J+.UB?zs}*z{dʽ^}4GgI$zsBFs) ZEx'eiVUgz8_!cK#"2SؙySnEmţn^b'Vf2#/+s&d VSC>'2'Rr,.I|釁jk3qm ?$xhlfljO #HV@IθPYVRX9+gdM@Z6qJ8 ŌLp3)e C3a s-U{K"%)*/U.+&r)9,JbRʩ`SAM)) 'Ǻ fۥJ>'[ xI^J_Z#c3q#c;_6ӌ ,\IToSO}Eq̓??󗗓w33#|-Dl:E6wF^F욈$DJdBv +8< fD@r{ U ,JEDc[Gg?b8QsAfq,j7KՀڃK9eMQ"d`,>ؒK ekXDRR6Âd| #3d %0:ƚE$ĎcE`Iu67fÞ/T`<Dl"nz@wi|C$3Y@'R֐ *X`B m۲JQB17ZBFmA3!h!S&it12V`7율[Eo&~D|Kna\LKEh0​;w`Ee1(lVhÂ(Rd*o )oV&>1jSwҎc!'UÖ̾CYF>_s8(Q)E}0:.tҸ}dܽq AA09bƘ2v9%S0/4 R]<0Hya !Bь<+{RCJ=ʾ}'>PEm&ŖX%,Q_٢x^]q߾cXi>nSw~AtOrW5,f }DiEtJND2|-3Nb 6xGIIyoCFJ``tQ5Z0lT%P߂7YOBt2.讎=Z͜b0Y>pV4gU/o }7mr_fYE sŠr]"lYw3] >C񄇟{GQA(=F&9rZfJⴓb]&Ą68A(6*PdEVNHUIBM,*lN^\By^ߤ{d;sD y6O"]=9-éT6䛯i(j!C]m(f->Jir]}lD5ƚ}q*'_0Iýt^RmWbt89٘S5 X*(dBR>KaZ d(ͤ44=x~ak_vSzUwl ہHƂ|t E$ ]O.[t$w0+*H3]%cv <&O;\8ݳێ s{ҳܭ3]>s[z ʐ}TAX`hTq~,A1B7R*34XN>bqx4pUUJa/X&NbsÇqjw=}ooR;jKMS7{~}qYԢ̏S_9я+ ,LV&%Ͻ~Fh2 |ß;OwYؚԢ=/*9_f]\]_ݼ,>Q>fo~wq|zocGh:Ev5s^?O"[Lv_0@Y o~^L+|^ߡ\yۋKs;Zj[l#mc5+1m"yz˃OjtByCzS4~ԟ|!>n7%Fp>h"Oc ^_A \Wւ*l /jT,*&yY P0uoz [W^-ng!2B+s~{-׾5}wyuo&Y͑Ur[q_CnqO|ٛjq@qC-{9:*vyprGN8ߵYja t:?^mZor"{-Ozȝevqƫof%݁SNOxCm㮵pw~G*_';6uy9=Ё":5{0x:K t[qe!yɩhnZ㽔MsG?wΫ8L&]n\+_xk ćp}#g4`.qOޠhZEt'+4}p7?t\u|u˻w.5w"?_ n>?g'Ϸ)Ü׃vn]^j;>#h5ZAo^Z 魵nӸ\I=UZsZ2N=mf5t^%_GYeӬXOXLW <+Q=~*yw $S:q8m˫"M-z*CGt߰5oBr'BPm<3xjGM|̾?m`Y^P:4;tNj.n^fHjs1b+ @Q鬵 
Sзܿl+q`>B'P 0-4C]\7 TQ{*ZϓT:CJä:AQW3жbqǒUs0~9^w9e~_gb("B@F.d].ڹhZ|"Z gJ ?CL$-em^4yE[cF(m\0-zd!I.J]HS’Oȉti>8iω %W]{[]2Mou]Q%:VuT&LfݢՎͬ?v`nH҉uKz ұE4?ȹ{P>ʙi[Hs u5{jVlUf$i<-mQ k(W>+ܪ\P|(W>TrЂ3'`r,?m&OK639ߙx\`)[j~6bOv'==Ejv9@ 9K8psB SSr>zԃV|ο\mvvsPU)NM{Os~/8XҘu_>ѱKҘ@SjKe$O$<,.,ȜQX9T|’2VgTEِ1)2&1J'g-C*ElMÂL>gBA`9:Cp |"%ǒ钤/;2AQ㰠f w'dSLvfxs.ׯ4T#H h:ijZRX9+gdM숚jadDq1c?R*ʘ!0t$LaccCm&~Cͯk>E7^jT_}}nݼ۴aY]YiOvWrgUet>=WoTXm+:yg ^rqlϕjdTp^#E0* gLtDL HJ)"ރ~pO6EFV'jID6-CO5$K|2RHR" *Ͱf Xe`Jl׿Wf],^_ζ_^N"O J9bԑ(AFP4*4bD$!R"W_%.PɞOJhL L$(L7PE\D4uVG05y.L;EmՀڃKťtA%00{˚E<-EY|%SY'o},"mxX}$`̞fQgXqd/|HL16g?~6O5h|q.L?ED"KI yg`O)!"4Q'#U"f@[hldqZΰ|[PLH!ZȔIm̤ ;'BlVGwd8uN{ʹX\Ƹ.sG+&_QD^f6N;,"EfJ֐ƚhe\<.~g+81qxx[R:Ƚ59 I(7 ښ![=ǯ_^=o8^{<0KVtr=?5IȎO(zs#v=_":gvٍ<%"Y A ؇dy8:G-Zlfe3m`.VS"YU{Mueߍ~y;4Ŀ4(gVͼa3?ϻ5|M1!a!hF;n%beAcr I`g` )mԴ .(٤,,!YHCyzgO+}dӡѓK(0(!M|^(wNA>򇰰/_PYMX~YxVFCW|^{i $B) ɜ|tZP>Zhf_qS1Fde#2,P",X*eɠD@7@c'Dc<$c;gob1h^{Khsx>OwVjKqxd\эnNa;YȮx>kݖ&Ԛ\Tå nJlyLar}q"ƃ-g!ub}ըĪS`5 ,(_XBIw<^I^8:8^Cjt90 sF)kx_idȰЩ+T1M `e wX\4 .OLB&蕎,$EIBɱ,l>%7=%5|Ǧz rkF͗΃f?v%ce5ѳ>khߍ|wy\]XFv ot[R-Ƿ޼y}G $t«Ϡ\d\c֢Vk|zmx!3 o/梭%3xm޼_xYn8HQ۱%7on [i*d;'-SI_X1a܅$k &Lj2DT_Ιi0,qҚg aџck'oٟAN4 `>[vqC7+>gv~HeVm,LǩTpP5 Y5JHYEu5}3{16vRSLʠ=|<z۸"5kZ($D e,TJe Ќ1ɠ;Cz0aS)NOBYC4l(s[/k#&."vPf4]),.)#ZG\M//ʓ:1`ac["/%C`lͲJ,&:'=fT8~Р Ox,k؄|×2g@kBu+~̋v_&? 'jhR hFe"T9cBХST~l^9 yHe :=sdK*E30MYBϦ?1jYtX ၭ̉~i;x^rV6^Hʜ=tR> %=EJr*":1K#P g-QaL1NYJ:eRuQi`P9ó7ni@ue s|Z.bRo4|ǣ+W|'p^F͛»MG???'Bmcò)fo6IsɱJc~ח߯fysy?dN%J樲CmUD#zI7_^Ul5%O}1Ƴ>-z49h/B|(w0QM\nA_|~E+ "Q!$eFK0ޓTop9[jR B !uS@5F)" \,25V"{~n].t۸ ofWőj4bN~Y#_s$Ǒ5{[bUpIfo -ӛU(HݺC^4[Ag'kt Jƀ Ӡ  ǍX0(jKQ:Gi[ )%84" *'OzLLI\DTcPUF% z`P,^}l[gc8VaGG*?WRAS8eVgӸUmy~]ϭwo=+Z_5-I1Gmk"Z$ՎS:Rv:x;e;2VӺ^G!;=3x.#-)Vv0#U^3׃G+T.;+Wz4ÔV'^Zk}BU_嬟 U[AN$ IB^fbxd1zm>gE:9 *C `Q$KjlUxEI1H MaoY}qϧ_5jϛp>m˯o9-Nm4^=# g=h҇$k!Xh?RINASA5k_lY.B-WY_O?g2:KzqePԙuehJd&(omL6_a.OLV`) ⺏)D3?#sO2gt,XdSe$mrBF >@BM`u;P}oTO[b-Zl}>* ABQߚy "g {`╀A]wc KPb(%d*94rBv裣&y??( 1 %sSF#&-]q1Mc Tַ[\-+QUb3i.9^MP%On 7F 噇WH gʚ8_ғsRb[UQ%Q$iR{zą={g{{>swh}bvv2;J90mФ(I`dP-Sg4$}{vѤW^V#GٽIxj⓳=r% ֜WV#~M КOR^4C=дJXG hWt2 PHfX? lFH 0? 51#r9y}=Ju3{l\ ffZ>f[v]xfwΚ`ݛ_kضw?-~OǫMï؍SpFiv|n&S`?M`ӓt /{۝㏣%[7n{?'qc+ώ^ky'ܦYD_beJ]_M8 Sjo{iu9?^V W mvjҡO~]JZū#1:"X%~2k1BwZ<=w@̛r|ykLGxT>YEo_$+6_LV47@}|S#ꕞ\t[J k#b~]>*ƣbX9ۆW.ǎ xH:j@݊&!9]Q͉nZyaj~Y02*Iy@$vCH$T 8ǭJ*$T3,F(V:d ႉhAAY&D*J2#$ s&Fhҡm,kb3Wš<ʓWnv׼=W܁6Nm~tŃ)aGUL[mV }*I *2,"a- abUF%7v!r6@flbH,њ,XaƎŃH<_gS7LcP4TJFD%cHdDԈ HTL\6Pd4,gR*_luýQSm`P9L 8*x8_0T AjL/9EBO?2[M6`jTD)).i D+ L"-q` Ml{?->%3(R8tĠ 7Lk=RKLqNP+,S[{xbǭ=yk-ୖig ST0 Y8+NB)sVr" Rc"׽勈mP\YW/*|8yf;C!zJ&R9U"`4qD Sx1h!J,[ ^p}%.Yd@J]62$RI'^>`.N/c4<\zèU{֜~Z.apeQүZ} 'ǵicÁʬ9f&nҫkUŰg)yCwTח#r<߽zo7{OO@o,լ4p'379':18/")b$`AQCVǑJĈ28K5j $RZ:k紴{QVG2B ϫF9oyf owp)aֱr> s{q-# dRKJ1{ݲQܺFKA q gay9mT;ĬW逰 _jRsQ ҧF;R.*8,vP,%֭ ENOèu)4 h}Ҁ0ȕ{gzn~?%oÌg ӮGgfYD 'jMe &olj%=TM!kV78+c*DIXʒWZp"(<& !q*:#;#3A< GyHE2΢W9'd.J.=PU/=AmRfM! eZ3>hDkR"$gNFΖ?w.֊u b$gRKӨ-LG`VE=eHi-Vyb0't+5*! b Eo/ qfS&B9r ˽)YPEuKׯVwRu^v%슬l`SN/+uMIau <9BVɍM!'U$<(ُY?с>w93Tn͚͘1Mf^~pAQE'4˙g4l656уEN3H9ᆀ&4Ac^Ƭ* Cr'a$ItY$n0"a:0 ŅT5v6r6kl;+Z;w쪵3WiSkނ㹑<ڨiq [y*zc ZF)r!`#12 hG]rt0G:H"9aԷu5I1vE#f]5"ͬi{x'IRXFˀU'RJYBp^bx"0i0FQ+PJ"$T3ZFb3[XA`Ir!MAmh]7'ȭ@; ~4*q$>$JSLijE>TVaD%+TRw fqo@ysȐ-9< [j]ɤ ^̢L>kFƝڦV΢cJ 'Gr:`"bc#O%uZ6氓+ԇ1 qQS,JM%08yV(joFI*Za2( E N5ېE/ d<R8P۹i5`n?'9\#N-dѲҀPZ+#EXVDiZ 1.Jŵ$z76&}dS cyZzL,c0_ԃdMiJĨ]qFV;|>]8;wJm6n]sAۻ0*޵ٿBedZ$Ale,j1I$/AI&idKvelv]]Ωt"/-錛pry~2tltUngg߽\}Wy1[v8Q=? 
Nw0\?ub>8靈OOB͐VAO!?.g4f4hu؋2g7xny5sKBm,KtNcOThgZ-\xc~:G:Of]R\{INOm7oj8L9ef`4{+cR~ם3,od3~ϫ.8;r*lXj"mhL( {};l El> ZlAAsRL0DrX7wY[4+s k)׽}/LaN0L$@/9(qpŅAu{-K51Vr LL`3mjߪoldzypov1@_|ɇ,-:?[҃;_X~8vf'$@dD\dDZE4mI#!Dn5f:Ny͑1kE(ǣXY&{W el'I˟rɓ= ޞwa{Q;;͝76FzClStk*Sڰ7T4P'.8̧/쌖.<o6/za_خvD׉{Y xv]'Ue'RQF\ l?07e" !k_@egC}V:lf7s$Q}#r$"ЮnA,"7+#)HÓiMAH-]yrNI^{-kWO@Nq竿"W@;Ęr*F e-ENҎQOX{;m]֋H}CCPۄچHY ZR2 DlB_q75\ Ԝ컹JrVT }b Beȵ*9iIk LKk[lJ{:CўyuǤ6.goB*7o9B"i;pd#<"Kq1*L:|bmR6cysNQe%VEs36;VAO{phZpZ@ZX[ۢv'nlZLI:Ų '2S ^Hgx Ug$2fx3!O]0MG3^JP8(x+G) ,eķ@46Zy9exLXe#j ca,|ZÀC*eq8-g6u$5@y4Go4CH y/)`pejDX\jEQ_U0Vr&!BIX*+vPXoX+1sՕb1N x3Vnps^b_<͘U+@1 E%勥MfE8q"^d}ia;6-e䄨G[-ky׸q`<" /`:p<(m9@f Nz- ] &W$Z*=``"&Ay! =;m _д_1 J#HDFM W$D胕EJP*g6|)_Lm( AJt+ < oĭ |X:N~T} yrx*NEɜ/+II|'ӧ~4K xєN&o%!jUW@ @ RXi6yЎ2;f20jxrI'#$5M1(tjFV8-5* ΃h.<@I \ƢN$@pX00",0wϢbt",U@k6֜ß?HefH, ̂5*)8RY yՠ6H*gUWo"jZzה@̤I@ |^3&T%Ta^ \u0qX 9zn| |(VS5|(9|q%(Kd0um1M~T0L&Sn3a0`ؾ/4,E2wy.j 灣d`09qoFrih͛ _;wCP^o* 9QpyВ>? V6 JTF˄Jԃ (0 P#F' \ ck[ k&id jV"bʅs <@b&B|;t/x `\ة D8RŢ4xh%z 4~  ڰ(RVx'#80`ci FjRc5z@pkOTFO D2)I1OH2˧GH:v\W^5ʾawU@`HicM,gfFZV2Z0bԋ@+Bl|)Q#2`pϛrr*9ml!&\K'`ebBu"JI-bJ8$"0@d뢀) Z`4rMp]튄ީEF8Ø`j?®zB_uY˭6 \xf:1D Dg>>/_n8/7* E;dx=UUѸ6(rGF :w ̦y>\\,ߔ6KnzixYMZëڼj/+?hPJ.?TALfv%qwW.fڮSj (+Ij*}v|OeV*038bVSq읕5{'Es?+p*o|18l\~Zf4],llA6{ZKGݳQ2&_c܇ָzgrz|~[]d{?/Qk){&S LI7o-Kv-i䕫h^v7}#i#' 7O̕4}7}NZ~ !}17H|PoA%oT`q!U;Ģރno]wm>qv%ˌgVr-'ݪ[ui'h3E/IY226;5lũyo]!vw3=ۇ}k@umW:F1V7_^ې,PT\Cp"V e,Zn%%>-cwo#}+[ړ*AGWrBǔ s ^tNTEmC|̜p\!<[Χ`PŻ8_shg [syq&܍8^p dlqN†2{*|g9!]w{MX咬v'0O0޴pxt:z1'>f?3Gs6?]Z{}+ 7 s7 cuԍJt]Vx=ޒ?c= #O2uv :7F*z]v®BU aW!*]v®BU aW!*]v®BU aW!*]v®BU aW!*]v®BU aW!*]v®BU nH)ZtTaOFjUaQ]GP! DD@wN ;z'P@wN ;z'P@wN ;z'P@wN ;z'P@wN ;z'P@wN vY!: DJT:Z{'Y}:>{r]zg뱦xE{E54*7/ِt{{{/t#..f痧w1gS= QmA݊ kA+mH }!I'o?xOP59xnY= K3==GUWT$˝hwD$q,!LxbH涧2fE+afD+FΞhskZ_W1`[AlѶێ9{3&mN{M4?66g $i)c{) +% ^H,gR#Ř2 ! R"`/7۸ `xG NYIZ9Iu2 p 4A;vQP 9C6rv $9" n%}[&M]?=VɎb{7)[5ya<0G+ԂGdo1OiQk!,2>(# H˥VRBWzGL ߋ<䡂BIH6r$H.KMP)fFJe& 'R{N6"y@]7n Qp YrΧht㑻.F7EO/rt^QOQpp|5Mt~HB2a4k 3طF9N~:u(>-.쉛^Vlsr׃a:͆or0$wFL}=ǃi. b CIlj'c('M쀍ZA`~^gbF2!kϦ8+1ylۈBSUFTeWQ_jo508cawf[=0M!?_5Jh:o>\~{亷O[@bM[ݟ^ytg`w>χoF^MuONMV.=ܶo_gѕ˓=_bZ-~-'.MT|a"ٳ7_mGA.t}n.G;ӚH=V:P܅_/~[v ઺bj1!-h}d4&󇙟'KN>z\{fVu棲Ջc^Ak3^:ZUD}pWI.Z^G%:+7,Vi\b:)&˓ӇhƋ:kEzHr]D/j*N;BQ\ (L_*Hyֽ03~4NJ.H°h:J@k"ZJ#ӍʳNbOoۻūJ/cW&mڅZjkΔEHsU|}&OӅb`(n&>&Lf/òWwu\(*/GDHWF7>LR :̎o>& f}Ӳ%:7'i3 21ؔLBژ[(jֻ{O&>oy`6mTئ&M~42AM{GǼM zcsEyaŽN^| Dp{nC̵bbWsܙ+e=&\aTPe:֍3'~OvQŴua@pCta"`eXDZ8*sn6=BL!ᘰUފ  p9(\R_4 .o}b,9{YW9&6;V[|jFPOK{KsB@3s%K! 
!m{D[A +?R`zo?=#SPRs$Z %h,7t#\Vȹ^ʦjŸz;iɼ]PIu ҼN+31TDәiͣgYjB$m#8ā5 q暄} [vjI$*PSfV3N-F 霕HùԘum<"y@ܴ0_-r(ׂ~샼Z:j WR U2s*U"`I&A$&L쑏=MgvȡA2sW%Ό vݎRAd /*6K7Vέ $Ur?Q(`HكMno @X Y^A8'1'l=h r-_D4Vl2Z -.)딹aj⑌:&vZy rja%MŊeѣJν%9nRD#M6/gF) cR"*-U;rQw_~6Tڸa(ۏƬnpcS"ipQP  !sD ܅dk!$s@Q?SM{1sAX IkPi'RRt5Xj4FψYԷ2!>@Ei1au*e땞ǐZbE-^jq\ۭ{t5{fd[(L?;q&Y%|nklڈH)_qr7z[NnrgQuJa ,5bD Mޱ](=QUې5ct=X  n _~ۥvsH!s5 C3=[С5jUD8AD0F݄W9$ $^ E.olX8"7`l` "g&bs`58fh=G,t {S];))tG+|!i jo䘆2R^; f YQr9f"u4AGdz4(ap~<>d|gFq^ކ߻1;*)M0Y(; 3\;㛍Ajrx/IÒo Ы{(.")F"Oڿ` A* [VRD}V&AZW|iπ;::e7qN ny.'ͺA:b;vMS[zlWɓvEgV &*(@A@ a`KD}JDGd0QBU4˨mB2X9M:DHr2˪Dʨf#gv#f2ʅxn}uO8sɆ}jEi"A;ٛ LPbz}UMvQȜO\9l1BiK$G6JрR0Z q #: v_EKyD_9en(&1lڍ|lcMY:qP9ŤAr.ʬWtS \`Z{1[,} *Q[o̍xL庈p{q z& c_x ^Tb.>T'Eq4ɾI>}oZzmLM6DB8o5*2KiR.:톱n-rɬ3{<ά34imЫr5ѴT욖;TT*(FrMIQk JNAtNA)))yD+i;8"JX"JDӀP|6RΣ12ŬN7NQmGs++d\XQ0A[R.keGܟ)$>& :zTL¨O8{2@.OؑJzd/X1zB fɈ+ "D+R@꟏{.=?AɄqu?j}qu/*:2q!x/^zɸ^iW сrq]ko#7+&ؔK@cw 40yL>cM˒# J˶dRUyX<<$w]\,7j\UoUBtb" l 6ۏ߈'CNN{rA;E2 4eL3ʶ9=MM3 X@g*펪hl;]!J =]F@"+,9 ]e`BWl(K /0']et ooHNWҕT.9B0) f3thi(i^%]).F3GP蓋:ETj#wZLge+(V1rqzsy>Ozh = 4GW dAae#Fi*Jzٸ =\qRwٿ6K9 5.hgeFAMRɯ:>_ۼmoq7x`b 'g'ql ywpPP$bzAd[S3߭.VtʎJkiTrӒʧ8YFwl1q>Sxp%P<ܞt?9B૰dD,wקwsq*޵{MlhU3۩a/iO E_4PH WҮHV^!JGJ$t4Gu2ZINWe?n*JKuTCUF[kQ<^]i0@lStp]K}ןU/OѳA\,CW9yh5;]=iؕ|]ɞz%m]D *@AKu%\Jh\>IpY-#O@=M?X1׃$S!ѵ3laЀq\~u $ 赴`qtyߖ{((mϒ,ݜ~~޹" $F (̊hXqP%ZHW_Te"hai)<1` <0>k 7{jvY],kvݥ]WN5Z@R -dteh*J>QC phi:59]eBW^ UFyOWS :CW]7\+R 5 Ct <3J ]!ڣxJдWHW~+ G,yp9 ]etQ-+i U]r0pegжw_#]| tq3\҄cj;]eHWn+:DW]+D[9tQ2՟ԎUN OWر#u.l:Gۙ*K)(c1Mi@ 42{0ٲ߷YW̺"sJ`,~2>)z}.Y`zLZ ae1#V+2䡡n%g vWy|8ׇ%r5 gE?oS7&ލC͐}@ʧrgi(]W< &hǣư2m7osx1#p5^kYz%y11K~?eg췴rֿY7yƯ&e_4o i_Hn \v1kd]k˩X&4+{ǔoi`"OgSY|SR 5?LaZ5  4q01AxGUHڭSyF >6E>.fy(Woh2csEyyr5oܟcmw *e4y \P$c3k|% oLz꫺yq2ԫ?k?*ؚ?nv+2rym^5k-^UpmGeѪ o\)⩛E]9dgg(34z*SvrTSX}YK+ DJPd3"|bscc I2& O5Zks gJ*=_x; ]a `ѹ{ֵ.Vg{=Gc+o˭K3yi?ed.қaTS)묪\>򛕣U Nܺd\slwϷ4(~f]9G#yqY2nQI<ԦggT:XDBc'Պk1iR:ڭ39k΄LhΌҠ^;SJe:YT<IR [VP+Qie&PlZXh.P፧:c\k\ "8O` yjXg6i6K\\\_Z'2Gȼ K85OT)D0HF) 8]Lh[!y.4JQ#EQ8(5Njt8QD8.(CS0$@m18O-o)i) ktH8l'E1Y-v-T3PEdžXtJƤ䉖b3Y(^(VM3VHJ;OuRsǔ1uK@4*Y &Έ'PPݮJ(b&!/Yb[`<B)Y ܲGjT}kQE?g;F[{~)mj+>2g8P.K綞}.GKsUt-I{ ү=N MYwlaܖTGTPF;F.2ҙ3Zm_#(݀1>)lt~2*dBC_7cR,(@*,Դ\Ki^֖;pٺpբpՊ֐l)4&$PT B)ƃIh!9~4 2.\=q*N)} 18O7[ ]8+hb ˛V~@ʋɝiIU`2~Xzϟך,f7˖%{շ}+7wEjk{O'ĦioNl훎,D3$FbJ1`ԂJt@"bH.W!(D(J!`#ZXtH2]`c)U1H*TMHRM3ccpvlL4ƅYFυ ׈rWYmT//zz&nzÎVO{\zFiNʚu`Fi|T;H2*C E{rMѬ,2% G(`]H"NuT6̇yK>xFydFl>eD0#{F|M(U5*p_ bR(EY9ifД}(S-y5D%UV|O d^O'Dv=jBa -wS'MɞwAQ0i!Y'Qq"G ;`cY C DU6xq#Ho)(@ꃴ(!TIrdmFf ^jg}5@8A={m#*o%,)j *& Sp䓥6x&y{Wj Fǣ ΅yF3i߅ TF&UC$ B_9jTپ,o,ա,rk%I+6 dF0b``=g*U.k,8ڭla#%ǘ/8x9h"10%'n- H7Miƕu Gk}Po+MX IH+]<*C|prg Ab\jY'Inuj$CƼRoer\n%$t*dze"˼"\ɳ1NL- (a"KEcȊXZRjOb&Zv>8^Ǩxo mS$ )gޒ 8oҞ$T ;hS(-3PPS3@F!r0Bb²;S5S+;?󓞰c5e.dtta>03+S9$KK&F S(.:ek<딜{%MVt##4t]M3뤂Px3Jko1%W8.XV""Dzo _؍|z` $=8Ǭ=X/XHɊ\YT9ӼBN<1W(ϭqCe<. ըlџ)&Uݤb{JkC$LdgK6͒MJ~ Š_Mz͇AƃZ5X[%pNOk}6[SJpo%+UOG~niߔrig j޽,!u6RGLDD)5c$\t 5D?yPH}{\:ωdgt~Ǜ?Ix-0f4AeoQ/}C0'e~oS3'jvGڒV2F>%s|d1(%ze!VLES`ugNmԻ%=cK[:'5KA=̠S9Jcd2nD(sC5"FjQ/ǚq?1pp~_졦m? Y pɦ֖ۜ+&A|t1iENd:}ǎ۟jkJdi Jʌd̋aFA:%%xm5䜤p!FxNPrځ@S I҉"X#SrHQK빓®O{됖w7?DM?~|Ks5:=ӷ[[:NM~9Yrp;wMT׿]t4)_Gwa4(xPlfuBq¾*[P~~l^WC|?&.NKvVncZ~WJ%F[;AI9_wMLocZgdqrd }@l"=k;%PRL)J=uؿ{xoBg1p9h8\$tVj" S+'L{u5J<ŧJtI\bVuϓ> -VYS}MB1z` K[ (b &H_2ME/O",ՒǑ۸O45j L=JtWwgZy![w? b8 +2W[[IT*躸K_mޝ˻{PC^GRȬ`y,"Tj.%)Oc9*G0dLgȮ4hk9' $C=ysI3 Eɩ 9U>в[9c!wOhrg1]ډxblîk_~Nȭ><ͱsəavtQh qjղ|4G\YdC\uq VHƐr2dd ** U*`vVc$1-rR-=c>J;Cz Pe;Ȃt]n6N_dڏ~EôR%<1 L2rХě . 
+G#P>H$)DDR(X;km"CцA-GZ#gs J2ځW#zsքF4'/N0~^7yǐzg¢.~Z00P\d[IPҜ CCRguG{*aJ2`9x/1 c 7 ό`IF72)I%N֘#L81 V^@S>mFf#tp_!>_/ oTgj~m+̔s/ڐ-]{y_.-ը(M0FLJ[YŤV=fVwb}%Աlv੼+v,}7߀ξl"}1)م\c_JJ6Q+B5.%I6hc?͇?ZpUdJ(*oELe[ X]e7ккAB[кLGg<2UY'+3\H"C^:IRNTVd-rf=VYXf1!Cp1_ZWMtG:\rlRˊZVnݾu{ƛ;8+[~>i7r(,{0aW'ڡD=.LDCh7\ytȹ F-x‡$y![IXDˠԿQO}T~n9w{9hW4G&xf% -e/5OQdET<٨7FYiʘJ`r !E0Re6$q \Ю@o%>3rd5rL pS)SuvA&Axn~yvZޗ6})9V]ӥ޸Lv)A%c7>I1+QFp:PxyƝb3(I+뢉6()Fh̆H1(Eh5.t9yC iP^YYd{݊ EwPpOer XNLuvN|>,iel"_%Wt3ORzVN|mkmH ɝGM6oaS"߯zH"%>F"iL̯ڱ6r$H.&G(FJe& RFねԞSM'yZP)Em[v}{?:g}-KmUۯUW.ƝGKQs%B1z%K LFKؠqJ+%s;1ߦNktĵ0rg޼eIoԈVao߼}U<~Y?ػQ9.:Fn4߂wTwaoI*\׼8MN?~u]j;-<~ϦJ6LO^f%g?]g_SV3f4Qi|n&\)~S\o +@ M{o5ФKuQo: ;B7fm4 `7frB Η/_oh\}̸xG=*'?qt2֮]rܸ(ZV LFNX:ʥ(#IEK\7 Se/++ʇ%bHř0(Yrk 1Rb s\MQ;~= Cu#+K5ef^愿¯G)w0x9-u0Thw0GфKV] CQȲв|׈ .J*+ގ![Ñ7^( iV}\֫۠h1W_~Nw+&fzj40kelpe~+6tUFn3[΢y0?V`e_5|`HUObZ#l2h-ViI&UU2G+6v: Bцm#jSfP޽,ҬKk;&R&+DŽ+L F6 1Ӛ ؤv\Mb__lRKt-<sص1^X"`[][/jδch4Jʽ$:׊KY~P5[-kӖko}HFz˪#¾a(l.'{O9يi+'r+\2B0FkʼnLqdsU08Yc=J.Oa;<~aio5b/(qđJĈ28K5j4!G{휖6X euDR|,^a,buO̮I^r~ $g\KrAfhȵ%fGZHp8Uk! ԹkT!+<qE8ht+%Jp)9qrt*lXz:⊉kwD sF\%hڇUB:q tD*hUFvP.,)+!$CŗKp:qhCc.J)tLX1CW e'RaLH\%xAKш| CB)X'\>V[[*GJh2J:Wzͩ)K7, XgQ|B xELڠ x}WLoiGt3i݉mAhe%z\0Huu9NJ"w#-?aa[{t[:LUTHoGS+|_儥`eG%[.u2gi$7JG2숄u ^H"\JlB <-FҎY | \<Kȳm ΚΥ6rXR`luuẻ,DLV>ݺur+*7zgy+4TGñq є2<¬()AL>ޠu0?sƲ]W­I^G*#R"%1dNF03,*͸KaXGh=pcAP@gaJ-*{cVof]cS"ipQ\ U9"Q5N;/4`(B)Zb$(SR[X (DƢKPScvia"8mwQJ-Dbdk(>$N0mcFmkkW#jaܬS7wٿ˟o^L:yAB ?sT\y}g1φ`4JR`Yj4z ʏbzmx~Whe<_A] ]|T)/ruowTwd+9~æz@;i:>(00(U0{a)\^dSW|TgB5F.2`3jz~9aiQQ2ϖg'KXa} =8DXZZ҇~+[h켺-(cL;7``_T; i3`O`-.mҫ[|+: X\:5$u,^TRb|ڼcFTĭC|-2{vXH zk+yBvovjn+ku}aH z1aYHz>^39̲ZK/)"MDU` âIϲBH<i&i0@2^`t X `>cѬN@Ԯ68,tp.CVvVdß-EfqBLJv%7;&S2)8h-B}P[lBVu VWm_.'f{0x,{m/~7{oTn\\iO~Z`8rXTtΔ#QC4JQY0sV%=*):*UI:rNq$xSnq^1$&Kca3["n"Qܪ~(C0!B*Caj3j *&y-#c ihY?l Κ騥l ' ~nC]Yo+Bf.1@pqfgpdY^`,2bJ$ {EKdn'fW_2@^^ήQd} x=:\-_+ذfwtw=.>!mMV&q͚[ν1gH1-l]asi{γNmvƧ~fv ܵ۾0}ss7uqë_O76},>Fs ye^֌7 ^ mBKJ[0A!FyeA1 $@(ilxI Z5A3ŃFؐ/I…4ЎIq=;X4sy!ҿ~ Yw`)ʻ+*ÒaU1;x^lXՕ1avzsaU79ywwyZ~|:yaoİhd|oϾb6;)Eު|7__㷮ot=/[ڰ") i?\&pu ѹ?\vI3F3*&ű|kGwN_#1kT㪇NM.p<9ywgKg69ZIoˁZ LʥU" ]W sEЊX]>)\7&\9 ba>;wMmuNqq"XNƧb펧x-O-~QFtg-pgӮr3O긭=>h|::i@8x#E> q[B.VUЍQ}Z[zW^$%QNpϯ]{[^|}\%M5s&7H)hf-.ާ m!PkugQP[7`:;7[m;& Kb6 ^{.;6(d;6iC>?v{Hv]|2]Y[RDXCֈ]#Z\mxVQFVvTsE4 IC9!%814 Tf~3$&ڠ+d(~׮΁LzJCJAH4!39elk=݊CSlxHw6ؒi-_u&Yϗ7w'eG&h`a}>'@6h!G Z 8EI(&[Lr{[H9;"1|f<=y剥9#c@ؒ<]q$9g-r_MX2XLeR>r,/,gSjObvuQ=^9%2JBJ gY{~iGy(!jˢD^J WIkn%Fj"ckYk$U jx;i)3:HS*Xcc .Ec&8g kes   Aq:("W[{x=ZxkCxi:8"AtM \‘BrHfTiCZ2Ԗ2܊v>U%=cC.0al :pep8F<(HӨ4QHIE34e QN<`lYQCVX:ؗz>kp͕IU7Yz?vtJ $I08pJADPJH+T&:Z%@9B(ݸovsЈݑAO$)fs- lp[A4gHa)S>nSbhWbӲV_u{_@FmT 0)jz- d KkGŤ3jN8 5IGR3/(~ѕ4;8(b2خd*j+n yhT֩)R%uɰ`AgӀ qNdulK=xPuJ%B6iG)H:s'h'qEK1w2Y@!ıP`M1nl$I[UI$d`A6ƘVT#g7E ]]}Jp 2W^}ЩMlǦu:o| *ָ_mWV"Hʩ|D0Twu4/=ټQѷ/~5:Y(/'Ueg#|*0zy6j?x;8a噮 oE]$zȅŽ&Rqsa9* Uyo%L{wlj>ܬZa)ѻ˖GH˻I!ԃCfA0nzix EYx0؈4MG-o-(bC!Vz!6~qe.W@ՏI\U Y(FNh2W 1Fw ZjlͷنLM5YݧgᷨϲݾuږM6qo<.l %w`06v0 0QciAV2 !D-Q;** TVKQc2>R<EY^dU ـ9wG6Lz HE=A3q]YܛW?] 
fyj] U啡FH uG"=OSUҫH1rݪD'(L3=<I/bI lB8pAg&`N!s2d HBFTQAhˈU"U$e2%G 4DQHPuH4U^ךPXPGF!,Q.zשZMp3N8[.juC~l^ yrQlo̴wB{+;y|cPT+0"RN)#B$ !AF0AK ˢQPE9)CCv8ՀrE0) #(T PI֌٭abg ЅK-32ސX؝o~1|7 *7OލO56Q ЂbVXT@ .8+|I)&A2Ix +^8 $pq&W:(,lP b"Aa٭ L̾hbq.AkvKeNdDz"WDG; MΛMY LR 3.$-.xH-dDTxԨk Xd&($FuԮtbևQ%jʊ/ya8hĭ]JDc$嬎Dx?.Ֆq.bRJj$M(bU{eX %$*J6)M񱂥g-sP4_QwQ!tQuE5:IVT]}?Cٛ{~Wplvmzu܇=P>S,"qY($pTz"ِlx}8쩑e]9))QbH/(Wr9.x<T69 0T1JY@Z(ϐ:֌;%%E\&cT|.A9B6Wk\#1RI3'kdv$%gˤsWa r&]ؑeh tHcCuAFƢN3bбt1׾6&m2ItSPIQ>jmEv,̐8ror6{pzJN&s~ŗ.dhAWbLpt.z8r5 Vy;7yw̝ml*1BBt{ ƻ.Uɡ3]l@Rseؐ+KuAvIF^uH2Jw*҉R{1։~Fycc%եأwvӁHMVd8x<1[wmsQn1grqZS"M a4niHodK(a47O۵uJVqYD3f;¬| kW"7*WVpU?hUF\#hjW"A5*WVpU=t\UF\!<(於+ֆ֡J"`+jཫҎ:F\1X!\`&hWUmfzCUU汎jWkz'2ܻ'4W=ZuP\TˇU?qP2=׎کWq*xq Gm&>_t|۵'󧠋||ISГCƿo_OgK43f8EjO59, V޹-|CJf(]znu4jzRw|e7&\8m2'˚潛'RQE,s .g Wg7r2}onWL1J-jJ򔝡i^SXQ0OjdkF~R#o9נзbW[J:f 0 ^W"k{ffZ6Lj+RZ!4*ػfp%r5@+jqUUqu06+ UKU-xvkuOwRޟ-& 'hqd,vYA^y'^-;۳@O)Qg.|_r+ssFw/ͭ?v4=Jmm9W2/گϊ|W8F_Qry_w˭M.:K7]^E]K|s2nA~vkӾv-;ګ/f۝%>LKa"X|'S?0QAU*]#t՜ӶQkDXkjq%* F\#3mM`T3r\Z=UU'ŽW$Rp% =tn~j:D]sG\+eӫz n PS R`Oȋ$릝}~ʮC'<)Ժ9TmR=(FJ=0b#Z;]Cٹ)S^;ĴEs}5؜"o3L \VBC쪒ba`4*\+jWU%È#ĕfB•&JZ D4t\CK7EpE5p%aB;` Zcҍ1"J[*C+qUUqqhT{\\ Z冎r W~+L+k着ՃJF\! 7]Ă+먙UCǕ8޲ Hm*sp\Ч~j큎XnWzM妒QͩϘV^YJo܁}Qi}CdLWZtU{Ty#_J9B`nW"׀iWUơ㪪48q5+ljWU.5WWU#W6-JSCsvUWS{|T4#WF[~TOÕt;`$\U+QG\!,!QKޕ6UK Z?xJTڵ:quPU5N-ԂdjW C?g WU%:F\i-p|(5*4]UUU9|R#^W56*ص VZ9HCUU܈#ĕq%J:t\U\U WUWG+XۖE0vpU: DA:ʵ #W뗘 '{+ۙlW83qv+DQ\UԌwUU2z\14w%TuϽ鸪*i\p"f&Ubד_kD]ͷ0\nyWFQ;obsߝ 1^Kެ jqͯ5Xp(ulFzcz^,B zrN wΧ7k1Յj[]}1/XϳUHg6]mqgkT_5ލjϺ+ꬫ}޽ܲí.^Z'L/bw-57>kZQzs;P G_OGeC?vu0\5_HHc ݡgRuF^}PFSr- M˛YB'0yKP;w9*LVC,dMg}@0hw }7g(p|.-awOuy{v#ummT! 5]۝\ Vucg3S]tDF켶m4hx0ךk"/2GR2ئ2.ĘUZuZ.HlCv7U64Qmʙ\ AK+FΉ8&ADPIH*DmTK'OPXG.%K+阓4#QA%:]:Ee7YS_CNRm3&娴uDpUgHFc)P:fbZZ=䲅-=&![#YSʢXIntLr%(C벳'HxĢQTZ_Q"cvbКI٭`tN4)-֤X0NJ)#>" -萻λ.q9+e&4_jc+V( vOH`00pS:|Zd·*-%E'(V>FdWPQ^AYC sPd(I6t=t :`Y.RusVL-V  XF;KrIn1h /90hT*%@_!8c(sc.vv3| $$L)},hpⶦ3-Ah[R} y@5 bu5 ,h-]Fp2팢a.y E,tMV}ŠxMR5'w N#GXѡ/CE7L 0mQHitrQlFR9x, T],P~ xm,mT11mǀumlbEtXv=ԵjsQLzЙ d*WPZOk[rлG[͉Ef-'\sA-i.ס@N VAPkMyaAd[̀~1g=pe6|^HmϟhJ7a#.`l'aɗ1 3Itk oޢp \8 v 2lnT *3mKcbX`XKEw,*f-$ǶEp$bnŲK7 \Х5λ.?t= }#W> ՠLԋ/b|-n:ZH殛D7S!1-4&s?/y#6^J 69]]9_{/?_^=w79/nn8L c}|77O_&(ODK /|5Ï_<|S᯶nn 7FGpclag@{}u85=#UHIS$7ڔVV@${bȝ(J@$$$ $I IIH@$$$ $I IIH@$$$ $I IIH@$$$ $I IIH@$$$ $I IIH@$$$ $I'ܞI T_ZI'}$^pu;3CҺ@n$$''H/I I}$Ԃ;$g1@_pK}$$$ $I IIH@$$$ $I IIH@$$$ $I IIH@$$$ $I IIH@$$$ $I IIH@:$I@8I $@@$1 䣵I@$$$ $I IIH@$$$ $I IIH@$$$ $I IIH@$$$ $I IIH@$$$ $I IIH@$$$ $IM/Y0 4Np& .> % t I?B8O(u{xܔJy7yKͷG0(|=Uv l~6W7/iH=nfR1~HAƛ͟Jd$%( @IJP$%( @IJP$%( @IJP$%( @IJP$%( @IJP$%( @IxP^ZGq Ԕz{y}xqޮo77Ⱦўܔzr8L^ hY}^m浌)^ NaCW6>^ x#+G+ CW (thc\;]Y BWGHWN#A6_nK,ZkQ:BJDW 8CWNú CWVw(]#]y( gk+Ffĕi 5l3}bDv `gQW ,ϧ+Zڏ d@t l-EW wuh=͒F+ĩ'jL80]{W 0tʨVw+/tSuѓ6c'٨~}~ںϮ#i> C 7QhhkiFiҴ!"v7˷|}Ɂ ~|sN0P-M7(T3l8g_Mc6EWJ.M?_Oϩ騌s~7Wgzj|1/&}ig?sLЎ߶dC︣mYjs.n ܥ _tU/~߿~N|Щk'*Mŕ4͆3dJyڃ5Ƶ;6ts|~VZzSO2m.)gBzs~Д6Sx:5䅌!'6 4Z[&O)yfUC5;~S NRVȪ6h1zG|z10N([톱 h]fQ,1ZlK֯y{>CW 7CW@O0ם ]]EH ua prit("tJq 4O|+<#ZNW2 ]#]yoI]hZ; ]1r1U evDMjZҫ7rm#"t}QDWh+(th#bPJ:ThAZs|g] ԧz Qkei{cCi( i^M.lb4 ]1NW@DЕ18]٘aaMvtzmO"tE*m+0tZCWֹL$tut'+v+k1֧bR&dd7mk+FUT|5]14V;jF+FVfQ-tutKC pQ]1J焮2t8Di'xfp?@weZuWqBW<ѥ]fҖBt EH;/}qkQ4s4Í4:cNӌ)#imcZ#{ JO訕?bJڇ=E'>#0n.OXYzp.Sfvg=5ɣZ|n^,u?_dbSjp]b'v(C}RC0 ]1\OitFˎ1`8atN:FrV羭t0tp}׸`b%ZM.83 ]vʠU7]0Y? 
]1\7]J'{WGIWtH#8V;\|hP&%*SDW CW wvFQ&#tĩO'QWW67i{@#[]=* ]}Gm\ZhX\TQ ԧvi04p֩vfV M!MkuzpKd=d*?2n~o^7&~Wh/<&t߾r:ԟ}ث+Mwd7oO&yrt S\(ί+'AmW—{޾+ li<߼o ^%їϽ9GOw޵6r#"]\fM {$] `Gq,9zݯح-Y=3r0O7,U_0``9.b `?wGhw=~-mCi?$-qh͛pjQSaTfIYH`qhkdZǔ$ͅ:fTv1/]ZTA#ZxU]3V_x4OԌQ8eƹ,Ag*r!C0m|e^F {4|b٣q#ho؜^vlt:Y>}NN!z؟^?K۫Д51}D=I7aS2+jԻ Co|ildfɮLCWO.,1kKS7n}u!8Δ+dLVZUa Es™τ3qLg#™(p&jGxK4Xfd+e=S!gzW[cx!hO8e]lVy!$PEtJޅ$T)Df^;&-r85rv̙^>&t)C˫9[Zg^oa|_Zgi;`P5rF'T[:Y99UΌ\M^<[OFIuBݍv+Щ:ǩ%Y&^Fg_Y`73#'j?HZ$8潰ʐ5Hh$bp.ifHQlur*I}a+Z܆7sU?=bԔaoso_,0F;?\Jz. Tss΄ʕVTv0u:_ Zofn5x? VZd X% f=0,|xU4ożGg-c%r[ Nkm ڰ6jb{BZBZBr|2zLcp=Һ/eБ:p5pԖtRcJЋ ?.PQ~seq[t*:6K?/ɈX%0R*h VkL"p""EEI' d>ܒ}1Fm2lI.f23"lFDlS B>R"RrD | O6#EZ#gwHm =)Dx$F3~9AozOA5L.$]&dJDS^srb(5ȄRӺUAEɌD.V o =QRJX+A1Ɩ5rv *+fT5"+GU 越ë'V&'5GۉJ.j_C}ydt[6ZLP"ē8V9{bq s\نOo_Pn,w5s6Aœ1䵶 4 (uJr&cTTTQ[S"cR1mm8%$K<&-<>j[3FnViM']yEoiAIj^\jǷ8鏷~SϟXc3YPiK$GdBq94MЄ2m,$t&J| ֊,igMM2h wɔ Be.J"%bfضEnbbn;ڼeOZ`x* 95Tҋj&\lKՋe(Ozh/QT=dE0N8b)D,oX-$*$a~ҋϡwUPA-¶$?O^#wz_61`h(pGm^G/H;VMgE/F~2y<^եKRi$,T\1Q+L̆zOai$;0H80H80HA9J/ #GBR< !Lw^Y#uL恵A$Qr#U&;V }&@:<%GoMLGf̵[#acu5~=G?-+Wݮ;[vNm껌ESf}jP:Ww -㜑3i!Y'Ph"q!8AyacYqC/ڵ it9"]>HQ p*GJk8^y4N+_w]f0'0`3WVѰQ5|9+T˲)be @sQE)В*T`ev)Q'HT|9sTLA 1#N0)@@1;@ Q: 2)+V$.y ,2# :՞l.t%# g`Vj zLmpt\\ x`LX9q^(A7L %69{{a{u-;"X{Vc.R;FBB ]hEi+$KIB9XI2NEBVe\C $Y"XeA:!WX 8ɱvdkެI^8;9=)FG?hO&́LaW@|_ddȰgk >(+bT4 YWs_hEG7. n@g\v6Fe9Fi:?uo`3[Uhf>^5ejG~5BTo:3p_L'ix3]٫PmlVE#uJ EI8ziy0:VwMɿ%6\^v wG$$}ٴ"g7ѓ>.})hhZni՝%F{*RJ=^C [֥^ 4$Y|LxI.zD_4gj6bCj:=cK^4H~`ڥz}mu 9g ʠۭס&Oq.X@ƻKRvvJi>Z=Uڥu]Lb疣kԆ|] H?Qz)`?zjpPoi572n2of Ge0{)˗tj,E}RnYYwf΃λ[l j~'΀\RE,"*gmY'8JTb,O [1?*:$ 7:Ϡ I sB]Aătfj"Vv8^KxbԣM=31NF,;8NAD "RV]oj[mPM0G@}zkE^7ݏfb=/ۧmm};`g3$y_-i$9o-ɲujr;M6HVul'7'H[)o^ZFvm7NqN< @68ઙ!x fH{V;ґK.hA Mnкu}U-dY5yhH7n:u(]iE5a&g6n>lTZTP{?." ;FM`%hcPGâ>`wuX#ںcsL?(Y Ѯh[pX,Qd eJ+gI1颸nq\_.r㙵7 E&3R3LTrerӨ5! bD"Ҩ͠ƠX4a#!]^ wE%KPU m~$(ڱݮ %3F qAG]CVBO<Pwv N{ eN ֤#8&FGu.Q*nYʗuyOQ R*e *1tie$^1~gٞ_߆EӼvW{w-0P'دK71"z.F.Iَ$|s~2,3)5􌹿B"^l'|'=ǐDZmv2'l % KvmKnӢ$JђdKk,)Qp2uu5jjp3k$kQLamp .#sx"!S`Tco(ZQ1G 6i:2 @Dw9lVJHZh$-[Zgk }cNi``<s%/>C}HaYmOˏ/9TdU`hi)1H-h;!|+@Q̉L|Ek H)L)IU2m"{NǩU!Kp3"{hp8%`X,-,xGI4|gBPk'5WP9ծy˲„9O [fdFK˙bR((18MATNB(byMԪx[!ZzNdh?ec\b"v7\\>ҏ%UKv%tfJUIQ~Z0`ͥq )GJ-%1Zi3l'.Ժ\iJKJ;:8TV ;6(bIL94b>ͣEFEUuV`B<UX-zg4kL@ZFL&ZP$+%eu5p֌?8˜EcAIiOz&ם\}ˇqZNǫr˔ ؕi0ICקC<,gobI)OBz5?1keegz.LJadutl/:\U0''1[gJ| S?fybt{%4ub'4uk*6𛦨t A NSA5+3k;~9e9G[&5*Xy@L1LPq{4Rr1x=Uc>08\3Jdc>@YZ~]97Kӓ`ѫ֮ŇXs;?-{ʘ^JE% ^H,gR#Ř" I:pym:p#\i' ƝA`r@qө#4A;vQP׳ ;ŦG ɻ p !Jex"Q!ݘ:&,ADh$ KAW6 t}z|o ctH\DMP), LYa=FynYD4sq<0xTOUE'G PtNu!#Z@ucLG)0CB`,;ow}&]WhCpH*I(j[;l\|%y &tshξs68KL+ާ s|S__ >rcgv?2Fg{1gONΞ0,8;WWWfVyo3Cz3Qiv G")w,Ս쳔% FR T?$ˇV%sbz`d }a)}>Z\;~; J5f\ _&?¯G)Y\9uyjzw 03OS7ބ,э MW#B@8ϛ,N*4u[ߋ\43i^]μ,䩅=*2c(~1==ޠZӫ+r|цϊz3Ήn@W9!n HtkQ^Xѷ?r~FR\;6@go˛>_>YUBo~q0kE؞}_bp|yd32L;VrgW\r͖g `xku-T4) hh5&A~lQ}*4cf.׌8jfiנ3cRH@aClDw"<0nyrͽ4Xpg1#W${p)BR1(u$fZ3!bԎ~cIMj邮'vzl/cc! 
Gq g3A&) r< *xTT p5i5$]py#žSR4M{R01ٜPQ1@>Ѩ/~_ |DS[@hi=A4OA;|ImF'Nt碓#-k*S "fhFʇ`I$H@#TmZB@5KoUEJ'0\00h6(҄HRxYf_gZB[gxi57ٰ/' _`KOΔr͌<*,MvXyO,I]V$yy@P L&' ;b:lSFQŐ'+"֒&V;r[m;:;@flBH,њ,XaƎŃH<_eS׮Ơh$0 26KY4%"MTR#*##P1TB@Ѵe:k 5,՗]";DM@Kk唐ByB@3#8@BW@ 6*R`:h>~/_35hFJ[AE*@H6K)RxPȹNjMrNO g{)V:DgbPJIg5yf a8'(G+cZÉ;^ᖯ0\^-uL;L@ Z吠2TjƩ^q%H!i8=v1 e% "w%>25 dP-5.r/)w*CB9LjDEi " :h<&ؤ.PGkrW -St:stt)s#p@K:dA"4 9PEu [983PJAx|jc N) `|dْJ .#rX)]Q0n7͝VlK.w$#bJ@h+TSAigID:/voʻuJ2 ̸ߖaW?sr!byA!@ 㜀L#G-JPP[T 4.ZnU#)k; (UK1r砈zK>*X<]Db~DؤJ@=Ч9MJ/U3z{a)Nѿyx˭d&`ͫAvmm%|jo U,A&W  } kV+pY\ $#Fu⾕U-dY5im" TX74I&:,^Wϳm?[ޜ]4մ /Q|ϯ_ա1̙liU_ E-}[;/tIa^B A ~NqQJFJd*Yl/u^E%U!I:9\#N-AR%FbRZ+#E97lN.VS?E:H0 N//7;* 9,)IZqU[q9nEߦō^M G'IB~)QdJ= S3ke,ZQƖP%mhoW3˽jexOt=< 0V'OlF+Zϛ$?w?&>q fqy<,=/^Vc߆A#/PubHZd'~3?@HV aFVKG#%huY u(%K`]>2caW Z!w]%(Ȯy %geϥœA*~WϳAVrblգ \FE 9ٻ6r$4.~0p3y0/L2rO[YHr_0ݒ,K,b9Xj"YU"ھZGlL{{{:JVX>ٻ/-'*q;9;+Z:T,NW#D e;P^eZE#|4%B,۳ 9o{L,c05%a4M1Q-,:vF6x&y{x&0z;L2f3&m>n|.a6oa6=\y{Ơ L A Ƽa`&(Nc8rB)b&T]oȋ!u&/K||::[$1˽ /v3bLr Jj R$:hh*G NYIZ9IŭG4ź9%CvnQ,m%l xH 阰¦l~&H,piT|o16rP<(hrg9hMPAv,σ *R#S LƃXa=Fi80@/݃Lq?xROUE'G PtNu##Z`ucLG)0CB`,Rߚ6B%ro-{ %.<2xއWƝGKQs%B1z%K \FHp7NiEenA]o9y9o9v> pB?ߓ˿[ ~zo(GF zӟٯ/?U_;fv<ruZܔ{G4ݾo/>wٛWw)wůɸSO?o%ˋ^FW|DMQX׿|)&\ { }ռ}\t{W7HHwޞQ}^:#fD?۴JG =7kѤ#Mh={1o <9|{Z5~6\G^U8Lw=hp9{\O?Ӂ?baDdj*EC׻= tZE_]BՅقQ EZC}7nUؾY5FJ_WE&9=-^Oa!F)π2(k@d`pq9-uy`8rLw0s*0`T6%C |u@fR9s`FXϛ_;lNo54e?-@A.:3^Ȋ{(?|U\tT]q^W})F˟~7N w&ή B­vunU{;krcѸ[JsU&mySn&Pg+T#3>O_cܣM.jJP\,#8@_$QUfח6Ro1;&vrPT;5u^/6 0u?`g_1IMT+.WTef[טVq&jHN|\PiP73ﴻ3hcG7& `ͽ4X0Pt\)+DŽ+L FḯlR;[=&] o@wPl/cc! Gq gp̴ch4Jʽ$J8ųr2-g h4-څ{8t ~tF@%ך>J @ L*JRZR~ྔIBAajK-8ycp <޻uOĭhY٣%$wDST~FWЇ`wI$8>@#Tqk  # !ykhV}ҍ@D҈7G(y\^QZmQ{' ;b:l1}0U HXKGX EmMr0AK|c a&Vފ Ήi(Q<80_4̃E ]eP"4 LZ"$"MTR#*#"PQ 6Pd4mYZ#gC9r$f[5GN0jdi8R(HBHs&qD20I D[|6[>p&DM7s "W *Ta4BYJS8BLL-74F$(鸎qFiCt&Df8-aZgZf shEq5[F"`tx2 3ujCJMTjƩ^q%H!i8>H[{;"@( =kE>ܲ56 㙡Z:j\^R U 2s*+Va#1$HL:G{15J7^t!JiY:ȷ z9N 3͊HL/=\}\/ k l`4I"Mhuז(>u%NJhFQ:Jk6"V|儥`eG%[.9ϒ`j⑌:8"a[[ )WKMԚL1jve5m-r6 ?զŊكiVEN%9n8 M"p M)Fs0( AJjкmIX9^engaxiX^G*#EJNcRɜRafXL~J B' d  ,C)Hpm1H)4({*FB(w`<x1 ~`a@_jiHLIm9c1ƃ4:RR{N,5B gDۈ H 6{~E0qﰣ9D2[Љ*ɼH8EY,7#xܮGڎN40)4tYݾNP~ r2Xg}x7 S[c;Y=j6z@Ww6lڛy]}{5]Yi/dc Jgn#l *V _Ǜi Og V ?{2`Bu]RAw9 1ʩxq ^X}]o;Br,fk ئ|MAoFƅx+i괨UA /'~𱟆ņ:^%TKez(y 9OaK2a~J#rr^?x>o-kN`:s3!!s4|3)TfZDaܾZfn_u#VLJ=fVlٿ.Ju:Ǔ ,5bDfGnB^;Q]ݫ2B%}xQuf3FV4~`څs&]Cé2t,9Y>A;4O.@swgjLNݭWڥu =$d ZF5?|03mǫ`&狳tЄUm#bUdJ?uAieBK)RPfFi0o ʸ"3A,d87N?fr3H/B^vn$nr7ީV߮džRZbi^X{u ;.pxBzNb(I1p$2y(Ko}tw_ \ i0NXODkʉ:Fp2`ƕ0!v8aji@W:CK̪a@DTk>f?^|Pre]$*X,Qd eJ+gI1Y*М/{x\KaFwlĹ!TeCrԦ@2d MCݻgF~HA8HW)f(%0D%l#Ai<:4jҪM,Iz i`g)Oc.*5XZٖ B#)h* KwSW(aAQGEl-(Jc%shx:J7:]ϛ..i5H:0 ̷urꋵ4 N.-2rfhkA/URq &c\g8MZ:޼;sEN, g3,S lJXCVzM[߫bVC@QB;e Q'+N,CND\sP\'Z?*~`nô4I؂PG#LoqDCu2*(?;*^V^iӣ7^$@ JuAwRkHC6YgEAtoFܲ$NGݔ2GźNνRy(z 7 IPt|iD@S佖$"\&9Rr3?˄f5b5"pMZʂ@M#UbwaT1\*oié[WlܜPَ&t uzu8qz &uD%)V˔vDc\ xIN]~򚸹ۤ[gwYpWx/{VŹ+PZ06߶4+MMSꙶrFf=̞1 2.Ͻ^|͎*.z^UCtyDZf:,逤{j6f"nSm+m[]~zqטo?A$/1H#\ 56#{Ds>˷aL}*Zbvy_aIkm w/3nޤ_R@[83KlKW5,}K$)`\|NA3@&W2jt\Sek[.$~.m>=AZ@syå O=m3@cĜE N!HrT?{WGr 1Ok5ʌ#46/ ^$)X~&uGpo<"-6) džowd >}M. 
'|@ɛÎ] :Uh0a(gG9G~0{UtfA mNPmMR1BjN̅9$H̱ JZιT!F+?`UXٝ= (g 8廉ۣ`*+zHOj|4'1R-R3itw^ʻ{?>_}=ʄE):*:%ߊ<:Ac.V2;jAZm1zmbP$1gƒu;ҫ-U4)gsۉ$^l-[bDmL%'HԷԛ8 D ˫(X|oj< +pG_Ǔ?=#3߸/GfYBB㼲Xp"8hFƵ ι8QR1x{1ޅJmm#~ʎ͞ +ީ@ǿ㛍MC.r!ўmHVu2\.89 c4-p-P eeղꑬ4TuE(9蔜ҁ&;e5$&cJ<1\o m[J(p\tڗ#i9WUxj۪f$m-f9&Ξsl~ۂW_gxU'g3fϺ}9{u^Lom?w\9gsׄI/;f K/f/}qy=Ɩrpkt_!6Hˌ Ws-nW?j6oAٝ=;8']Y4g3;9[hxɻ/CkgnׇzˣSnǼKD]m2;.C4oܵ=]ڨPHTs9RUoL^Gr"R&e%>%ocS0E_5zY ]"-1z$0U g (a.ʅP9zM=R#u@<<ZwKrBm;Yc5ŸheNd.\|1k )rFKRrkґ3@ZEyܢevu-Z,W`;ymjo:2%ލYf~qP '˛ݓ筇n0ҍAk"Ho=E:*'k:2RY~9_x%&;)|ѸxZZ~&)gU(y% p@ 8xсِ8Z#i֑~*ـ=7 ԣ﨨GCXel/y"Ӝ:/[ީ媇e@S&HGyjޓOe\yJ5\@ @ "@sUb +p&Q8H.Ƣ IbZHuFЯ:͒NƩۅ~3vaۅ ߸z%j=_^h]vsL;Ϥ xE)bbqܕJaAӸT~sVqp98UFj1^E F;Sՠ3SMhեU>)0gN*Y5Z\6E]t̚O93r&P0Alg&n?sC.Lottssʫ +Ζ &jtUڑvׁZN课R?X`G/I`qZ>?EC^R" f{R/){VA0v=F6:,Ty45xA|F'3Wx瓫YG;xEPz ,7M?L?]^_ nxM|WX.?/E(5u<v;h#R⏖s]GƭEO1vvz%?|m#mv MU^Qy[0ʻsdHR\r͉̃I╜J*F5g:4sbCɮV d B{g|ް7m`4xC7=NONtßЅxfVf1KL$1R+r'͈kb֔L6.W%ž VЂA`Mlg-.TDTJٗ8;^LZ9M;EmaDуZ,7k DoM]kfPg`VbʠuxXؐH$x$NX`Mn58P%#`|bUzٍ~]P78gDGDܦ1ԪI6&x[E'J\j= RưĘl$5l&G8:$qaBZnKPbKI"i sRqK݈xsZUGi0!u%"4∋[W"q-4JlJղrREseV9I$*fi`MNg910S`;ҎC{C> V)t8N˧s5(Lp(fF{$uF/TgdQ?1SAn"xDO>! Ҳ\T#kmQ-h֐cMTKJԃHD '(C:$񤠺CUHfwWgB8`nН}J \Yy^}_>cd\kU%fŇLX;t! hRʁ~#5bxA\,|.2 1T{(ۂ^AٷnZ;Zz=[vss\1F/Y:,|4\K Kҍ\b+ \ +X><\A"x7W-:"j[<j:{,p%ҢC&ΎpvAؑ9j:jÕH0+hĖW~veB,N.մ SKX8e,%Iw;Rlq Ѫ.F:A~󨺔18߻N,{;q6`u~qqyO)@8ХO"{ [sZ'hY%\96{"bv&amu3!C0dlP&{6% d7cU] l}qa}J%R!);NyIZ5====UUu0Q^BP^4 -ʌ%< zwA%#m\'h݅ 29wJWh,Tȳ }۫S q,)(u" %Ϭ/aѾf)yw{ȋRι!⏼N$yFBynFrT@`C$$͆[cc!!ɕSo 3I)1K4YiZdγgb4јϦ|~s[9Ԣ&Y %aOӎIyi&J ϮP)AIJb9ꐢyoI0S\XV}3g1CEzL-[a%z&3n{ǪH 1n# h?Ҫ 8On2ޓ= l_U 6h.p8W]xO6en]su5) n,~]Db;PZ֢U5&ڮAJCI`CnhRi1'HVh-c \0+<2%]+t68՜䁣c;2Jad@۫0X˗"*RaV՛_NSKWiSdp)+4+]EMt}'.(Y34![eJA "$xZCRYW2E^tj娲!F)c)8FD)b)48A`r* Ev%,8rrr@%5];ܷnksȇ"/>D>O8#s΀0:A+}P* )Eo@EĘ;,鉷mρvIanfSk*,wr3J)Bo4+(>4%@c F1M0I as= . H2AT_pNztc7\L5̳):a܁YGC.X>"b1; 3وu <X}YO[6ef}3,d^ӹtEo,}P \sp3mcJyH`xDF<((4?P 9~8B=cçc "Qc"yk -P.rlF'88 @7WfB|8<:)íc=}=D\\cD6c:Ӡ2l֞],*z@IW> +=س<ZrT"t$ߤD)ɵC $ZE"+6tSu ޟ,**qtϜShOQ䘑4Rd(0)DBFfԜ!9Ǟ;̶x0D_ub *t!8ahPv[݁H^ꂉn:}8]h>w#+`ll.1<[`9 $0IEJ!5"mNwN`B|x>CWk {̿_ɴ޷껿O[4<MGat>D7/'d:mJLh|rl/]upm{z 'Ͼ-%tF..}C4yQ>|o~m..ϋ+S{K%Gwqve%P~!&/^.ASyxt~zµ]zkX)ؗ9mW̯n|BLt9߿_rmzZzhGW%bZGWKM- >>0_UR 7|j!?,TI%_\΃ņ1 eg\:j)YŤQiv]M-"w"2 3!at 4 䲣;PL[iHmt3HM Mfv!rk>4\{2C&>(ݎ<Ɲn%Kz Doe9,:hƳa6BD5.ٜWl%chVRp#yZu~1OG}~~?~upauE텎Zy99 DIb"z +I5Ifu:Í;sssnƒU8T׹ wxgd˜sBZW* YH?yB _ yrt*V1%'  C$4If8F{0FUybVE8T(SځV:{M`d}22f@JzP^9+ ʢ]]-p_ul'. i 8^$`\xSο{r0j# 1'UZ)y 蜁H|uT$ˤl ݚiNB[ UPI}V -:&%4Z$c: 鬦hnm8~Z)_jC JHYP Hq6IDho@H~ћv\ &"$7 7z>}\>iBjfRgDieP$D5#$tZa!VgVzR'͉ +PGrp9cў,FUT^ziWMZqt8h3hK;m)I%QjZY@z C!x-,;v5NU%=9 EJ|ckw VIA!q $*]*p0 t,XNB B<#06qܕ:D!jR6חŠ.UePҵ( QNZ侜M,,x0u: $h^@):N'Ƶޢ?;y02T V#=Bf]ƙនT\g_NPMK{5ֻ䎻FY'mIR'^B9VDf"G}GA4#8() ;㌗I%~}Ds,1u҉~75ڈ6[u ಏ8DfZ?|x֋M9I%7a'/g_?O_l2J;CP /+?`f3:_Go.Ѽ'LHXtꫵ>%9_W7,xI 8\>W28>~yQ[\3.p=mF~q2v!^:ҙuӫieIN7#Uvq6SڟFAd<#cfMPs~6=[0>ܚ?p w*b]^Csfp|NҸ;,8_BMpwVINv-`T>a܆$+ vi[p}ܔ)K[)}ye.ggĴjR4YVCd{MzK#_ُ@F,"`E*O?t3*yzfClN$i/Gg\^5F|XH^FIb>$01vAh(MgÙ{z-C8" 7XǸ}Ka9py0G`I!CB8~JXE[\gIaIn-yz%͘}:y-û^;>,ȋ' [wd"<%mlGU []ISa E9ZQsM;K5hMo!JUekW]XZA|ާrw.92_ywr2aBȺ~rfg7Ghee \sY?41[-Jb+y:cϋߴ5fÕ4\Q+gr[޶U ߩ Q=0Oh HK9XQSgu͠!o4 "iW^_k3T%e椊ֶMRD Zh'$Qȇ"&i[+Q N FιTUV*k,P[WZ}煨51o }9 ۀN+c٤p *$!DyDi'0O}ɵO O'<#jŹ͖ɓCNS0K;X>%ocaP. 
*(V 5/˺SBs TTU ,& #fΆg1 QD~D^V6]n}Ϯz>ilǹbg]tJYgTJUU흡BmZ]%vKֵL^o_YUeqK]_&vu@h̓-Xn)+k9{5FG<ׁ)orjNʸ:.ߵ#^.gGoG\hJe(<ҺdTN yNS̉hBq4bEŠ :SՀVGYJ& >MY QyABٜ.bQ'r}|XxdCud994Y喙4SYk]V]O҇, sSդl]cQ.AyI:r&V˜?ZZΆ(|q/GSQmlJPQ^~Ͼ̃5k~{˘+QNlϩBVaƆ#D %NT`Mn(G(y!\u$AuEs?΃ mFFDC`M9p5|b݆~9N=99 \Wrujz{sjhy}n[]fx(y}'5V,ɭhwQ?˥@6h^C-i1%鐜o`G`+\ }7VZ2~kDk$@1yfck>W9 Պя ::\ad":<9V]EeN6Ʈl͜:TW,U$6 E9"`^.IXCPfm2YY&޽U/CK{FIZuk@?{Ǎ~J%``yk,>$EҌ23l߯=/Ic"bdtIK"["*@ 6K,`k X@ڍBRm=#<>|W"a49x}&ڨxV{TFUyRs2s`ݸUD>Ey(n6U<~^.΢Zg[E[bUMxiEلnY(on7` >"Je<|"#&:a :piiW2.x䙳s[2Jn}fL1a;ۯB A,Al)zSb\bF'#.˷]R/fOz%XߣS"0eEL Eb IfmiD]2I::! O!_ Td4DLEǘDRs@Py+ڎo- ݧʎ/(_a%, Ń`*(ŒsŒ5q28%WAI3.sInk[6 1/*[LmD&h9$ :%/e2l(gKR812*&68&M"kR6۔I2dr9[Y-WI+`1*ޛJ`l $!%2AM"%b{KPDї\B H6EݗE~bG}Θ 154@Z s"#x& QU4eswPP+;;Ip]d`.s5#9 %J˸mB0w1J}k'-x3j^ ̔\EihfK2@@Z,kI[{jw})&_J=o׋ן;fڃI?7*w/,8},Mn GGo8+tj磥_}H꼚J !75:uL wZqT4BM-D'fs4Ҵߺ7WɅoݲ.w$$}"y,O,[ٻZ@NՃ%FG*\Jg#Zߔκz:::_\.^94:K6"47fC%fv3/$e?{a 6]C23eptԤ+hs Rj͸P ޝPqPzOuaXje4*4nb%;7ϵ{|lĔ7E GɌ)]?ҪiK:CϲXt8 LҰL~Cb.H&YG3~f0O|o֪Uߗ~NaԏuZlxOxsK2X*!i g\%:N ãzf}=3`Pd/0ʀR8'4] bѕcY&.՝#Z&F4Gl4& ]uq (ID`*qJ*0WFWe%PcPU&пXəز ΐCl R5c!w}KX慙NRo~XkyLr^Ի>Iޘڕ5քjAwi8 Yok??Fw9~>_ VI7 r,HOz}P6WYW7~Qa o2mozOv;nU% +u'mUpJyz7P=I_fʑ E31j"/wZAT&Ĩ e91GL8"h/ \+C 5r6 ҉~_U$,crOm_crn,n dFce/מ$+ėq0JXw$8((ѽZ#{9@U%Z=cR.05BqS.iXEUQy+Rѱܶw䪔{RG*e[*2Ryqg `Dc^9EŒ1{32يlL~ %YXfrRD%'˚ 2,egFS׵lζFΖQő>~8P?-ìݷz1_myΖd U!vbN ]OV\>WFӚ buMڇ5]^Ƞ?!t)u=ҺjZ?ZmkgZVԲpݮ;=o|blB0 7>To>mxvy戻ytCѓݿ]¨䎘tӦXʛᙷ^|ILkLJg+ GD;ֹȕYpeyts(F-XcH:"J9OsU!2U2YRK}J,}Fh*?lw=+s=Lz.- F.80ъyVYsZG UτPe4FIKcYFpR U+[ҬY3s-{&Z#gK}.Ei\,\12of7vQiIz8=ǾxUY}iCl?_ȌQ*:nJ"*E.VU 4@?P ,4n4B wS O4LD ,jn1&':@`CU::Γ Ljizp+|w#BdU݊EwPw'*+3ePds 1 Y$fTҀl$[%bIW^O;g0NT=44<,@& $7 DHgcHKaĊZz;ӊ)p݇ǷtqĚKhʁ(r@RֈBK22{Q+p2:oM2}-7  DԽ@Qۖ)n&p=-6xI]ЍIٗޜ0 :&t 1Z q!/"2SeȈLuj!+Be<;20sLK Z&+Uat@.++X[hV߭a.&a޾緓j<ӧ;U.<ap>E~?х|=K?f耔`x|xvT˚ay˞\faэ6̓b^smۭE(nLaB5`m$qdӻ޴ ޻kxOMy{{Ë`^JyњWJP7^ۑ$X[%q\ydl/2W S*2Į8!_@VI<0Hm"pmعEe4&uhk$hҦ}2ytGN5p wݿ^m {rYe1uGZ0ge+gPcJ-r0%:aݜzf*0ӎb-:{ f4<\^訥s) ֠?MpAzߴi~<4pvP`+^~{\Z^ L,U%~)p}{ߚﶰ4'ҴQ?1GϯXJ_R3wn1E~OFyځ$pd ܛ6RF QbHxNǖ){^[rYTxL1aK3'n@A3'BuݜQ.YVʛl:ˬTA=Gkm{C Gg<|Aơ~$D/Y]D}2qW=Ltp2Ѳ20| YFYŬ._fpg=SfT'd- TPSB04Jyx.<zKI_)ͭN`d Qe+}3]8k X0z&f'0rR٬r}*`6aCeIYӨKQCX m7|O厑4.\O[}.\,$nުjOUdh+0+u jR_/2㓟< `1'8W=yqP֯1^1uy];hQn{nX .7LVLRA~j:$R9=^Z+š;7 f.G͇ݫG?KM&M($Y/hc>Uq7qQ?̚RZjک (o]D(-pzqtm2>~&jDwGHaz/]Oi -+Ә^c[VkWʥ_oƼ"l&^dF}Ek4JZTB!k9ȷMAp,wY$rN\ZWiӕA okW.HvFL evXXuF5h?BxMkX\ ; R gZZqƎ] *js,@kmA)3dW="vEsˎ]MV辳]=Cv%(](G*ણ1 F](-+%+?6}E+ 7mz8%$N6zw;|ް{ck~C`uLk2Y'[4x\7$g`n<4W<-CV_Tkq_owo1A-[ X6/NL.[ίh_IQ,/}3؄Cl0b2)W>m5+tM7ݨ2c s'umPZ*kjT,<i=Ņ[ys v [,]:n8!ft''G"C& gMv0vy'1z/:Mw`f*.%N .,Q ΒlTI$.y&ee5}z|5X 95 &@lȶ]^?)숻__n-_M] P[Ifp* #$yb^c% (\Z@gy^`0ҁ¡Z@NNJՍr qNǾ.O/` ;mmxOַEtgo_cm\GރԎ\bBFZC(P Fz4r0 .{ <9:.ݦy+:ܥhjAW& XA(IC) /G'-Pt'|w*zBbvp<J"䄍"1#!f `{{emxyDi&_)y QR${M`@BGP#D΍WK piJGOa>@C'$_sE`xsmk}#*74.BQ%a. 
Ƀ``%(!%ÄQ9Sɜu23Hqi0gvFMcNsBIMBF&%23U)Zb#_&s=qBd"miІ, A:1mJj01uΖtկ}"3ϝ]%D}aDm (!%Jj &%A/xhIdP$7@$kQoR.g4!55l X*9 q$t@WS0ZZr4It$K%GZL9h*Gqiۄ1KW}  '> NH;60 aM+RT7Jkg %S ,+;v5Mǚ6z Ev%>1-rQIی.d :|օHMZ&g9Z.ȣ;X*cu\| ut!R:st^͊` ntw͐Ԥ8Q$J" a6y`--q% +MOQJL+ f*ZzCS@u`Y993G,"Ŗ&{ųՁkҁuhdT9X("3%yHΒvlؖkotW$t( 2(} FS<ͫ$: {Bz)=$QL:]jF )^ӠLWA 11 r #rd.vz@k#H!iP:R`/{,=΋LiyVPnbs2B0B0P[nsBD&54HKԀOCmն22'T2K9Xj2Րs}fI}[Bک Dq\ T uQy[#cDE1AKLJ?O(Oͩݚף)mT HMru>k\+,r=;?STT^>UF1ZbiK ⤥+ˁgo羙i2}g4}5upj+f l=Khf_]ެvwM9VMHEyY]b5I?{ȎOzؗlvHp^mglG-ɒd~ǣbXX*Ac:@b#v9.]x vl9ZߝmG)|4dSإїI.gD貏l( sns\)grnwkp%;lw?w)L9}~Ewx XC>^GʶNJ^7 ޺]cѸI^ݝ{Kr3g* ^(5Y2߮Hh4,ۥQ?%zXu ̑Iy(3ʔ&dY"TXqߌg_GYw2uVNx;^P Z#QaBP>IH=kI;Pb)ʠj}yCO Wg.I p"x!cAȽDK+nᆩ I9V w{#[㤋 jp":r>N Bǩ^œ/ALTr0C0G0UuGΟ~([5C,WpƩ$enYHTuRI/ywWg7@yfxq,gPڔZ -LtHIX13Yd[//w"G YT.5Q C:`Hrj]5rvH'dc{YjPVx ]Ns'&WK}ګ=c̣G31 11_8jFQa*O&HE/*$!MJ20ld{쀒ނ졪'@9c3زo PgJ KZ6Bthm&#CHc1ў"Z ֈ}v51xt]~o:c+}:+4eͨ]93L':;[68ozj[nm12ߏz ԅ#C3i;AɎhoOhCfyެ=`+U@k[Ex;;Z=!vVmڸkz/i둺luít7yްLKppw~ xgW[TJ!FQZʧ9S&2IeK#Ε`XI= S(\@$|Ѻ1K "Rx@km{C?hD敭_nz*N%t Ɖ7Sf8RfiSzߗZ$|H H%VKV>kroJYI7"F&'/k40n评* h<Vw)|eLcA{gx .31*"'UA1J2@h qxZ9Ndx0L1jjlA$'S$,sNm|7jR8w.+/{)rˍ㤅bV'PҔ`%JqB4*me1Y=歲yϦ$_mz:jP] vohToұ?oWAT:T6YSJ$VA9LJq*;b'RUFN! ,=AU" "GpR 2>FΎ߸M|ІTJJl7yX&flnQxX<|a =d M!vk\ygcvKSxǷ\pd|W"76/ClnݤK;DdO_APZ/u[:]^hl; lܺ{^ﳞo|C+-7d|jS~q9ѭs_!z}&nz_L";=mq.8/'qtI*k֕C<Ne={$̓m=/hPBSQ{`|ڱ<apE杺Qâ1Xlجf" 7<1õB⨢qUp9欬uz"peA0fRro@"J)Tǀ>=1O!y U.pӮ$EZz! SbA@> TtegC4*Bp`IU{V+a..y$):@V_\Mgm\~h*LYR"&1&iDj|IY+@ P8M7o4^) V'ahQ ˲Ю0pC`OUq@*g=$T6aIQq2άF.![|Jrϫg=fXLzb!nd5JGԺ1^=^ՆtxIyMW.06&IeL^4֢ R$ 95臱yp⥘ @׬_ w*&%r(*{.d2f [TE2I`a]Π0%BrLZ:Jx$\"ʸ9q5ͶtIB vΕBUuj+/V!6T.] K )\@B$I<)%:v2ʠ n<,>e2\J@ ihդkIYB%WhhrDeRcCaNA91#z'1AE);-`@qD,SVm!Ip(}.CzfɂcisXfL*7fd2]VnӢr}2F3}aCebA Akyv"3aL,K(o&aP";wSؖ+>U A9H@+8F,|*R&5r)`4^{dx# %@GS tj1R:^J<ӏRa"Cx5 &O.<_HtTGTI-!Gn,4HFj:Y]-ZB2C$s%>`ƙxĶstVV .Tr_usm aL.ْD<k!2Mc4COb@IvѡH?7B9T}jJNgN9jfR[Lm}!]Ql_e4MުqYSSL-& # E[J(؟MM4(usBȐ#Φ XSDf]#b*TWq`&a.u;BcW"hD<sbkJ\*NoJpmYlM!u%%aWYT%\l(4'Ӳn3칺HL9#,& O &a"}S \݉}L:]s\pKpq4;]yhݍ_aYaVzwu]O> >탗(ӵ1 0Yi&km=io@m'4vLKVB=:E4q 4H5i4i4i 91 O9n>GDH!Ch(f#u!T鰋𷾋η;/.]ڋWw^BNsǛ72to _13njU*j* jْW*aEA9TZp`8-ȅ)V`AZ>@gT6;1ZVCCL#8mZe}lyPByN072z Ý`pkTjc]ecc]o^c^ 3eaа2"x8{}4h^]wϔ=g?/gˢ Q]}IH# <V坫MJ1Sz;<1kyZJo觡v*6JZ"5!\A00\u_!W]ر㪫 g+~B`gd2rdpF!,[=!\Ae \'S:ǎWW)%i:r MW]헎UW耫g+uvBꂽ 6Jફ2v\u|vq%MW,MWi2jt?J\=C\yL)`a?\uӹ^㪫du3UZLiUdp&s5OJ{H\#_)y[ vr5?-Sk(N,-p\QNwOVXt`lzћEEgWOx]m7g d٢gvv~Vw?[ :ث26|OmEVV̽Mܗ&a^U1{ʗjKt퓑6EoQ?skSo? d*?.zw;Q [r;WO-@laԿ~ KhjnIsC, #J̲W;=C>f譻ؠ4ź9(IGqtgPf"ֽl[˼9gɦ0o5uq(}^5zmk ^N Z*sT=ʠyJ;&+ Z㪫ے\}\Q׋}lS7alYI1H|ETmhEoVg??2h`s?S\j߭ jYe-ܻJ/?-u:`?Tdыyēsd17/yNEz'K˺85֥OU[1uKysŋm8~ęx?{N|_|2?q`]ܺjgo0R-k3o{jh^}{w@rDm 7̇{f0i8nV/6牛t:cq|e_o~-*0P_;_Cv_DKF;ַJΛ-ȥΨT)G|$)!寧yo]$ic+He}}|겖7 _ZYt \ \}K,*h.䒥a\l;ݵ}ItWJHBhBKWS*:ABD?BߑdA%-{7 f\.6U`*uPD`,hogM%Dkɨsd4ި'=!JVX'Oւ^2dtF J(AV(&'IjѴބ`x[C秧7g6Rk撴qҬ"RE7[!IlǪT ce`GCƦ#X3bf hn$DtL9C)F夈!&3۷Y-LE`&$B;`61R}~X. QyrEGcTYKk^ZMbO5;_+_[7 VhuؠxK8~XeL:[H` )ZS]5+^'"%V㹹QQITM.1ZT+ɑ z[5jr-PZQbPc[M6n "NX G_Gqu8~ Zc䛈iYMjRK%!*2" PkU K }2VWAHTA@ FO xVE6drUI xG-6#%dR13I+R.bB4-K +!oR*KISpdVl- bY|B,[шBEDGx b-Søb;g[A <"Qj`%WzSdA] <ʍ8j}6@\ҠG\hZh EXL%eUpT!8`Z*+pJڬ7,CQΰ쳷Rw (V&BuΛweH]faOb0W K%bĊ3d44pΣ$SR,`yOȠ8W_Ej&!xL|1!_ȱlYu3(GͰ B%.QafjސwS WDHSz ! 
) @X&h 5nf2%곙s ykJZ<b̂#nfQ>T.,b2WCwpȤ`,{.N"jXr ` ɄF zR ܛHSќ*.0S(Y LAxЖe%w= eCྐQU*;億T%b7Fd@O5b-T} Uteh_N>pB]kGv+F= e#cQQɐ#[Y\áJ6ef7}֩{tlBZQV+Ib0۽0 {@BЇ|p.#hD:38oG 3RTiƬ7FG1͋BBm~U8`āvѴfnf.+赫=[ݗ>H6}`-$4>*%$́ꐅFdPf%Sd$W4TRi`"C u2zM"M{Fa,!z;dҧ2HMfhC T—蠻%@_>УLG>5H)"Pa*DEx(%O:J،DAuIBPXtt"+RPl5#=P=ih}NZl,4 ކ'3M !ͱk,F9)I HQMfWbBHJEVAAgUkUPiXY" njpM"β1%VmEs_ /tXI3'YTg& h((񖽅jzi[ #TAH-wRېVm y_q7QT`V- mmˇzv3;rֵiޭ.g\G$YHj0uQ=R7[`3= ]Z[)3 >4{$\;-Y:Jm5EZSR%A'!O9j&h3$cҞtPía#[4͈}R@s8nH/1zsh=T'ȍڪP[hO)1ʕ,.Ԡ*1c =>AdʽC!>Bv#1l]`fV+ 6b4M!k3b?$oQ^Ѭ0p0) Hhf,tXL:=A+Bb4#qc8`yPC%#ptQdU)3VngPKVwmom+wٻFzo z][_nn&k6!Žݐ n%Y/lBQp~zm1NF7l—4Z~Xl"~7jC_BqOXOOtb•^V7R+?rM^5BѰ[}%Œ//g+tzَlm޷]PuYWR?ZiJI=jhoS;JN+PQpx WhV#z6\]6\ Wlbpņ+6\ Wlbpņ+6\ Wlbpņ+6\ Wlbpņ+6\ Wlbpņ+6\ Wlbpņj2pZp_b˨SW ~C>BxЋ4\)Ddpņ+6\ Wlbpņ+6\ Wlbpņ+6\ Wlbpņ+6\ Wlbpņ+6\ Wlbpņ+6\ Wl+-uc2\ic\3W[6\]V,J6\ Wlbpņ+6\ Wlbpņ+6\ Wlbpņ+6\ Wlbpņ+6\ Wlbpņ+6\ WlbpņjB̘ WB"pZpv+tlH͚Wlbpņ+6\ Wlbpņ+6\ Wlbpņ+6\ Wlbpņ+6\ Wlbpņ+6\ Wlbpņ+6\괆;ru0x״Jv1}ַ(wi.7Iz׆b1üSҌ#3eozdErDtF+X~tE(v9drA8&={;"{[s61t"{#Е =""ލi5"2pvutBV=+G# C+BFHW1jla]+Tf$O5_5A(r1o/Mufcbl~o܏ϞR}y Vgr5+¾nM UyZa;{a:KV]2mnu{3T.SwsЮ(oEjyuD %q])W._bw|dfvsOuZ]{mKDR:!=ХׯپzaGפ;SDͱ-5*foڅ[ Hܵ,|EΝIu1վeR?} ZukպtKe$A:wOjN9?sG=9V3跩hQ;H ӕT-ҫ?=kPBZeF$ p0؀h$6vPz%="ܞc+BJT2]] ]#+#8F5Z(eD܎J 0筀+ ]Z5x"V0]] ]9mP#+Rc+B?wE(r2]]]y%ا;3+X hC+BiUPƚ8"`3wEpCWW~P 7;"Z %]?mΟ]Xs:=qh?Sq( LWQ=18$ꤜ(=Y(#i_NE74NӄRK ievvDtFz,tEh}:].1*)`Ɠ]`BW@kPtutex(7t\G# SC+B ҕJ8=""QWatE(K+g8~~Rj~c]yZ_n`b4 Z* #;Me΍hj?BN-s8V}jPS 8rQa: n %V7LWHWF^ѕQx5\X h?NW@b@2Q9&":"%+UvEϾQp͹ǡՃ" eSG7\#X8t"3]] ] O+n^ ȳqJyuWǡg2rhb0AWꓛ>Zp-]N*9J]t=Rzˆhq=> iB$ҴtB? _7>CotDě/{~g][wջy}qDerz;ܤUy}W+*ᾉ>r7Nd\swъ*|1/͕ͱYv|ԯr7iRwwmw7R}MΎ5klDLDv6mo~Ps/LuʾFQͯp((]6 {]5~x{xb{h {3>~hw\YÁmQ^eb~`o7izwdV: .q+jtWg{][F9py:ۋd/t@@9XoV啵hbj"CIksu@rMPC(NANn,;OT,N,Cj*ل٨1K:܇/*>c8,Bk]|ΝuMzz\f[B;[|7kϐ_kJ(LRo}P^hHjg-%zzNNf@ݮAדCW:H=="w# _>{{3oے<,ۈ-l;k|Ex]N(y/^ Im&/iʚ,Jmu:Nڔ4ӼO-^,LօUMvڮWݽг>tKL' q׫%#h[ܭbIZ͗W5?wN~C4~m_3&|Kz h~+V_e`?އcA&va(Jf}-v\Q-d*V,&wb &FuYz5!n[Fyk@h{@/S9ϓݝCf bX;J < .r|kx?zqs\/WH> fgQp{YJ/&ޡ7q~r4p=Pɏ&)l(eyI@.k`* D1%bwC6\{1mʗb^ ǭԜٟ֝G[0.Y}z5:?\2r>z) ̦uC ]BqKh6!68Z=Q:12xLy&erít^] T68͚䉳s2Օ҅ev.#0V5m0'iq5w߂C P_L!s E 9sh684MMM}crHCܻg;Q$H zBEH LkMuhxZ,yøhhRi~jXj 1 DsQa6j͙D3%C2|b,9A]`-e[#g:qwNhwKFW|Q>NPNQi,s\"l$s&_۩3 %8-Eb{"X\d`DԡgM'om} sݛ'of"{5;eВsգhŮ98οyYM}MOOP+ B g/4s{2`fK9 cۑ9hh Zw@$8hP'=M*Cԇ(>&geDdXRjjdbo^Pcp[*ehE)RHWڳArpS<ɡlr}1l3hPȕ(VWVx:-.Cmoz=y +̶a^W=um]c"w6PB UZ?s$m՗2,]`tQ]=hEFpwfY!,dQskjw ﭚW^lPJ0w7sGuyZtO a6jozwϋ6ngX/9?ϞΞ*+Uϭ7ϩGPLPO& C&W꧒!S> Cvi4 LJ,Pjb|,bgȼ&:(ōCxXJG^b5ygH\~"$IATGh%Dt*'g6XSѰ i4AiתQu '^.JQ=0\>{3{x1d8裃_oZ7}*`x#AolB^D^FUT ^ mBKJ[0A!Fxy扏N1t)I"aQIcK 1e )4 $\XJ` Iqa|Ǥi}!O-p`j|\[;ݕσ2|ǰd* TV)O\L.Ā(R{MD_%z`+iG'ة<_SC-U hQrZ}&PK"6J4O+'. 3ߟw&d䥞oHBd)&2A[HHYyI+RN[̶x4B_ iEfmGԝ=O؜Ȗ6x*E ҍ7˔:Zd@1j&`SBiGH$)/>_錷!1[azLuBo{i2>e8qfUq:Fi<9ɴ\'Bo~Ysj6,J1ћ@BEX-^>=aC7d1 µJB2=4PC83 1gn܉ZnB{?M||کnb۠ `Dܫ ٤ l^fh4jBnj& ql[5{4lwezWNĶկTĩ^TTۦ"im]QF:WϷ`EZ6h fli[; xHZ]XS12-婛#kDА4.~4qӳDNB",poywX\1wLѼ;{`jsLz閏=pO p1\`]|`Cm[QGSa`b1VlR.]7<8μnxeD"6FlV3T' nlGOQ%4职IC9!%8175 Rj'>=2B-m(:g**ᩧ4 BQGsjpap%\sCd`tJ90b ˋc58( Pgi. VbY4YQ蜑|uډْ<q$9g-r ^MX2Z[U,IT&me9krV՟$f1aAexPB:AA3(χ"2p)W_ ׊~dR<(djbW\%yƒYkLE$:S5S+̆I =-> )8Zk,c45#XV(sC0mBPvqT[{ā[{=Joﷀ- SoQ\;J"W8VH}4Y-rZL;)$i;ik/"]NX_k,rіo~(:pq Qr-TʌJR%RӔ ӉG{0t!JkY:5{HQY. @) ;:M8RP-m yix( ^c<"J ՜(qD` PAyNuQ'/i2*PE*E@, ޲Ht ő(rl+05uZB ;\E}|D EKh*m٣BLZ;J%iiG,K&'iDW tt,+#3&@kȾ$. 
" /hjX3|{fɜm*Q< " Nk4:a(P@q%cPȸ0ҥLtWOLsI1nFi&%XA@?؆@b!%ȎףqZ \m"Zm@ c m5TyQ<(x['H !1?HNu.8!pc\u"FD8PDOJ FxAʇ9WЋ) S1x|s,ʥ_pQgY]T꜏ro~{W/b3o߾O^Z`:ɋ"X_ o/s ?XDirYoWo%Sfڛ3r29!W{zxzNNޟPRjQoi-!ǫiew ]EWW*PӳYavjc.K;4Y΄UezS,UnSs蝹 7C=;[zs+TY0s|*E9'k7SdQ@4-I}w2iRcIAz.h rTU-@m AcMͯcvhQjA>4tD$@IN*)t} F{g_ A*1nP&ϙ\lJrE# | aje@=dnt:C6Uq@Dhyp?Paax{xR&oQrʿ_<0BȋgZe¡xr}/NOW_r~z Gq|LOY% f(%eK^PrV.k,d/QnTg:໼⃼~=:YMZ]m/[-GA^ _y|X[{h9v~+ʯ+WU~2`xy:Om|ߠ=V2tYr衇eoQ=5zc R/&+˓^۳c ck|PHK7ĆBYjɵ D*3ƒqWB #Asʶ lz(W?OptQw)6 5D5my6ݱAxC=41М&gTȤQTPf2)OcZd @􎈔kk+:9#.*g^:&8*=KnSSKuQbլ+:;Nk1bv^tcicA}6ẻRu&{sf]78Oh|f{a@?e=W.]9U=Il.z|݊n~Xvz{p\=sd#9&.Nʍ8MrrE$b?̻*k1TH%JO*0DmqJzJr˿vPneїGOrx{_?6{nƼy@ӳR7]I)8eD F)ӵjّW0z,ּQmw(ڟ9oyߗ}'c9j 򋎾/Gvf~3!+򛣣?-r^7nٗ 6+Wv  6 wy;7Kgҝyr^9M6lOwSvI#o+'ۓ3+8Ž?]v1\D'G @ɛ ÉS &Uh] ]__qd_*}VՓәcIۜ05JBGKjZI%9Y">+iQ 'PIGsBV5Va<ǽ_YhNv 86/Jָ'EIV<^nѿL|(M/?׀2cɤX4&WJ"FiB/Sh^fWfT:U{yFdsE[[RD>Gr"RUiWVgVΐ}U};A, %nz$0U0g (a. DBwֈ<1<4}:?mFQӇ^UfX;V?QWQn]3.[q5HYg %GP5 \-ؖEyۥKZ勬T/}׭I/3'عvbvDN>w˩z&*k]f= xb4dENlf!ש Dv1Z 83^,v*>cò4ϓKwBbU:XI%+L_E9.c֜|Үvy9)瘔3 d ׼<'umeTs'B\j]ɛkK6/RTGn F,zu_zہj `vZo . S뭡fA^j`7"rh_>7!mx[[j6SScJim@^t=uI-+bS xT\#D6k.SYh*}#>*)MuZ\s:#ÐŌ-tg3HOg'LiCWQxej@j8c,fY ""Se;Y(ƘR_w%ȲSSr]G'yQ\ Gv@f^'t 8PwQ_cRQIRkҎ=-qUyHŒgkCɡr6DZ&ʝ\`d&9esPew1`UJkHY RԑI,Z"ڢI}i~n^ԹՆe6d(tyG1N L*O2MI^Bq~,{:M0%Onɓ{4Ow`qERIR9%O߮7|})H_L/4T4j'"4S4NO1EW.Z/2hݕ+RDb!2$Vtbkr .Q@rVIb7]{6v8^7u1΅];3".0C+>Z5@o[KD*ՐXфoj]´)L}5V˯%gbQ1hť$4`E@LSby/2QxbZm9:]y:"-HC3t\Tz\UNb)ɒw@&0hXx%xq6^cW>|Ȼ Pmݍ ]\Wg!\K=YrFq %q?я,TJ>^qտ6%{P>N}&! Ҳ\Ti"kQ-h֐K䁆AمAAYAL5Q@<ϗ_^E^2*$—2CbT҅0+@R+9JU+O\6 Rpų0J5 }@p2dSդzߡtz]MUnvvE ʟfb,dY"x*̈r$ Sh]+ ZG4dzw)2i~# Z? 21_NR2g'妌JF{hx=9bO/{m_;G nnS*5i4*ŇOUHZ]ίv/J c{?/\ IST'JO*ph{,{eY1;4xEy w7x'!=gzXxW읹] Y_ՄqC 5' Zl4h?sM/Ykӭv|.^18fi VN5Xl<\ RzB jBC4wBCiZ8@y j_!J] \骡РG3 ]5/WUCPz&BWCWd34] ` .Q誡57J ] ]F]qKa K=%Zs+AIjQWHWz#ѕujWQ誡%;]5 9p+8Q9DNW [r$k@t%G]5v.fj(J,t5:]'N~v;=vpj =en0t}-#S*OZO)zK ^O 30fnpBӂ4̝Ją[^O%;]50 ]5<{UPEU"]CMa8 ] Z骡[BWBWhU0t%pAQ誡E;]5:@jZ$u%}X ]S1VJPo8|ܗ>3·vo nRnނxGWs]OtSHèWUvo폭Zpݹ]ÜZ9ۅܽvo$И[`+r{7w;rdAY_|xp"蔫^DŽWѨRs&p |{_CH!: UJ~9F[u^)ܮUʯR)&MKbJzQϝ(WHW0(ѕvj1t%p=-*++گ܂JrM]Νe;zt gpVZ/K-Jvt[1xVeL@S@?juq+Nߣo ')o~v#9'˳lOv@ǞT8_E!dSt*o+W{fؿI?\Z&wg !$g ?ݾ5D_hBEt&_mo7~8iWxwr%A/TrzwmzK}w޻SM\ް n3ý^]le}kSM?H<쉠}"9Kj|\@pRawUhOʩW=:5 6{|ӡ}|py*oo6RMPȫ@x!@ެz׼wK(߾{}&@w+JJ5_A.Wհ};]ӃK\zC9[hVlkxñy*'WG78r0:n?)u}nnME\{5{Bv_g7񙪺ƽx?,,gGܡ|c~Y-#y8!GA=-?>}tgP. ]>Y|[˷IcrlIs$c &KUgr93*7QqJQ^|n i|ao!t%_:ޫ~鞨(=pW9&kt8.[J&b {㇚}9eЌ1{BR*ƨTeR[m}ʹRVUui4FJM6;E &FM?ޭ-5k|HydMs85zr9sj PE2c֎И7&{5\b/9%|NOD4!9޾ׯojuT=Ĭ\ɣa7^QA4P:iʹc0 I]4 4ʚT{}bfgkZA(/D+Ј*3I]S:?M* b Zڋ#Q1'Bk} Ԝ(*U˝ֽS $mM ;rҊJ <}))Dk(5)hK 9¯QZkdI^TC+ G;%XrxQkU {KFC66WѓvzUQ'rd&ŔbS>drMeP_c!Mɀ1!KutALڳk-׆R3uiLZ%z)/Vь ]d0`g,a{PQAQx4@sVs+`5GeaD9qnB!VrYh8Xɠ-] 5![]JD$l ]ȆfPc]=^џ^6-+EGQ6fa=g%!*T.JvTZ[ELN=Rb `; nQ!\5"%V\u `6+ s[2)G!PꛢDd(MW%Ce:|=cYtB1+; fZEoZPB]XB@LQ!ՠB+~X7(c:ZomҔ` ePDdBEA31ۢU:B:÷=t,r<&M0trG !.A 7)ؗI!0,'D5bW r QL_ *A l*3[(Q@HseԸQ,K)@Mڳ<;"JQ` ;lɫ@PSβ6vpW.{(uYch -JuIT r 9*9׍!!ѿ99oʨDX( Pev1!o eXGCk>84i\gm|" ʅ_nЮ{1#.UUQf:G''1 Uit"'D_ 0bg3`\u0(}Oa$ UA7d=2[!SḰK<<>m9@GVLD$W{HVX$ETXPjk=O :#.c؃VsHVWd70R8Xiy@{KH`Ge Y%cԭ)(28hGc`JrAkK-"VwV5\2'ҤSL4'a)#t!;?{6e`]lhaX{g tevzƚȒGx)Kt"!6پ "E=uo=U`@iHթͺ|U>z lwDr&%!IxvSXN^\  A/ \2,߁*QfWRމhFVE TUW)~7ͳM-͖M)`SG !sT־Y *Ƣ-ɛkHN?~l}<1VU"iSYu2CBwUn!m䖗Ԫp9_Rtr]˂T_rEpa֯]i5Rwh0vwrl9jƺp|nDnMhꝮdtI)Jl6P}詠_CZ^SBpC\kyJ+NϢ W(B Pp+\ W(B Pp+\ W(B Pp+\ W(B Pp+\ W(B Pp+\ W(B Pp+\=! \!pUJ{Tի\q9p+\ W(B Pp+\ W(B Pp+\ W(B Pp+\ W(B Pp+\ W(B Pp+\ W(B Vp%x\^> W֨ DQpWq W(B Pp+\ W(B Pp+\ W(B Pp+\ W(B Pp+\ W(B Pp+\ W(B Pp+\=J1p! 
\)Y`W Dի\Aa, Pp+\ W(B Pp+\ W(B Pp+\ W(B Pp+\ W(B Pp+\ W(B Pp+\ W(BU Jܯ`i-O]R[ð-|;a@rH0,paW&ׇ֮Lþizcl@tUk5*p ]ZEm骠_%]Y#+[`U;骠 CWM =#/NW,_:z\e^藡GTT#J!][i*fuWW\Fb=~xf4-S# 4;btD0K$c,ue6=r<>Zi]8~c0f~# s5MA637$~uQJe"!m)~V?}`o Vg ]/B7pΛ~[jTxPʥ>#(*Q+ʬL u2G%\>Ʊ4-8Y>W x!UI>z-i`oڨ% ރ&V NbeU09lQid/epOЁ},q{fZJ ͤJ8*b2.)OͱO{sx*:ꦕWA)E<6@kɴRًi*O.iA2BZ( fhe|(C }Z((¡W8 |H#҃WUAk{OWpB8++Aef@t%"t87.gCVUA^#]IbR2eg.heR WHW %UONWeߊ4 \1dUtUPY-j1 U+]cWekЕFk= U"UA;A:]6>&rt(B4]=/KWC_,ҏ+toSBJ$]vDY(E-ej^Sil MT Z.NHӯT64 \nBW=]}[5Y[ `;v8UAz8]}Ez)DWX(*p9 ] ʾi%ʎ4L áfUtUPZt JU~2TIhONjGn(f kG쇣/n~\ۅ6~kqQ+ Td[ X fq1/fr W>:8<קm>t /ѿO(yZ e)@?\|%L3/ NzU!8a|̞H JQ_W}mNݼBz<]_B;epvnԛ Ayc?m9-||;loYxz}k~>}Ik]79G>ѵ;7Y@eWs؀dƼMTVl_<nnzӟtpr{$>TRy%U<ݧMJnRDg>|z99tY`lo2o},}ć/Ƴh>bn Wv?S{ _WOlPf`S.'Fk6:[/RZLڝk\NON$4; nx:8Fhf)oX:\}Eqrh}_WWhlhV }2Y2 !I:|05u)Q "ZMpoli)Ido[-ђ(:+;BxJ Y,E2);t-0 ԣY[62KY{#=&ܼeAuC"T8zp 8d OcZaLOftv.gv{_(0(X|2% 5H2 %τP,Y OR cD둏S}Is^ך lX{>ZvL90nBs/g?mM4Gg'=u>( ]\A1tn{.{~ˏ4R C< WɡL@'@i[s8& %eLv>{q)aKasfv>\ʓ6qxy6.25b6hvxç?y-/ oms,FzO4cvf ~ ڨ<QJOuLyw棻R|4-_'x<6ߦ%iմ4`}:-0Б,Rvqr[[oCr!<w1,hj[y:.b1>\5rmP\iߖnԕbB TmmWG!>M*W(.*}wsA fQPEu%WWVR ct=tw.G.JJbvQh6<)@ECJg:Jddd|ɶި\Nnww]SN^wL-~|'/M.VKs2;[ܘm9~(+?fSĚHc9ERFR H*s LKTA22:^G^ʘ9e@=)` dѥhrNR)\;bθ3@.|\FTmyY"퐚"_^d7'Tn2]MƋ2(f%Jpg.Ki B#AEeeȯxYk(ɞpYMtP+5Q{CI2 t؝l[)f_X3ؗYǬ͐1S>[%#C fB=i1Q@*k#ÉE2N0K]# Ȑ!2%f8 >p>fI:i:χuR_{ 1;}ẅ..gUщՉoWKeZɣ,*bU{25I NmVm˅g|'dLh4'̇|4^\T+XggV//yQ /"/9兌FAh֒Br+$ȔViNb؛ήc_>܏օۯ^,yG;8i 8[הQ(~ʐ%.;V|f >lZ(TeZ(6TBk]`*EͨZ AޕAA^ArHA890.@$Ų "!LaS,!TftW3zݧG s\_V a K(l{Cwb.g9>﹃?a/1391$&LEaWI 4  PKILÁn3`xgU*ku g(fJE5՚1we@C 5B|6[] ֿ^*&s&ܝW`fB%ʧYhyJn't2R'TRFWVTHUS ,wI-ioU, N9{Hn`?yW{x&y@‚GQWgFZ{.i.8hn6Y$U:֠\R:qsLiSģQ5:}Y O;妇[~y,ׄDyjS"q'#LJ xe6 d #E Dzchڡ# ٝ_,7(M4qL&dҡy(jWOv9# J0@UHLrOks%YeC2F0"zP#"JoΦ։W$o^cIg|7Ϥ0ɬsM<C.hRb+aa9.hELM<Շx./eW& N\F4e$qPV 19+''ÒRe3;vUZ<؋DPcDDIrIۿrLH]5';%ɄeR.U{3ar :s9#Q>9>=O6JQTCX{UH(.Tk@PQP,*%D?i]yb?̼ qj4B 9f>p]oYOQ\(is7n*GHVF)&!|YNnj9o8bE ȺJ+͊SJU]E寮 ^da-_o Î`SOvՑ]Aȡ{xҼMarW\cO9]5r-hRB\]z: /OHY\}Ŋ+l]?6t:mlpyׯf!$ݮ[{=oCS}ZLJ~9o[X ϶ wlLq 35.SԜLtЂָq<,}.³mE5𙬁2A˾ϋѪIP3q{ 0 lRۜ?xU}IzO~p7 ;l~3@sx3Gr }A x8(YatQCG(=RZYg]omzuX^5%ڈ@ռ-ж颹luFhy,nj\qW6qM]Ώ7?P'L6jJo7hok3jtЬP  6{95 _N࿽GlԺŢ:櫏k=`G5צ,'UD)n»/*_<ؚ?Y $yQkPbpGUZ @ ř T4,HMPi-M J..%LmaC3Ț: }fx=[odpQbY{s/mRos qecBxfTQih6x)%FzULmH2.ߍ1C%E% :/2&hx(6Wpa)Q4$u\ ?$7GoގP[\\ rw{y||4~/ 1Y٫&9UK:1 J^jQW `8>Yz:q8A<$0R @`UNszr)\DFɝu)ye;NY8sxg%RYI& (kDC2GTwF)lҊ:bT]kb]-VG ! 
CzPTZx$nd!y6hd?w2e0PaFEڨ(š)XP+I ţ'R H'w )O;^zG8\flptO GGɅ`)PE@ywB_Sp΢!KO8u-}]?o/z_Q!`8a D5N>$}AK wS9`4eD-AP_kDs5uƱvq8b FģƐی[β?l5\6#&p5!hdUj˾Wmê~wTsmk_S=MsGڶ<_ك}|"ZcM){D2ǻwC5HH߫;ZOəzrUސ_mJߦ4\B)Id-}u|_=YlxwgZs5wCLb!x{֑ R$8l[@[n: [Q&mfV/ݧ-']<wʚW D Fgb3뫉8ZlS6vVFs܁MI@Fʤm͗ kR*c@< ]aE噊Jx) )@}T\@ qi8\+2tV`|s}(5Ǝ5xs;g)+ VbY4gR2<4p$:g$_v""%T$e4$5LYBEN KYf@kK %$R8qblIg5]yHdzb] hi ZRR(C@7H Sp9)-ב@I\e[g/KɃB&zUҚj'`,ypVTJ"- Za|I4%þ3:HӨUB .1F<3 ke" =7 &i80+;^W{<^,ouL'N DqkEpeh$Z*!x-vRjKIvV" ʾ$ѣC!b(+7,&1 ǭ܅$28n#>*B\ 'Ri@*$X뢙b2a:(GSalZJ0u/˯]\]Tu p@iKW:ԂDv;8$t&lbb*jhB(Ƞ+5PO|{u@pPvp@1AyNuQ'/i2*PE*E@,  ޲Ht ɑ(_ضriԚ:l-rqqs.AE`"?P>v%QJKe)7=$+b/gG+eGw΃ %A钏W%n4+*:sʍzp5wdYb|A.oF(j9ų0M\5֒ aZ‰`F% )(.3v9 {(={}h.fWAI4hVjRbE* KdԃmJx)6RY:li+B B$@Mv`L&Q2ϝ2>oQ|,!bYb7E'2oJS9 JXOTl~Lʭ~̬NY=z~m~^4p(aS~ܻGsɛ\jtqǯ.3gUn-Z]Nj&}ߩ(7wuxY촱ͼm~~0>=ZY_IմZZBW`Ά.@&ԂiͯTF78sulhjN7x394Y΄eiC_ooee)gwH$qM*:?fj<|֏v[ audr9-YwTRGGUT["u{^m}OptFEFCf-z~3cNܛ܊OlFr˅Tא]p ,ubceaAn|npwV IæcI,`ljs?d Vn '=1Y 7HM)R3@U0m:){"CD?K+7 b{Iftk h ۏ Ƹ7*񮶳V[;IF0W5\FcڡF)Dޓ %9ilOm7ƾL;j:B G0w0u* N|%I3|нAԡGhUD:ID@~rR /4h| ~ YER;LΘaqW`0i欢#a>4τ<=;gD'3cvLU1w]YZ"^ P|A9E'~X&NbF0^D#y(\$DL s`,ETي<)q_JYB2wjU7jWűgGÎ S -0-Qk󗚫Qc ?XDd&qJm7 "63ŇTT]X޹WPj3ۼ&yZ2>+,כ;T^EV*?xl ozD3 m6nH%Mƙ:U0F6oa~zx/_GV?4SF-ѧCH:a*CB'\Rˍn;ȅ肴.Hx6'! X[cL{MV\G5qGPǶqHO|ќ eJͩ^{pQj4H-@h.f}m@BiP$1$ǛȥOv+*'dLb NK+m68T){46}6&-/@x d鋣 9\Q[ Be.9LD29b5qv̐xQD8r^rqs_wGs*|o?5k 6&#I9#B0 2ff%`&'YTVK&cvJZ$`/XW`oMtU=~ysUA]i]_w ³][isy?)lq`r`x _=Vڟߞkܕ[=^:ӣ(>'_X O@A/|Ikq]&$%-4#h%SLd[>ZZ;k;\o:cgs웅F/ BO>c4]C^=o{tvϞhw;wOl~SSD-d49}R]՜ 0.Kֻ݀qe|=YJϯ˓ٿzͭ<.ctF"ʸ%?Y9ZQ 4ڍ.Ycr}J"Hw|&YqB:2tgmP ~'Z[hd?\l{C$d3ՙV*RJN թyRR9!u׍VgriOo[Jm˽Cur'vm˽dm1F{ 2aieڀ?+!K,d9rV%_An[ WX:˺`/3qeoߙ3!T 3͇䣧D15a;V-*˜m\|VWYBRLJ6)C02IkC;xz\zPzFz\RɎG( R9>gzwZUyNȉt}l&ْ#$Fg}}`F9pʢ`)*+jP@d]b{fҬ*;6{U.?}n34?Y:}7֗Ah?j*$%0uIbz:|JqB2$0j<\WYq*2]ϕ!)>Wɕ1;Nyǐ[_'Zׁ牋uyj:p<)-\gjߩyF Cw*7烵^7s&9!Մ9s&oA}3g[#=Milhh(ܒxBpf[!nQzv9lZuL(畣ܨdڴ3ޑ̩u&f01OCRUN Vt٣G Mɦ|pd6h\{L?^^]_Ra_3A=ߞE oRD@8M*Ն+nVg(pdAJ,S28=3Pu) ^Hg;\v/×5p\ I.ɡǶ_9谉i..7r+j<|].AYFBv ddH2`(OVLQ[O S>H4&&q"QY <)1ZMdGmyEl+1lK51[N$c֋dRv*8ZZN(erN7-Hd|MCUM u,5Fj7u*˞Х))2#e+Rt#[j@ݒroj)z뒍Y XcDHօBf88K&R,Kxܗ@?r${16!Pl5lMٖۛV > Á%K,9$"R$Y">I-&I+Xa=]Jsq_Gй.Q_>ݖ{nzO\" O熥_j]c.p7l+.M}sSCE24kHxexcJij\ꍞ0Pttѝ8a\" Cc1ZL]JGOi/Tj$dw)YYoGXR#==dcrfi)l`3)Z()RNA tT5L[bنHy-&'Bɵd$Hp1z 8 5}YDKӤ.{W =放|c~VVt:|Ѻ'FɗyݰHzo6o^s'VzƏ\\I%;.U{A͉9LbDS)29 T) &et+0*2*sduV, It|ԑRDNQq%.HFj[Tjc•DelŒ?.b{j{4,>74n8 /_ #6S9%"A@\p|X)&GAEe@"x-yHidz sQ0IIQ`S*xQ unPYM@YЦ2bWg;b(]Ajڱ/jʨ-zg쳕CO[*&G WFӍF, ΫaF]đ̴ ::YxC_RV 5Qsy*^9g*"ZS7N\ F "I%ŵL&h4ɫ2"Vg;"\$jO8mեXg5-2.B=.>(Jg$VZ̈,HVin@Bx \Lv싇Xq?<<,}jȭ?Xmż $>u1)5]dMoP sV"K <[З'Qv'9p/YD,sN;r$:),KY_E|dF?{6俊fnqwW?w9`;p ꇭ-y$;pIJSeʖLL6EW𚡗yHHG >KmzH7^m0 -pH2WQ ICtd`!%K^D<8MRg cw`dh31hmPD̶ȹji (As$1Rx8p 1CT^r ,#;± AJPf&ӑN80q)Yd,k +YUbš@{c]]VJ T] ?=381'dr_iɋos.f0_t&P.rL UJW2< 3ksFq9JWH>U\ fQ]kŴ 1{4>xs֥~iǰC+S:nd=(oK|E=79DFLtBk *pXpʣa|FZ%oc||&y|C 6y$$hbºʓZӉ:Ì1t%Jzz_̘{\y~kôRE<*L! 
L ")eJv] nt )΃)t߼IP>HR M6eɉTAzڐ_aap=2i}s;J9q7OWO{C`\k5z"8@G>J(%}>эG`/%ɞK+˴dP!_i΄%S=APbߐFLAc!qވVt$ƨ E^ tQYJ_$ḋƀPF*ZyAd` zQ2"{}F tiUNT-O?LjTM[4/=4åeBUPV'Ujd (hA&er:V2bLVF>e@;o/ۢAv/Q4v@G!t ࡬5۽#7Zi {.ǿ`pUJ *oVFxh|,o:>;`Zj$nGZ77 -$3 Z@D%K{eCf"K ^_y뗇` K++y%c^:IJc&*22يuG,eoJ7c/i*,oG/kГIZhyi)]uۿh> T#SW鋚G3;L$tD OUX*]BBAU:EE<2 n( \,؇Akt ![JRi،/8_4]#)"kٍQmHD{Ԑd c&(E(W 9#?V`v <;Kq\pOT4rkh:5kp]Aq?iTcm秫ŝ,>Go)P{~I'KvD-U֖SWȍͧ$uTuYy2{K6obPћ92rftf=!hnz_|5[̼2r3?̦ -nNzmx~kN~m5~v=L%k.1tیOE<*瓧Um09!pLwlouSTvW?7)GO^E@eC׼+w+mkOwL,'jiH9NT"3{=Z6;ˈՈ}}AWӜGz:߯_/$}2]W4w.~4OB{*e4yX]N{A%4[s54=I p>hvv;p$I̶U*hL>117l&`2]q1ŝy_ʊ_ã1xlvجj2 6/?H!E ]Ö7x->j:j99 SĨǑtAz7P1DlNE Cieoznek3q'\ᚡTŸ4txKo Aƪo6SbI@I J P4usmm莴t'DsTG-,"0V1%/,f3$2Yi9,І'p3$+eM([RJKQ&1'$ T{s;-:,oõ Zn@`!$Ĝ pIDloI@Ⱥ/9$TLƉS>|-*-#?*sƤuAgcrǬ2lNBG:",j`' vRx{Ϛ\RIK9њ^:rQ" Xƥ%m>(^Am;n8?oL/}bNC4Y' F3Jko +e,+7H[rKBz~(*|g[doeZv"^H6d8IƍS#iUO6$-t^U8ԸYy`yA 61 r #re.vzp֐F< BҠur{AwWy8MRt0 = da ģrr"rD6IDv Q;YiɒQini+A59jSxX ɬcTk!(|_A痀(缏`,0GZ#ctH1AKLJzJmg|9*x;yd.M x?^`rqfNs5OY^UF]..]57uzmy74{5U͏mO~FM_~L)u5WJ~h^rһ8\Odŋ/t̫^[%bkZẎ/Oz;j+E~ׂeqT4Bum[v֧4{wJcFJXm]hFTˋUlw4wg$$qM):z^ybg)h}h{+wD 8,>]Ĉֆ-'YOTR/^]1)l]TP C-{RNx_ްۙ?|i$9_,l7 Я2#u(Y%)1`[rj~jw6''*@''+7vO[{cV*·ž>sV[Y1eܥ$HLtV5\OΏ>֭aý4'~AyH##KdQDPYg?}oyL,)Z{ R \fO^% SJB)N2Rj#AYH/F(Nbq&iSmBcLGtMcz۸T!^f~Vmy}*t42 }v\tt$t@A/ӋG8}9x64%a dQP0PWA݀V2'mtt9/^Ku8Vvx?_֯1z6)Ē X$ƘN;X 8w(YSh s;@:Hu.3+"Ԏ`@y!w!BrQ„EeQ6ǠIS $R^Sk +H6P $*mYnNlt-o 䢥 ){F4(SbAet.y_564I2zݣ(!<(ڗOgaou{3/@|~o?k ^k 3fƐ*&V`%^ W*ki=Ԥu-YHKTL7+=<DŽ P{zo|#0Y˶/ [X?x1Iuݗ"Ezɏ~#,**ppiJu/gI-c~n}3tym_q,K^Vp>wqnNF?/?H[-dyypu@ΎDr2. `9_/% F5d¯x?bm]_\N֢h fzvUә85l+̳k"բ,ڝֵ'ӑO@֗blNmEZo6|װz0۰=-jl}_'Xhoؾ]l8! =JcFVu@jy_' -BF@ۑ*-:} f3>n[p޻GwE/-%8jB98Ay_,ҵ GшOr)۠|N ʪ(QIιXT*kХG3qvxnYuhe?0]xѝ/Tj;jG*)i~ >y$)V9Ĝ[Zl:\/f@΄"re!sM.NgMݡpfqw8#z=.b@d튗j'6^`(PVHo"ZgRdл Ib(]Q3!T"'KcLGd.rLuyØ7Ni'/)/+vP5*&TmUmsZW fG«H"" 9!z5\*•jwLn2ȡ)ǧ=M`G6WO㚫Ik8iR:/sEO0W4}^ %퍹ڀ!AOReF\-p1F^vWwCbU:vIZhd3&bP`=Mﶙmm/S K"er d"chZt"W\P) ts6"SC>)KKO336(x8ߞ0cvzv{} G](TX]Z' ‘PIyʩX9#$g>f2mUu $K% Ōie)eO6ȮH0kgD'b K .Nx幅ui'{vD%-:,dlr=w^"6y(Ɠ_/\A uR%^`C0޻I!j i2T|[PڙBtM6bdOZ28ZggE:<ԁ[N>:iɾvE`hkd/EᲊE[҄QȞ|?뭳` x؛xg+bc{u,;bωb| n`?WWדȪnW{7Yݞ)eעO_L>9<qm6 IXkjP[emޱeX۲mTk~qg.vqkCR+* 3ޙ{3xy}3L~[ c[Z. 1urvQГ` DrH3AQ:e $ #kGBTO EX L&) >k.ET.&HkhIUhL3^\\͹W;va]}E@l_1ŽSi__zMK[whs=ʩ)Ď䠻cEٱ)W|v~%h(E?fc{(P/#Fvr4YC- (W ibDI:|fȄPCnہ=BɃQXmU0QsY @2&)J9: Adk JGd,;8?G`R>/F̵JrEw5E!{o10e7^~c.Y`ox G !;]$RM%"U% 5th xr !sE5:ֲw%"T Zc6)+$`1$K$5{8 | Aҟ"zr'ZEvx=/-)ITD;6R> 02oh(V&tB]92+ۘGB(uS+|H\-R9IZwN~I+F7ai2Pʐs(مJ!T1KaRg d~ͦ4̱ͱI5sd@~ǘko:(vۆj8``ڝZ{e_LC t`pNF޻ >`ѳWYG˜EKE6Zl&S'H_COu-2A:p7^h ..1V* @F3LCLc8Zu؜T$j|ADʥX݅"-|RmHZl:{D1 ,:v`Y_R)ȞRM ^G4ݍPrgܑsF#:Vc9&gqbfbZ*tPt\Y' 9kÄA ]ΐc7Rw.=:z!t2?:eb65KMABP"}$Ȩ_rJ*2WJ[ %=%c9w=dC\z݁-|e72,F 1y0Z[ (G\R Is栲LV95H笑ޝm^.c+KUuyj-p0U >]x0veDkBz`4Mrm~m98|l*)>丧y)zڡXqcAcoJm2j~7ŮW+^ XL:c A=A;†mʏh^pZ.7UtJyX,(gBG~uVoֽ+K,;ey1 &E!]a)늇0h8p( t$Pn'"V `oc!#6L?:|vEn}-7l}/^͈3经N|_i-rx=祠Vs7`)9wkk9wU̞J!s7s7ZȣYjk٣Wol6̠$PYm*H+͙0?tzXxو IaN X+ eY:7"+rBHLfFecJ`fX9 ]2 (BDF(ɪ0E)/GbtmuFnt${~ ǍQ-[3`r;Mr;,ce4vL| m›{o ;IBJFm-CZev"D5oSo=MvlK9a6PF#wǼ;Frn;VVis_>]_ w`|4EUFQy+Be2_Ln)$tS)\w -G877L3R 2&q/Y]9ŕ alLdM zD YXf12CpDrAH-9D٢NzA 9{FXmMPgYj-:Y%႕ 3Y8*YNU c3,KɖLJzEtgQ{wfdʹ4wF ̥}>\Zz?] P]/8DʀZyy vͿ}ql5$..C3V7Αq CȆPa=4r7a\l[ԛܨ~ǧX|J̧'6>XEl2dןw6d-=:\>xU`,ْRY{v&Sۛȭ&$uf{ץ׏m9Ĭ!`ƒN*& -܁'g-,{?#KKV*yE0+-(t"O6j[gLIn zstJI2.3B판|<(ݵ7(ZSTܕ_&*+&eGsZSdL+DJ +nd:LiU=4Ԓl9ABLq퍉ߙ&-O$Vܸ;kN4O> _++? 
kW/"U)dF&F0;$wGk\q-cD >_iT5qoG1`Yc>cNjR,}oْ±HBI{]k|Aq:ґfQMKG\Q \\o=aߎRD=O+'-QN|XMrV2xN&t}Ht0C[4w.~4˿L&I-r.w[FLnãx%4s=4wGfsZKom4<2IM2Ue#kIg44MŤ{,x2t^Y3 ]x4z͡4| ]Smբ\Crk}@,2lSZ>R")vY[߀9~p t*y9*D%þ31^J#!$f^'Jp` 7~S]ܝƆLrztu]ux)z9 ПgHPaNjq2sĄYq*BV!ѯpPRh`Kc rũ;b0jɓ?&pJDF*!za ?0#!$EAK݃#(,J2L,[JqI{J砉 . 1'XLFrEvFnZt Yލ6yg 3QK/cEk~z^{fr3m%y3ĖBB>z|(:gfb> ţ`*"ZU̡K%d1pUf:Ԙ^r;̳H{9;c Ca,pN DЁ9% 7^%LحƳRh812*j Y ',1 l.GnZJX:#gO9S]KF-a_@`M1$ 2J3"%!#F_sYBB^ۯE~a>瀚Yic(Dg6F)H -,ZA8vRo'2W$OFGg3Zk 5#:rrRIe\Y61j{щ8x=gχ^t##tldZrGf)!$ mb q2^ںm7SJ, }٩_g,roYOzcR(}̐ Q"h,:Nb e{NbM!dUdY89tFQP?h&PlAr}29VGY #u"4( Lvđбo:`="'24qJ z 8:$rr,QiWY=zMOʿٍYgТ$GIT|>D.{l^Pd8)7A .$-4Ӆ*nf5%|{Z\R(61 sT rU.vzt֐F"dC8 *5C^P4h7aFFbZH<Ӗۜ D$;Ť ˩V gaqK-M{ RY(PhmRVSY"H9'%\,8o u+HG."pyE& DQ)9 $ I L?n' ^N3}XzrB2As$G}y0'-/ZAowf8lx1Fuz#\VW]>]-wRK.?3xW//|3?-~d]ko~مKioxz{ôQ[ w~}<~`VrC3Gu'.KA`2;?΍v_zomU״󻋩oFRZQ8o|vy3OO2-QG>2+ep~rP:?_Y4_۱ ޽S+UlK^CsE[ZQqvf}bfA?c#~dO]U-Y;Q˲Lsqe6٧"Y^^ne薇NXZ=Y[m@n/bӏ⊫|y{;_h~Vt{lx5ИcmH>tU7w5;dDp:K}/!$\)nJ@ej[5 AUwj[yݨji/OJ[FpwyΔ|нBKaUN"'(%N"Z_ ;Eogmyxr1(b_뛃e@]uǧBzl.柞zv?]|wxY'k_Bxf?]\y6Ot.&W|]dKʬjkj7;8 7nIV :&?[Z`ŝP1!:nR.oG'y}Z^݌k{# +ݖh;[Nߝ|MEݼ:_p|21]/daFח~}88Ľ5W|wzM{[<-QhDuq|鍲o 2PtmEuUV j<nߟg|iy'3Hb.ߖu[f=['Qܙ2=iOOK 4-vr&Tg)%a:d%P wu E J~LGmWM[v{P9KRmuaDeAswo8[̫.fSD7OղG\ݬ|U_>3r{jdEƴ U78+G^յ'ՙouw&4&}eQjx}vy!/[P8CmƆ]{Kju_7uYC:he}nh;9b!.wQ#躨:.RƢ]n] :wXNa ?{nCWР7ŪAS}$V4e* ڒJst7G+/Ǵӎ:n7s!_Nid{/H͗'|@V@v;|% &ǫ#ūR{Wo*9)ߝ)>7;RIkMPXlBW\Zv;)-Btd]S+>nl,J1N?Oi,fNgmtf?=ؐzx2yJ}Y\Wn!>EdR;]2ĺ Tw6TV컓~=#ܯ9noOU;šf'|Gh>7p6n>j{cU?pH?qf@,{9Y0O3\o)o#wJs=njۺ}miz{$zF HkHdaGZb9(#N%r?~!*I# jSVըBܐ#T0(q ݠ D4(Q 1U|e\pֻO|Qh;mLJUw{ߍ1u(:c*׹:BQgL TU9u}?C4v] *@@ؕJFW +UѯQ"3芿٪ vl] v4@\F@Z7P@Joҕu<6jGPnǙhطWMX- 4X}tY“N92&f S{L0SH#!Tpo޽H!!K3y1}YZp5biuj-=EKVx\_Zg.>q\Z&?.fY7o;T\/O'+*ⴽ)SGŲ-TD]M&ݓ f`gק|ͮfѧ{yaՀ|y 6/OY5 ߠBǼ-ޟ\^Vy7W*ju~(M7q*:ժ44Ep{U6(*͓u~tܙ6rݡnulL\Uu㘽] m)=buL!芁IGW뒉6PrPu&4 J}:fk D6v] ɺ/{w^שTt%wg]}3R[V:`t] }j jDQbdRYWV=uƷu'l֞Nf#}P^=U5_/+lF Ϡ/lꔯtL7q!-6捣ކgXu,d6h`,RBC Z`\7h)>BZЂrF[8dtŸ:] ֱJ(c[2Et7J'+ѕiF+:j"ϡuJAvJpKEWL!ΠP)xdtŸ T*;BǮ&+ ɧꊁQ] Jh]WB~RYW/+)vVPdƮAJﳮ&+hsuH*] Tt%Ǯ+ToGWz˪ׇf+;kG걶F"ҥJg]ZzsȦta7]TG2 o24;Ni*ĮiX kzB6>Wd>w̖3'|ś6矔1bYƂ_b^ F';ss/zffQ*eZ4, f'Y XLFW;ZƒhIE+$u5A]h8Ӊe\!+&v]ɳ1 ʠuҕ)0\gSѕrsDtte{„t%o7 C*}tŔ))!vvJpNEWBTJ;SԕWF;G+Ɍ] ~](κLB`JgJpӉF/&>'2|;-A+ `u5 W<W]`(jքu%mВu]-@ \,s] åan0Yޕ+us;hSҕeѕ×-4]1sdAWLiltݪG_7/fV*̖Ǽ}89=k}w%_W~>?jQp=F9A-?JZVOލP'~Qy¿vH_o__E]^i{[zDp&Ƿ\yyZ,.X:dg}և ?YU^էG>]m^_{+~*Kz+XVUX{,ͬMM_^뷯]u*_z_ܮYFtXE _,[/wN#a [7pnȧۼS~2o@ww3.#9zZ[$n73AhE}YJP0?ǥB[*K]4ب mjZښm&~{'[OW+$ůaV}wm+_~bpE]̋xKU]5@6ZUAJU㼶T` F16kM傪ʊзP9Se5 Me\YU:6HƖu>ٻ5 ti[7 s04qcqۚRT*465$>!QZ Ayum[-PWC6\KjjD-*.F\k22ʺJGiz 8;>걶uMuDpԨδFWIQ0-Rt!5Mٻ6,W}z?fI zڄiR)'EۤH%l3IGVս][#SmKPE1cRN"1-.bnt.4B%$k၌HfYy[ cVB\`v==Fh4IUdPIɜ; #P2B.\UeuK>0fg+ZAC#Bx``Ftm_YRXXnj,@ɘu!gkAO/Ƨ\tܜ ARUE8USI{k%pN4djc9\2'%d >"fO savnE);IM -ڒBB+4V Vh V\jh)AxK dֶڔ^* QT3,I:gEBO^>Y0F.\>E=uk,RrS@w"R V`fY)dWh! Q S׆RӌT"yˌwh|z*`H9gg'q̶WlBp:^vNi2Qh?k0QLnՂ!WˮT@cјeı&6j ]SG:+TCFֺzz|K05n `S'֣)͛ݰ֞g*X*ȀJvtZ2ɷQ LI2, Pl<k۪nHWLFȌȆv2I4p.F9XBPP&׀V߄L %e:|=@cYf22yn vEm%j|eY z3tWh%Wk 2N[r`5d!Hc7uӝ1(QdFt[sꠠf#n4vD `L ӌ/%RBbYLu>A% `- `I. pq$VZ!hަV0J8M=n2X7άhlޑPS!HvTW.{/$tY ='gt0-(WjkIk $EHH( 6jKDp1th*мXZID:38oiP>,f䥨"n"#ŘQE!jrBQ&}Y;a2P7m搮t,ykW e]@!PIX|LxK K8 ,}vtV2Mt%CRIUFB9LT2'Y¬ 0 JĒ0ŕ p@w JC` Ys6ՔVnӥ!R(0r(;flҪ\ >E!:MC$)`jI. 
r6R ]];S@]pB{ T GV/ժ{qzmeneӝ1ePJE?n9qo|_: Bti> Qd?T||=i/|~ }O{qV;J`w.lIvtnH/cXԶ.^ B?nzőlWˋ eic;m+\l_J#m{߮_lEҸusy<-6m{>-8l϶@wR n!/ 7ҏ}B wOF@dy!"p M(ƍ~yB(:ԩ8(Z %;Ji)$;(v@Pb;(v@Pb;(v@Pb;(v@Pb;(v@Pb;(v@Pbv攜@[N p8'IO (?9>&Nb9:4ŸgyWk+$Yfq%9}m߶}^Hۗݜ E5oG ,"^-07pm;\οܯ!7E*ê3=ۼ]\]᯽;6rҚ1 P`f/mIM>"Exn0 .4Ӗ3lQwƿϸ,fCy׶PjQ>+5G'}bdJBP} CWǬ,].}~J(yhPP KHc}NݣGT8,rʒ3cSѢ[3T-{qÛ~Qnl#DEj/Nv`+t4l1%'Metc]G&^~g={sϞ{ܳ=g={sϞ{ܳ=g={sϞ{ܳ=g={sϞ{ܳ=g={sϞ{ܳ=g={sϞ{y=)y=o"p:{kh}@b'; N v@b'; N v@b'; N v@b'; N v@b'; N v@b'; uuJN t@AZ@@ xX`'; N v@b'; N v@b'; N v@b'; N v@b'; N v@b'; N v8>ZAz~8;vE[Mivws~Co`ܥ?+M 6y2.q(\mNŸDZԍKl\zƥ, ;w:O\p=kEy p|H?!9B_GES+RgWVI vtbAړy-ES+2gWN SV?Pɳ+D)%ճ+'ŝ3kZvEjyUPZ1sΛ^Z͝ b8總qvcVUdHC> &&_b(U8iAE^G3c!9XJ?뫳{\+*c<<? s?aHov\zx?6d0!M|vE.kO`[Β.sQbt QBin9?R9_3zyGeձoBqϖ~L5Eظ{1wߕ]&~:^ȽYUڌ5! Ӳ2Pnk]$WMzE+uOfK)c77SwYIY~HoA3C|"~A|_p\_GDtq~h͐S_t~~%x_ޝz1//.'(Zpm;1u\ddRZmv!os8̝PRкodC/s\_j\wݔv9xEn_`o|Gܦ_vs7c{v;`_/o޵߾kWQR$V,I}{;Y?AX&OoՊE4J̹ݥ_ˆJI[toL96kqkep •t*b=󓶀jZO$TRz*pR39X@<ٽc@p=ʙ&ӾTWet$UY: U&lLVZ6\cOƮ[YoZ)7W;GBBۏT~^\ޤ{\ݻDa[;~wm_ASQ+W+e}I $O==Rx Kqm"#~ώqjLFG"8V9/ }gr_I _ Y66>dHu}zc~J;8,.֟J ΩKUR_Y5$WpوdD$ =j1dK΢Z#9¿qHvwر a딶kGήhxy9"ڱx)MlLfzz3, ǿms]TH(Ht`%n@λoϗii' IlFmR\wi:ZҸbߑGߏFM9cLկ_UhǶಇ|~BA[y6E~[uc ]Xոݤ![Y/0wlQDQJBHJsU({~(wأt]9ZF&2Y2x0QcԈBFkrir21Ez|Phjy~We4XaS\Եz·lֲ4Ɖfb@Q(3/%p@)9+] jM95/fNfi7by/gijMٟ<5t`lxNR^"ēJ9ewLʼn;[Y{d[?I}B#0k")etFs X:eZ &`2fufFljg u{ʳ-oe|6p",'݋иdv7:|AK dpY%4 ӄU&Ճ2+jlidV ($¤(jSP ܛL,MH)@95M֮jmQYkAkނuʬӞ[ό{i1QQhi`IEyU}5 l$E,CJ?tM,٣"  GQ1ꄫ^2a5rԗ8ǩQVֈrЈFF28)oW¥h*e*hb!),QU551@Eظ46r}0*AjeI !'‡eSsX#]&z՞鄫Oj\r^Tzh4$3jfYVgY phJ H"B zz7Zq>ԕ>MAok#z_1x$;׆QG яpՊ<<> mG珽N׭۶B q7u}mt_KcM>&  <)qˀk'e,kt.2wu%IVFq6{R]DBґPjT#o$'f˼+_A@\1 gBifсŮ *YJ)h4Jlfg  CWRR୓hy, 8tQr2-b.yf}16x' z FH_׀|:7W74΍2,6$K̃͆[cc!+UD{cx(xcJ7,31hL#کRk ʅ,a8zn\FĘJ8pLZD1+:T)ua鄥-32S\XVȲFb&.3쭰}ĥz&3ey0TC)@ 6_%7of(~}1L9hc?WJ8,֮X>meZoRШUH5Y4ȘCBKCSBP*:ES 4&:DmAs&kMGy?Kki~knzøm;g_m[g_~=T(8pmxD" ^+(IiS 4y4BnA1 +Pc؀xAc"yk l(4rlx p|$rTY^a,(-),)4  %>H RdWEeP#Z>>i# @߹Y>eF쟦Hnj`CMowHbF-u橢y¤G=9xf^o@S9f$$FJ5Цl;ϣV6#pC դ)VX;dXPCD}QTٍai`Y/ TC}>]h!-q5NP<4sy(]ֲ8R0/N5qP_OVA Gk[`ucVe+3 9o}ϣ<]>db<ͳU/. 
c KPJt3e gj4OywW$F/g%_n?u}-*#rb~!^\\V]|G b.4NVN2"%McՍ6ifn>w{t}=qSO_u w7SBp*c.;D'%JC4(*Jp`EZ\dP58WAOl^WZ珀R[Jl!%Seײ]e& nfͼr[vGK/_x7reoƟLKQ #]F0ڳˌïœ~3OO#دoOVҵm }{jZen},K-mPnݧz;coo4ms /j}u]HY̧}K7ϣf2}\[wqqv3'z>%<u^1W/@3J@ 7* Huja+}+{Z-(U"01%' !,MY456% O*x'*76"sQ(FYя&KI⒌čS"iƅ [Mr{r_I9` |a,V`J^h-meɑybZ[-R˖-3v]|*lRG8:]|AWw _IPTҚ|ͻG'e+Y^D9ǩ6MZz*IpBz= 4HT"o,8Q$9r칕d\ndOeG9J<&h დ%0  Ѹ1"P14KrKR/_x1B;x>m2& m$ϕf{toB# gD8gZ':;<bR|opm2STLS} LAN1 ,(|D; -^$'QtYw&2Np"65 O2X%83.gu݉/Jg z P'2 5 G8wT.|~L~L73+ i%zW6K>rSe)/.!~zMʋ6u8S1<]Y5+!b[g_{DU@=i7KO뷍{׻oޮyX;o, |3~|i"RqF5yo]lʷ $~F%W=7ޞ;J\Qp+[%6u{'4>nE=[QMS3$ ?r|v+;*I^;BtS㴒tPLKK~߫)Kbz"[:Iazv&X)\hs]"Z^B>r++),x Vr<;tRA%Ups W~BWcFxGutVᇪfܫ\ Q^X޼6\lCFiÙ28;HYJ ;ζ\X8;[sRd8J=R:ڕu< 3o8εa?E;A=;[!&{۟j^T fϗJf$HOk,_)RG\#O$'u((Ly3`%:ԻZk""&Sb~нZlSr-66*xR5g8ޓT%GѨYlOTϾ peb-\o,Lxx' i4QZZ vZa: W> ݩtZ "]DTwJ޽PuBIj!Tu9Nr (k*Ԝ?5vFhq 3zt8k#c6 U0vduԦ,l!em^SL5[ED&"q ZRNαAhC"yBbpL k'E2)ں 1PK͖zgLDe0Qr>gZIA; ;t'f92.zWK7%(p}jRQxZ#EmC]C.:z&t%7dokpT΍RQZ84 S+yJ)"N8S$&9uvAI4Go n`Ƹrrѻt86wfqȚs`kM$ViB%WA`hI?[C#8x8S'U Rhh oY jf_9],Pjy{"z,̏&X:h^zwc#i]GHSW}j08hdՌ]5٦ (ZoVNf(ּ&-ݳ r|קWyA}};3lH/Ϫě6"`3Aw]LhGEgꔰY0Y _'=qtn\_@X_^)1$5 E*xPQ$'1i@\nza!(փEe$NQ$9O5Zks @j;yPk<){ږ{lUZ[i.O39V/yca`}:t+D hD@P[R\޵`*G0$YJŻ$/0-M׷}kBP+R 6מV3W"-je?\p\@}(\[Ҿ/ pz빆91 8ZQyu+)Γ'"(U權(Ebh1o3(A `TPlNlP-O4R5V8dМEyot]H(ErDY!(]6훃V$3Bzoqt̾mi%-hM o.qɫ\n3x*/7^4/N6\ۛE i/ Wp8gcZ$Sz;IMo-^n8zf mAoxƬ(pLh{l!>(1N egޖ ub;U.B?5YC~+|] b\RBi 0E(sEI10(BdFu +s:y;Yzv7{% -l o\>$ LpoGgi綾z8up3i 87 JꕬWfh Vf*kG/L&Sl*ZCƎ*5O8Sv05?`KϥcsӗvçIRrT3Hаm0y`m <ƀ`D랶e:Br7};&()9ʵK_p tE"Y(h;_~|q/(o ?_$D#*zLpύȜ(FrR< $BZ1P=gTm p&e9-gjThk1-ZgCj ?}>C"웫L{Zs,4bc t|4DD) B0pp:5^sD %XFtk%>l̓STLŰ<>]5&f@L9[o.,ٮdJN ۡpiĸώT8 k#(8hFƆT3Z*:49iZ1;#y1Mð i~tp,1 m i)DZ"!9yE?`O TESXr ncK{`c{ت,PHZ9 LK$QR+ˉ,=h" I@dNrJ,#i$5t[k[u'uݘEv\}|8+y|t XԸ9x2\MoOΨp&O$ 𱈄+9߆(b^Cc|AQY\-EMCWYJ:54CwT* Ѩ+Wrz,*Kġ,᝺z H]e?urv4*K+䡫,ҝz ʞXU]Bq9cQWYZy*K;c%+!GP`*hUWcQW(-;|U-J ֜⊣QWYZ]]tE+EyvVOnh%~Y |C/^OK9e|D!񾦏(#lqlYSpN DJ1]'ष`4xMN7ZK dY` Ib)5qDtfB/*|w}5ݡ v>"+_>ͬ*H>y_zOp8۬ݾ?R.?N,^޹%S|"\2#L; .,;eGiZɥDt\BpDH |MN{s'5,l!{t@( _R8j2W77׽c,O#+˯J>7#;῭[JgX0'Կ\rJNiPe[N iB}|m%ƪlg O:tuN%,FLk(8Y _'=Iѝفg1U^fojƚU)1$84M#BϢ*g)1i@q%QQꅵݱVEB_*x1ƑyAV5GG:_{,XYv'A֝Q{; P.е }:tMn@PLCD"F hD@.L@V$7;'AR\޵` IZ=SJQ('YŒ I@7t+ҨGâRXnޭ=);VS _?5/Z!V ;p{Oh8y4F9- Q8MV&} s=koGCX~0.9'A 8 O0Q-HFU _#jd1؜$ֳ)3lnfE]F=7Ϥeo]+ .kAVj4)P00_hQyL2JȄ9L{ҾrǑpsp#b5(=Vij[P!| &$€PUP8%3xC-@āyo %hu'@cB(ɴ KJ-묦_~[% qz+ExNOGзeE# 5)  EhwPbbSKe8yl:oQ1wVjwJ3@QY*g2-Esn {Up&XDf`zʱ?^骴H2ћAQCgsSJ\ lP꼺+bT^oYNCQn:YvRUKr#!4/)DOIxkdt{=mRjnխ%9xl}x! xWe7䍻^"K` 6y&ef{׹mCغn]Tw9U|fwaRؾn[{=/lR07{۸,}GWȳ|GAw~4wf]f܁Ϛn|zK$m>*}cr6K%k ov,mEP!Z/&*Z= 4#*m\".Ě<3+&\YG#M(#˨5JCK-1Bhix&JnG4!rGpd $Ŝ$`J#XޚAؐ_I…lFѐGvC8~s!o%pjx[[;ەz32Òa:& <砒JybrI'D Km"*sP_92vp4Gֳ9_pũ P!*oHST9CԠ\R O."YQmκ#idq݅~;q>3⩞oHBd)&2A[HPHQyI+R#շFKDXWE 4ThCp.$nǵlI^jg_nq6̃G6p6OcZ>67uhx>ơio+O L)HyB jYb3ԇM8ΛɰL8xy`7}$ηd0JL0`~C_VQUŬڃAWCr5ĝmMtدܝ% J?Wyu7ik31h1C65[/;CFCuC0brY3sK6i6Kxr9F(}:U粣Z6Ik뎲s|cQSцmh[Q_<=cxs5yfsm bnLs?GVO/om:[Ui,*'1Ӭyqٻ,܍{& kPgW}W /\,g^^(G݅tV;{^aPe{lˋOj7?#J5{ZJB+_WT, p\ u@{(wN&)!< zO9ϓ7o*@? 
Gqy,[eh8aǫ+?Í.`RD:u[>aTNV%;l Ѱa}9a%zn?xG %2jGw`SƸl䕑2i(G%8ұ\&MBttls|`ْWC Y_yY|gu.*TTSOiH)8 nA8q!20 L10b #_kpcJZXWcQ^] qmK}ݢ\x δsGQw'68y;JI,cXTiWBq-Dx#Inm)聧tR:C',Ԇt&{VbV@PfD$|'k4n%YzB(^%Jz0b rLRxT,n!4Ҽknd h/x4gҧ45Af3Ėήg6\Q>5ԄL7@}a`c  j 8$ylLɑr[LDŽUZ:ߓ:z8({U8<4"KPX"| ZoY$:لHoٶriԚ:l-rqqs.AE`"?P>v%Q-N$snzH&V^d/N7lJn΃ %A钏W%n4+*:sʍzp5wdYb| A;<% m*Q< " Nk4*aH@q)qCʓ1 @U0Q>h&9BR[QZeI1$,iP!*+PH ςۜhjDH"paR(I 3DR1JSF`'-oGBjGps1 7Zh%Ʃ%'Bx`C./&ec3y!zwKVej AeM[UN*wz_aN7rUekWOk՟Yʲ9m"Ncu^fT~^y7Oa1Epۀvb*wtǫl/zso/wQZ0j\4[J78zʬ|缉u3tv]T-%Dz biQ JTuf0ƗI~e9?[6TTDvXaÎx]w 9LYNXGKeenp],g1'ϥ8%ǖr>vf[WN) *xEjrJpz5UUT["^&\D;~Qgh(9TMFW 6 ~>`5ڕ~\w쳆sfVY@.|~sgv(I][)2d;'%tR%va]c`܅$k v0q n8ۏ_Mw.U@Ig&W&sgLaeduU~=DZRsPn{ItK xۏ 'AuV﮶U[˹ ?>V_F1VSP^FiC}i" IIʆTBSYC9jlTƾ;j:B G(nC!$Uĵ D*3ƒqWB #As6{rΟp#ԸN׺=療#VϘ* }GM5udaK,{9@3У|=u+8O DwNj(*(37 E@M)xU9 ϩ,PIGcBMC6;Qy "$km߲8 [aH$>o5|2T > qj*< JB E "!+)z K:+ϻprweqH6;]. rI,o.ɲ4`[̪ &-Л+n5\BhȖKXډr:}Xz&vAmYL%6r\g>}wfwxDmΖ=/b!pWv03{o:7޽6ξ~>ߊzv~lM Ϳ>ތ9llVziX=l+(tG:[nAEeImkbB}ɨ{QG;mϞhgٗ]Tm>r`8}yڥnTq-SI`S;GʉX1Yӵj2!+,0wZƻmNm:yۇ2^Č8M%٪A]}P Q/,ogM7~57?Ow8mYtu1_'w[k_|տo-!f>\ާ`W+U"]?v2}!˔~ n4G%2xec=Y<4Uq\i"CqbyIyFw3ѥ&cHo#:gp&B"+ӝb.b=^\[_#<*kP=*jR1`Nn_UmMX7UMvpϩusZ7$f/W!sw)UV.x=t>|⌮Vg[TXs@fzN4vO4v䝢/Q(У]-&A*VJtεX\Lbx)lGgej!g󰗃&'t|K*ƫ{B.}\k r\Z_!#p6$I%]2Jr VZBrNb`WXkt(A ?+FACie<&ջo7q_zvunrM%+&,KԕR/`E5qJ&-XHEbQL=}eMڻwk(qXDlCB($ SCnͩWP*kF \B{D5s5b3h(NrUR.guZ\ HqFEs=lto8۽ӓmkoTѱغX YMIba,'o=#SZtiL+e<+%'reN@)P [&P=m $p >#:lR%_RQzIS uj۞$UyIŒ5ʦPm>D:h}O)N16J|]L5VV5P$,UK[Y&bԱKZme565v^R+q21XkAmna[_;Ӌ@w m--~2ǧ|]Ck4X`łbNUl "^}ڌ$Y^/PαXTcvBUE"94*VOrJJ\pŶ1AfW; sr߭co(i+7}(b.dsŤ#1:jB&J*QdK5G{N~/ź'녌.JXlg'=:9pV 3 <8$"laN)I?FQ̕.y>;=>*2^[Mu.o,Lrr`u.wlR(xDߏD.[7ɐhtPz=A3? #=sEW8+Mv#H5d[3OPTTCBtJSg뽁T[#VerRERE\J!bdXD>3dP5 x9l\7qǔ!g$,̹_Ns^9S2r2"[l4 3 D-ZB-<.jJUNVS.1etX1Phs^QPsB-Gb|b..#ξřMOLJz8ӟ\Vb~qmih1}rX+ o~= OW`WT~@$"8Ŕu. 
98GV;1@ K9!-LVr+mWdtkβ Ԋ,P2Պ촃Pz#c7qӍtn1a­Dml3quqyty"kW7 /[ł2dYBr.ɂL!X`~!iNYcrQ2V^ :^ ({!"|rfo8BA.(RUԻB/qGx:WḺv7x*jΨm&Ԟ4 5F dtU:!R(,gcu*a·#X 5xұL1G(y!\mVIClĹF sƱ b7x*"BgD 'DKcUy|X]`_F_.lTd!{m+16M (qZMm8sL=JI,irbb]8#E: ..<3.ℋ.ъhshQ]UTL[jI,%QyO>rcDf}hxh;}¶w4({8?b0U~ku$]z%:<;7%?UeYWv>!$[Opdy (j2{_\Hlm(Qql"crc@uQؗ[4RfDD5 4l6X˩0r(*:ԚCPxk@#YH'!PATEHA)݂.%[7`8!*9|UeD.sRݨ 1{4sy9{iصIc؊4jK;Iu(,>oK=L1}-8&mB5ʖ\z|;:\x)d¼ AzƏ?N\~ƏV6o8)ߘߘOz<ںW0fVaiϏ~Kh^P`v;a8S f:~/?O/Ve?*у3MQަ޶Šw~vܟ/OUKu-<ӯ 8y~Y䷢ jgA&t^vV9_j^Zd#bMhy8ћ _񥓟3.8*Az$`|rs}zР /KBڮ&4[|<>; P~[yh&Cjl+0z,K,bvf!:\qnPs2Y3L3ߙ@`3I0-gM7#goD`t`\ M\RZi-رMWZ0HUة&.C+Np H [> j0pE}(pդu8vjRzwW"0# \5q?BJI jWWZ+ \5i}WMJ'zpXs[99=jnGLY8_4Kߴ?\츸y||o2uf7b6ѫ!FOrԐr )xye;x+_u/nD􏲍}32蜢FGBPjZ YUrwmm$ o]} ACy>tZ&$m8=$%oОgÚ& Md@-soW-oS)/c7.ʈ2%ce籧 |~yZ|io`=/ĩ7 Vr`k4\ ,`j=,`])-3‚ڵ`%ݪ> $oZ>$eZP|h=)ۉJ:uKyǔ=汜F.6K =( %g!,3?E>YIhbAKW_NUzkC4Y-ѵ2ɝ[P/yzI8U˗p7P8]uhۑE١Fk,6I5`t~=o6Xߣ/Qv(!AI^9Y"UQȀK+2.A aL*Et~HM)"/LH3e$ #?Hٌ{T=ŹkN'*ti' S1sFe޶mp^ oI2U 맔2fC,$:M9#-A"ތWt7Z9y9ޫ80}}v*LjY - ؔjhp`FASh g6"bՓ#Fd(rR K1dAJ05l[kf׌Jl8cЅ?.\<7Tf'kC!~egw wwㇻ–T(FbA1H M%iMx+ݶF] ?j셤f´jS[!k괊TGiPE sac݌ƣbbEk7cjՠ!8(T.[oYۨdNKQ@XaS'xa+Hj-B.DbdBfd( [QgǺ&e e:J_Bh_΍:grr& 7c5n)".:Tƻ, !R&k:9e+flh맘^Of\r^4wz@ 4`sŁbk@Ȗ<:CQ&^H1)boX}!Ož7_ke:{^oE{I'//X~id,8:d4YʐAޥAA^Aj6i*h@QQ "Wlܙi1l9s_On?/x}7=x1uff}oP:q)Yl@X)!dY[$(I(mJm-%B f/>Ȑ =F .fTK甂hu`*Q4#goY}cqOYْ|bϗlPn)vb,dY:V,_tr[xhlH;/Pt&Eu*-S]v𽳻L.:;s81^VB?C,^{iRQ(0dL ɩ-ŧ`"6dIrFd#?Nh,(ƿ!'%[VN~iZKlwjI)7 WTJn٠'~ "mHsyzaOK_湾1d1Umt(.|ˆTjaQaqr14BsSYT[7Y2(H8~kFhjF֐m |iDyYu89M^CAR#))Brм_d%xkb'Ay/: 8B`d?¡|{T!ȱ36f!E%D)]۞r‡ZVP)HWEr5gdO255!XȻGTEK(]RE,K,E$UG?H`ys) 2#XSwY݁K#t;pڪΠx&w M^ߎdJ]:vJ Ɇ)&(U|GSDBie(2-lƠ5JPqDD+0!-i!s'0([-]/o/xQ] _5EMނ+*('(b.(#O"l#=g9Rs7ՋiUx|cO1ӴЎI֢oU-nd]J;?mԂG"mlوI^YSDר=rP|McԂ7k gs2FtL30K&Ky*vgE]lL|@QБ5`ø͡XSDL7%7R"浬RV=蜡D!r L);zQɒBЈRFa(> dєQ) L:E 䁤KYBj=*9DMXݟbc^-|X<66=A4Cf[+[F[7 1'CGMLw-GC)|>}$(q] L# uΈ"q@IoAIx*׌uK·,%x)CƁ5Y21&x5dRf.E):BZޣ۽;bka^ݦ@ {MzN}.͵<嫻qM]x]3xOHy~wum~s}ohKPH5}wKѻmߓ־sXՂ-d6Xz%`*{ "v 9@v{?6?<nZLQ Չpp0zMy~E',IoP&RTe(dkV)z>C:9;)#FhB>d& 틑LD@T"J"rG9~9ޯf~B? ƻ)sl)? fOBgrxms&<Y{)v'_ (0:KD;*tV S4 jsZPJ 䨜s1)KKBI)I3MAthS2.$DWysZ3 _`rZPAX'Njԏ}ҡm8aU{`Lmy4 OyB^{qVήldm khF c/E$!M8o]$loz3ۯ.=jPZ,37 k/[%{:voJtC[)쩪J|xGBtc ;QuuNɅ5gpTq*"eҸHZ4 n ѠNa5.ɠâkTHc(\BEE9r9[( Js֔b LA ]0}:R4# eȢ+# d N*QgJ1oXR*LE!TPknnմߞZտ!~]nGq|4 K [[tn;o$>nPu;߿< op6ߊO'a邀`:u35D!wU(reڲ1~dy7Q>WEijX>_yj:e3USn?le_:I]9WX_qM-Obed= xߘj.dH#s"}8`P[ T ?t[qUȖ=x~r^[ͬ_{%⠒'tP=v_Xlm쏥ϡx8p'|,zew,lIexH&r'i1wYwfU? ne^bM\i2;s'zew=WW~^y8kW^&=ݝx|χ<+ú=W\xY|t㕻5oq顝wQz^=}_o.)A=AV0C`<1Sm'5mwS"'pFO|;]'$ɿR~ngƺ9/{w!+r/^ 44Mvy̘ʌʌGBPłBvX>8⯦eo2Q755*y@L1LPq{4Rr9T/VZW8XPvM)vwK &{[PΟϑNGu2f`^r&5RI.i22$9ps8g 8xv)˱`y4 V+'NؐqD8h. *{vWspvm;yFK;0ulxzK3; Wtݨr{u{Qř>2I|>at{5Xb]n^7,L R @50䉓 Vl.d>>=_hJ5!^VjXL LqwdCHb@pT7 ޔ?OIu/]~^uIlx:Od4x{t{o4̋(Sp+"T[hEi PSbP/*U[:`Og Z4t{u F%&}='ŌL*3])jHaƧԤxh{3/?^S5t[Q:~i·?) neNꗆ WǗdQ]yjf@OɖwkoS5^uvJX1H/–^! 
zWWt4M ڲw?'l9jܹQsQDsi: 4.q>10qqv.t%ގlJo *wé 1; @!@~@ D*U3n-WVR!qad*"i)e/ya`"Z`РmPV# $@ i9I1gZB[~!4i7ydU_ `KO]{vYVz^y80H1Cܑ$&Y엷w6Ŧmlf 4!dyLh X0r[жD= d36 V$hMdpNHcGja/ũkLcP4TJFHȈ&*HΑȁi*!lh2ΖtVuv4Q6;5>-mA Ji2 ^v\JLSI0= I Wz >rnbAR#"JIHqIc $Zi`o)Z(\/j&jדz=!n)鸎qFiCt&Dg8-aZgZf sq8:uNx[>"V۴3ҩU *SyO̬fZ'Zr!9+9s1Aڳliȓ/ ,+J|e⫣;Y*q{I3T8ϩd"U\.FNcI0ՓG{cM.PGk+uFT\)E6ɀ&yѤPҍ( Nj+9Hx&Ur Ӄ˖P|0+rĝ̙0"w#-6'{{R〲,?rRl2Z -ugI@At`5HF-VCRb)0qZFe)A@~  zftX?)R鼲drr4oֿQXj60 @uLr7Nlí!ȝ 3!U2?(Kj#!e^@ԼZg#WsV`Xjǣl2 ',X4M=_gwSB\Z@:Y%>;[8MM~˰L7NÿrZKVel[q*6t9{gf D2(4M?Wg[&Z}lΛl CW/]񱀈ŗRפ-ԹL7/aFTgmC6_oT)2z7Z~tGKtt6 ̾\w&Gau ef0Y@At+;OmW`K'(яzgXj`,}-C[ z9a|yiܰ^́g(hyVK%FIJIL߇ va\DVƪ-)@Z?*IJyd/o@Yҹ/RHKarj)umNm-%_L|?jljl+%5:P,>u+{҃v$%8Ȥs>R`K:+"a:m& 멳hM9qQ0c1,wX:VN:N|'N3}C>:"VND_ljHu'k%/FfrɴB0,R\89Q*G䬱S 6;0Ͽ9ߦti`ן-|qt[eK'sbۮStuuu:8?%ǑJίdiAS-Aig)IsZ`_H@t}}!XЇ_߬jO<_i|J1yA!4~i%S9H [ M(+)}֊0RWe|[:2sfջ^wJ+}ݵ 4:(ƙղERª< r&imP<9بN*Ө**휊J;nAhdIwX@U4˨T! 6F + J$!'UVGCJ9ō8`1wat hY*bA9A [n[~]*4xV=NuV}S}hq][uDi/ȶQ"Av])5!W۔WG0Byn>Mr?KB$ ?L}Zp 9¸eBC\jN%^9sZ8 w.ޅӻpzN]} w.ÿpzN]8 w.ޅӻpzN]8 w.ޅӻ,}N]8 w.?nL%nw/*l=kWq~AMၱXiT-TV0bR2rPQpΎ@ Ai" f$< GP #FM⤣eM?8{P1H=By)54gG^]=_8&/caeqr,8* 41CjhJq 1sAg@I<CWTbrH8+*(黢v`={v<mLHVQRƌƁZ1b"iZ+I/6ymqf(yNz& ^>5?_ bp鴏m14P p6*L~6I .$=B'= &=L)U_MirWf f<*69R%7iQh 5T(O+L7n(ŏXO#/o#HpGQ3=k ;8G+RiEbTcǍ)WYZj-mj#sGWcų+K9s}'=F9hgmq_6 3bRb*WT"6bApӻD`>5GX&FhLq:RJf) 6q6k vuW~/ɇkGBLJv%7;&S2)8h-B}P[lBV͢-zNڻ6T^mxhc۫YCkG<Ù:6e`8*TtŔ#QVC4JQS<#Uw9WA U  W偲w((3L iI@EAzl*Ҵ0c֡EDEAȬP`B<UXzg A2bA>E55[!-a6qr#%Bp$1A]#Hm.HߟJLKXys=U!BwjdGWn2y?x{]d:I'mt>svRgaO49.kzz>3>fid+AkM\Bެ?DKr>Hcl;x`N},ބ)yb20ss>aup_!{JCn'pavGmp |V1fNZ#cZ οd2Q'PzvE9yw~a,f75zQ̒5ٽ9Uyg7xgkp9nﯡ^Ive7K!!meizE7>Bnm>խ5$.[N~PnW#b6o(B ZZWM㼚tfnuвU;Zvƻ;oy0Lx{{y0ܚ7&nk/̆6!c.B?}>ZB puH o0eVڔ$hVNAؐqD8h. *KZ+m,sN7/WqH]W@olؽY\^qQ$*"}"}uDŽ4M$Xr<*&J1ooVuhAg9CCC(g7B|2^aēqg߹ǐb﬐[d` k"Њ9,k \9,?:3jl2Sʝ]1؅.'&tpu=0G?h|v緵2tnHw'o>'?věZ=lm~n|~ 3%1{]+m^l^/=XY/i^sA;=}@MVx7eJF<U݉.\>qPΙwn|u\ыmt}1nif=L#-wB՛Lޖk+#֝\ :sʑck ۆGͨ@6lLmȁ?hFlC0D~@ H"Rh*++8P0`ҳP$tBI  m!M$!%I˙eFHs&3&nlh<֚<*)/oXKO\s~B+= pC.$u&Yϗw&eG)aGUL[mbU }$*2,"a- ab1FEsinsh3ipѳS aƦc+K&2K8'VXtFilq.  D$2 RjDs$r0*tJ(2fֳlg^}Dݭn)ti/)w4!98 $ {iiJ` 5 @NPax[M6Hi+RR5R\VxR@ðPȹA-?IKg{)V:DgbPJIs5yf a8'(.#^1qμ!n4 3:!Ae*ᩀ'zʼn\#tJN\jLPh[zK/z6DU>Y}x'1 ۦ/8E_z6)/q_On$02h,%:6z?={I+ȑGCs0) ЋZ2(H2YFRGl6,/Nfa/J9ųVݧmm9W󐗏KcK0ԑlCWmy_/Wٯj|1h:G磫݋a8gq=weۮ&׿4i13OݙokhluNg`g]qz'2XAڔj:6WL{)cX|1C 80pO }T W9kd\sb^y╽iЦ3'e" ) I*BdN@v#̞~/wmˍWݦYň}%=aK~%ou p&@b. AAtZ!H:}2++H2YS2B:&QC)'kV(:ՙxdF),C]] *ɻ+ QC%r2t3;O. o6o~>9'{6;BM{zkKըlXRzWOJ?0u#bFz"j6s4og<ӸԙnbѬɩ.ja`"k-,L|*M,%E0*/)ؽÍSyz_4M.'EC/7߻I_)2J='0IڅZd9Vd- !.6ElAO18SX$ATKSة U~ ƽ.?t _CIwy6YuxO3ǟAnY?3~OfَCzFs˩}#8x|pNB[4 E? ڢ"gR J$m @\Hxj҈qv{U3~i/ߡ[k9]*E?QНl>g AP rN)ȌO.I|-:g@QS*Ϟ0cfvym0x6O' ue' b㌐f}t8 S™ (.f,g2fH:]9&` /QFm,XE-czUL3;9sǥ{qC(jގ˝mrVYޖԵ}ˍi﨟Cy)+?]( R.+#%r)9! &J@_ " ْ:A"Q*Z*x]sRۜ4K T;vf##z_Vwo[OF_~Xr]o2c S{P^:yAFP4*Eai"BM2+_鎋y.P= LJh HVdf U %"ytiYr1ڝYǡq;Fcb-C *MTOKQ$`W%2E$%VXk$`Ζ&eʊa/ &٩w"38OaoMVb #vf2Ȉ##>0͗"%a1, ~uJkQ'+U2Ś 6=Ae8H QHkDWgA3!HM6L1'joY2]yoel֋OάP^yF^yшVLFMXU,Â(RdOMQޒ$_5y؛xgWq(b|(-ԎN|>#6m>WHO+i(=Se?Qv@b0+\-ҭ=voɗwk(Iϰ[#(h^''5k]AֶE08bچX69Y%XYg-z-{-{%V3dBѳC ;z!`=R+*^Ljƛ5oD:zŻFﻎW6G׽W 䕒Ȁ¸av>ڠ(IDA'4鄑59! 
*otȋNb 3Iig CfJ``vQV)F{^T%|; z,Uّp`yWUyįa' P91W-hzG-kNH!6Nh &Teg0g_bNv8-@R8$XQ:"Y2!&ԐEqɃQXmU0UsYɘ()g"Fy8 0XZHvg/N7iE}*k)(kmE2enVy.~=muc7^>N,``ox GB&]$z8 I /'C.!Z&DZndQ4X5Ё\IAk&eβX :Y6\+?+%ERq.:Bh~^dś"IyROrNxb: @T&-  RGfcgL£%>u $BS9IZ>uF0s@=m䧌#RE!.T!Y T C D^o6qh-r'GFwv,R[U Á' wXU7ssWi\nnS] T649ih< MΒ45cbqUmOFtph>R\srGK4-͢o@\Ҏ}r_gtaQdщ]nTi}/]!hRpSŽ$7[]=\{yLЎ.̉\+Qq2kUv]FJΰ+d"72⌍@qdHoNtd8Yd2:S'gm6>:Dp< u3 ,L)HJ5v;{Ϧ_]Woiښ%))IKhIYbvX IG٢1N"2&腐%;˜@Eʢ]D(b)Vgp@|kh |V2 A7^hd}Ph ʡAڎ;=eXzMmғ5YID-UkP_))mHV tD1W` M6Ū/Ս:3Jr |{,<3NCRL͝ PjTFNEias5wf@tŀQõUEtUQZ760 b( `Uz(tUBUEz3+RUp0tUI ]U{"EjLHWCWNK)=pq0D骢tbX.5Z\ASg}3ў(v~^2Nneyw?.Y1vS,Bx|$oeI>)z?X2zd (\08tmUβ̀N jibWw+JbkP`oÀQ`T}Fi XϽjNK& ]!CW}HWgHWΪ!9]1\4hQ*J;#]Ҁ#õBW#/+FI8:HWj@tŀ]1\-jh VFtutE<+ *\BWS^NWwϑz`8tU;1bJP誢U(GjJ9Br90><$NKWB+OrPo/J>HWNBlUJ= Z]X"rH(w@}}[)I z04]*9h+JP#M!M׎=(k< ]U}Ҙΐ ŀk-CW.PBR HaDW]UBW뽺bVΐЊ!9Nuj(th~tUQ-cW+CaH`l`ue zOWV~?Zyq?Rڌj>yW >%/'i19iVOwnq+ G˴7=9'dž~ڙ\wdS"M#ŮfI]PΆx:_D]v_OUGX ߵC>Co_7gv|o“xsp{*k w!~T`r-&o#m:p9Y yTlM*kS~yy8A >!l|bw;@Kb9ݹU[aOw[/zsq/wOL &{s|.X?:C7z)G$>QLW)3{;Kvk4etbsK^>;fn [9',/0a맘7 f|}i>wлH;[}.[t\b޵[Ossuu)U*Ie{򳗓;{OOIj=ڰi*ya׆ g/z'p6.Txcunv|MNvY}wuNcmhkxRڝ+-}K)hi˄!$N5N_ c@V>oZn=&yқ($^4yN\0-}:~nuW6{{l΅ ^n힊@CT49'GY _{<+qGqVB:C.;Jl|Y;{ ڇճ 8.S[e0'(DQNPZ20O= qp%r9+Qq%*_qub8k&'f\\F t\AeTaʒܕ5ø+Qq%*z9iI7 &(-Wү9kc \Ap~\\FwW8ZړV\}\ Jqn]ܺ$jZ׮W(E#]0BQ{k182?I0Sq4Mw6)TjZ\W={7^7)>GaLP`Z? Emc*iiE鯂i͆WlW"WQp%jY:D+W#8JaFZ|,*Y:C\*p0\(N-WrgkW+km\S֮DJԲ[:D[qur6bX>DgJapZ-~]T:C\yG+FF%^:D6+W`z \A0tfPWVJTBsUtW"8vZ syJTWW+mOIW[&O|4D4qa W=bpaWӆhzmCgG5al'mWPl&4edyv:Kkpu8Glp(sXvqC ^H4҂a{MQ+ Y҂\WYZв#GWN\irO\IjKǕz+ø+Qqn9⊃q•>=azxw%*y9gWap%r]WP-WWg+d~ \Ap< D.Qp%jݕR9GuW"wNW,]J^/:>G\hiJqܕȍP˚+Qj*HtAدo.\\FKT.Q+^+{W1>IO] NkOij.d2. Wv슫_z܅roȿ9c[{v)&عa0-ri.ӢriǮ*6*Xo•QW"kvj햎+QS + # ]W5+Qkq%*Z#,EZk=D]ڨ+4V\!gJz/ij׮rW+YnJapVq%*ץU*W"ةap%r93K*r\Jqv•̠ ì]AJT\q5:܁C^)T8$/&OQV];MY\W=;w-X6 o󝔥~Ƿhx3+(۫eR HӢq>81;vu}p8ΟƆav]^-ͺ0*M5udڅgJ^@K "8a aҗD .-h6Vp%O~f\(vA:C\qJqܕWf\Zm+Qi슫sG%aR36 RykKi1Ht]Nw姇^A!߽O?_wA^?}DG{"?}.t3lz/򪤻/(??((N۾]ݤ ޙ[wۦ=hVlJשmGFGBا5dJ?~SpN0_?a>5⇑ھ~6?OdwīE PT1/U~c=p 5x}4N5f|f0?=b?oߡ|or+7Pnۦ_^_/Jm [~Ϸ?%˱%͡6F.,UΨ(GB"W(ƟBr{}oow4a#7ooo_ݴS=|٠A.'{Tjz!U]}0˖p#6jTm :fuN)4j̞Ѕ1*UN[m}ʹR/ݫR9o0vu6YmHK>.4s7ʾuU*buXGxF;t*dՂ6'Fj9xnN"TrZ(\ #s#f;^)Ifv{բk@.Wj_,nmYUDs6H&k2mmĩ)cĜDh{XzNBuصvh춸1kcE{))wHx& ̑^ܫ=M&fJmwxBѤ+C)玡af!M~Fw!0ИU֤{{6;[_кGQHx0b|$ߥnwח?6YR!]AK{q9YrX\3O/Ss.ͽHfU!ZtM%%pN5bUm9@GI+*Hx1ad;z7E"b'QjShі8:FOk1M€>>9* !֪3lm6m'r05{b>#3k^,XU!kՕkʧMpS7B 37%bfC)^ 0kϒB4BF( ٥fiLZ%|W8_`'6XvA9Ų- OY`b=xSZø_ux;g~樌5 ('έZ<ĪuUJ,2hAgC3cMm̭Lt%@1eC;u{EKlZ(BdXe=Ҽ1 k9+fsE(kGU8(NU$ԓ.eX|.SlG?[lelQH%RĊ dC2!ս4p.LqNx\R}S ]pNj*,_l:!K&@37-V8!خh,aLaPwVrE 78c)(|kk`$4A mHcgrcEBy>uFݚrE,Y' sG`a.nKPI!0,'T 0jXrpfdb'#\+4\CAV)8J8I9nXT5ϳ,+xGD)J6􅲎()g^\F*^fY=kɬ`J ѧVKrz`3/z˙쵈Vv 49&dY]D dR{4{xWP>V|p.#h~#:38o3aPndrAc/fĥ*ʊYdbL@Tؼ*b0P /~Ns/6ۻveM~+7C+ՂYe[w!ZI|txK0up~(6¥ϛБUV%S0+J2a1% ph^[N-D\8(Z V<@E&rZed^]0P>x!kn #.c4/dx $>yt_ 5P2&ݚ"[`qgsd~9Nv#kP[DbYX{paPMRuY+ QH/VGlM~suyTdT>B",'K 6"hwpSvA$Y 4%:]%/sN`tQR"Hc*2ψExhMf anK)elJRJcEX A "+P(ve1m!Y=MH#{t-٢/Vr$ "25;5vEY!3S*JJ+12~؃Ra*@PqY]Qõ3bQy0 !SWI|j8ˬB A1-'J[b3bIVHgYtgOGFЀʬ$1-JkNVNUQEx7k` 2͠?G3d ĐZImmzqG M~.7Ӵo޷G:L`ԭGwX7ۭ6޵q,ٿBm=R$ K~FiRd+[5|EY(i&at骚:=p`3 >z`2oltFLvWwO(v`)jXfaԡ׺m I<5% > 6g@5n0)!VMC|̰Cj L {ճ\C2kâHYi\8!SZ])nO @jW gF3byPI D 4.еwHasP*|57Pٗ9*\d! 
;,Nae%3:>5@\>b>%7,!J&3GA$%\g_D p7^=@p!`~7X6rIWŪt&زK1MZ5EDvSXЋܸ4pMQ~5 HTv#DoaL1UGUBu-nNRe-̅gF4QmF^]Ap_>rw^چ9oUk},m m%P-_\#*#TX'5֮?J klo@+ H D<]!Jo\y-7X ZyBF]!]9ew 7tpuo :uB]!]y,=+yo誅BWsuB[E]=ޞVt`tܗ*BtʭC+]Y=,B_;["\)< 6 auʔ_oՙ5&u$jP!d=MK31 3nؐ!7'* ͧom=i̕ M#\FOm i@)#>B]!\#BW։F]!]IwOp}o :NWRY#+eeGt ]!֚tDWHWrt SWu}+LDWHWj+ KazCWWپUuDWGHW7S{CW2BR;#+}+] :'7}+D| QZAtutp+u_;xeתAWCN\|5^3DWw~p 2VVr?]8AWCϹr[eL fL ?a)iHtW5oiD/qQ֚`&9N?s|[xN`|Bq0K{uzOt\NWZj8Iiߛ_=cbkϾ؇1ue1yEFǏţz]ZCGt7sr4_/Sbv>{ǯ6}@į[1-Cl_M.p~[mTfV+g6˛{kQ*ŦpUo7TV"erKY/4;Ezj.dcʍmf66%kӨU,,y.X4[(i8@/Ajb)5!7j$D{5vki'Y|5RبMCxjNB)cIۣvI-\cl_R ֻR GZ#$+X_ gBDWGHW'^U/th:]!ʮm!Nt,t/gҪGt7tp5 ]!ZNWI9ҕan$uo jB#rf:Pt`+Eo ǻB/~(%:Br\I{DWX`~?JU]+@ ҕ>=GvzSȀh:]!JBt9~,|uW{rW+d{Е'zsVڲs. n_3|KҵG\սiU_hcݩ:M#Je vY ?Dp- J_Y2;uY2*="ݫ{Fp{{<(XoL׬487h9G|xD?Q{:zVn9h^OmCe{ lνaLTfltt͒ń%2m >&ɘv&mrBF9Di_ }kyir/+/ xgv\(5_*^޸MͶ^_1ʎԦV9Aeҁs'8wˢ :O+ Wǽs2TxY9?9 X2Q/f|1fLt*up9\X99}'H; M5YđO#a=>n#wxaE% LpuYjD5 *6,O1Yf^iMc47itxvs 69|~̪x?+Epᙪ&Tm#Id&V&m#5z᥍YF8, VFB- lxmE}M3 4-,w~ v`nߊt}8UO^ߗm؊mV :im/wҶP~fw;S3@WObzff? e*k0Mb6+y8X%mR1oijz L6\&d]0nb8x & -s0(. ?|譒0}7X|Zۭ5WOn |zf,a<=,m#~&T?' 5BP>"'4{}6ڝyY8.} ){L*3sϴ˓ܙj3/s +N >gIpۧR;ǛY ͦ~۽_?Ox=QY!RD%]R*]UKz#G23*;}[}ox=9D7Fk(+ljmh7,'|aXۧy,s 0;t^Ҭ,7K҅TBJ۹cp`j510)Cq.FLඖP ɸ?Ջ(ty!Zd|%ѧ!v&9w܂>pPf elR2!Lmk¬b:;/tMѨD":M# ,61DѰ ?#Qy cn c`ʭ#GU,Y┷`'9ԘN#{K#x84 [2 XkB"Y($N*Bt>߱-q8/as p󩜫zȓ]e 됂Ud'\I b&edԐmDPG2k eH~`f's%X6`u^]e`uN^; Hu|Cl75!~Bo&;Dk&ᠻΩ +Mξܿ}k[t8oƓpIkYr wzAi}ok"76rh[¿.&m=Z:j]V7wf׵G3w-KhYܺ}Nw^onB0 6ݜ 87ye2?㌈L A Ƽa`&(Nc8rB)Jb{*ۭd L΃lOK5 ߴl?t9^]{ld(~e'_#EB{OKI\a/YBjg9:%4?aH*tO/N]?(br#\i' ƝA`r$ GivN 좠Ykַ {4cEnuoYw@)J_*E"G-BWwLXJZ aDI@.-ʃm W3M.AU=GZ-e VXjϩyZ<\7bqSRϬ9z'*OF(JD:'u:-s!!0oMl}ebnka8$c>du6l^jC'nԸ?ǐb﬐[d k"Њ9,t\W?[GY6L6vM4IgByq4.|t ,OTgSc-ǪdRؒ`J(S3@;$ OH0)B39apNɸsv^_K??-4B$vt:Ln;9=J7;ѣwft&'T(T|>Jx%tp8{h: { ,;Ǔy'3vIX x7^,_ Yn J4}iǘd8A?20gZL"8~Tot뺛&')GbOzAb}&V}ov#ћۻ,L݇ u pCH$T ǭJ*$T3,VP$tBI  m!M$!%I˙eFHLFhko&G7k ayX_(_KOv] kvUz^x&yIs1P$(IruGiQuML'' ;b:lS%^Ő'$yTǀa kI 1:Hnk9-IAΎ2Xx+K&2K8'VX tF|50Ե1(* %B# RYHdDԈ HTLi*!lhڲFΖr?$f;FM/ K[UoBy@B@3#8@BW ` )$)zrnbAS#"JIHqIc $Zi`o)\[(jrO:I3->%3(R8tĠ 7Lk=RKLqNP|P(k'v-oax~ xef*xOfZ'Zr!9+9s1A냴ڦP&, Lx_5Upd oSw³tďE%Pe^(HGkt|AtR?$ PҵlH$IH8M lBA)ؤNz}ϝ'o F>@N)my/_%ɫ)g3~;My]?j0yZ?9~?DGE 雟GOBms8*"S&[ h,gy^"g>z)#6^t*4ˋrvh<LzpoM%Vk&jgrԤ½P`H9m]R~U/N}74-6<3ض8MQLG&}Qu>]:l֒8?CeXBא: A/S1ul\M0:^^cDɒY Qa`^5/v)',/8*@r,xḰ;"$OVd VtH"\JlB<20pZFe)A@} Ggm;me!6ݧ!nkeG3EAq]7z# M"psiB4o3=¬()AR /_uPɭU|Vay]_)9IE$s2Ja1Vc  B' Lܟ6=TIpLJa$H\=Ba#!cH@ C*`h$ T_fiJ/_j6rXsbJ=#Fd5oAjWv٢(xE!܂N$VI@Ѐ -bbu$oIAʎB68=eRT ) R/ݾBb*=+3V`0L%fO?ެy1ij`z.ix^V^Fw^tiښҙMD|7JFOw')^ _-QK$$#XQ>oןfφp1N)}=Fi )Gݻ)9`)\/bomG=] q+i꼨r:4,6RezQ),D>^%kV9q:/,v^'0ޥt@U_ _ x-i͗v7wX@K)K싗֊-9OeCR#FT:Bۦчps/ҝ}Wo ¬:`(9j&Yo_(y]׮b[@Qix`,(U2xi23ŃGhοLGq=g#2fKʖؑոm,dȲj= S/[5C©MA1/@)&*a *'OC!5ňDQVmN +t@HmHf o80 M ̅ەKǫ~風#*Vu1t9tx:)d,A/4QRYK-4q]^+Ɛe`*1tie$XLjAXIjGhB9q֨؝\~^jvgk|[tH<e?ilnTLryå 3jg>LLj9+j$~PxTe|6xٻ6rWT~-hfKR*%]IoU{$S)P*9$p3v,\ӓǯ/9% W߯~')^'=z&]g}Y{k`GϏWg˥ĻV+;#atdߓE,,N?n?1|a[4Qq\R 1ktfBiL&TCH#&LF\Q-ۖѣѫȧVS6p}U1 1 CaeaT*Q#W œE, *?p48L! 
Wח_NKH;px֒6f׼qw/ n>FCfSLkm ,3Zr#&c \-ؖEy6ʨut~|D]GݺOpў[f@+:U[iU 54iꍫ%Wa*"pUgÅHyH;[Jv= #[{@lS\&dC c"ëx^=T6T7Yv?A`u\7zrΕ4oax Ði5iu筂}r= Z ]^ |&AZ)*^$2ڡe %˝eu{Cp2vBPOBb`Z_@<-9.sТ`uմ5Xg+Ĺg2 볿@|ڋ}}؏[R^t1^\=4r?ەxZMdư2lB2ک aVWM@番<1n;䵩xCA bakvWKBk(q5C=//o뺮uۄTϛkl-]1[,I~~p_˚GKm*L0oq6f/&FM\:=#j:a9uH6&Dg|F  *sL@%I?GSZ*Lx.#rn{b'ls\Β0 syC{x8(ǛT bP#XӆFa5L 'U(vZNj&1e}؊SUaRDb!2$Vtbkr.G(y \MVIb؝F2.ˆmGfDqfV| *@EQtإ"l,%j,V,1rRlIlbl|,Am<sL KI,i 8n2E.u^6ה|myFEyqg=ZL4JK ֣7XJet#w:C32pPݼfכiq|Efċ4eᗓe?L56?utD"c/MGQ8#jyn:rMG{+lꆮ\z+AdNW %tutaUV^誡E7uj(Wtu4t]5vCW wЕ0yj(g+RN|h VW s7DkjCvZ ؙZbV >y["&uɯ+EXayg7Bt{?';]UJksD ?pFh6Ci&jЕjKuDW nJ+BW -骡\ftuDtWQM~=SZ?mPvP?5nhM%OYMX5 ;UC ~tPܱf:"mpGt%n\׍&o7dg:B2ڣI] ` ]5n Z&J# HW :+`04,'sg;ђ7wfuj<3]Ygg|M.Wz| 'QGdkEՋk%;uBC0еMGt%P=.7Btut{ϊ`^JвӕjVWHWdۦmJWi솮\솮Z1PN%LWoBW8#2ƫ#腮-ɫҨrOt%\Ӎ瑱4uj(l #] W)X#n5{;]5@3]!]90P?tnj$ZNWrALWCW)= nЕu]50t-(Q&[Rw&T{g @{2##rv?_w~q/w"s8z} S??߻r#MxJ5!x&}u/ǯOFlkEےBpK)`C̞RI!Bpĸ i3t5*mV?iKtbd3t5yhx':BRRWyp#mV ʵ=DW/BWLV7DW_7nt[+&g@Nf*] +UU|-@OC"i4[Zt< ] ~<=]#]X{+ŷ{$ZrkOtut̆jAe1f3t5@I]_`E\}fV>BypZA=T<y]񉮞;֪w@;o^Y7>Ve齁f;p-Ѵ MK@!h\BWߺx}?] bOtutE1:%3Հ˛ho  X7d [V޿r㡫 ějQWn܌ojDWGHWQ\] 7CWlfhjtc+QodKSl)l[+uvsW':R/[2La;sWV h7J''UxЇ?> p6򁗉C+ڨdJ1+[wfU8ճBYX7־f䉓hjw\1]1zYqV6Nx ӓŜf9adN[%mfY#s| m5ciN Z^P AdCS ~f@[Zhլ}j(N-x+ۭ@VOW]!]m`+} v]Zk2XL-FX1+~tWS$NܣSA,|7#cvD81?h1fr'4mc.n|w۠;ܿ?\i?A?jD }&QIgdcz?0~󋒮S>o_Q>~bj|L绫K݁1x ~o!ޑ!nϕ>rv,ީbN*[zwiMo;dqiͫBH?w%oTջO܋ߞ\Q*DnuyQ>o?f|d0?} BGCs)@RMR̥fCl5ٲPR")7KU-pO=!P#1ݹ`ИKP^3Y*TL%dB1#m̤V'lkD@{06W}'M":K1T`JH. bj{ŀ6.>CX ua5J$y*M6% {`#Z3<%5'3i g>PL-}:oNB#NTRقLC))V^@r]mN"cggUâ;z7ihj-b'<%i%IB;v*%$XrDQ;S=EKBh jsѺkj*ME>`uR1-CrLLSjcc@$XXgݡ8TGhO‚XKͶn+u&EˌBi~ 0%0`)fgb{uPQڠl T9|xCߦ#5>.%Jr XC] HmmXBՍPX$^.CqhИ05k.B-4oR z>qሢlTZ{fDUhWEpջV(V*RO)^J\lG?UtpuTK`ؒdTlP` \dG%G"rflg(MW%Ce#WH.X(ex% ;hE5#'y-6L σ7ę$2&e#kq3}j:LMdM1c4 !JP bw pDUxV},EeH$YY& JN CkaOt{3, MLk$,ά!mgXP)JVMeOTB[6HjZ` pp_l^U*qakJ ׷lf` |3߁ušK٤Z% Z4rO]gQ^1b^X(q¹a <܈ᣆB=$OWuK[s uH]qTgx hN9hc M+qzPڬ=Ck&-DbRK3B$9E;ݳ[e砤Dݘ>@MWfUmZ k@2 @PD]4j5 0h| =+ׅiqf0pq۳t5_a=!,2\b$ɁnM{…0: 6Ec6ՔWNR X c64&dz # /]q$d Jm <B7и!wT%D0B @AAxk,Z|qq}f-$syG=]&*;j/K~9}? rǝ ;jQvjrU|z=ٻ޸$WEL_0,`nmls4ģefd[_E4ciEz*6l6~Xլӧ7Gk@sxR,ISA2I;3?t2yW,t[))z `@DofUǠx=͡|6893oV8?l5GvutX_>5&u .-¸} M[aܔׄzhsP?q;|/PUx7ώW(UWyfQ LJH DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@f@ĔH }@]_@VwxR H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@:\%DI p?J eo@֙+@7yR@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)W d-#]!\ӛ"NWl!wz{̸5^>9]jt\nhyW;[tew+KtuߡkqX^gAD3Z\_KS.1AlvPh iނ,Z8]P-L1ϊd(7G08kt]p1rܱdZYL:#L74p M#^ax8Mc`DHB(!U ˯x_tp]+Dҕ|?tpzn7^t sGt]!\kBWV1uBB] ]i)%=+Lo *BWVu޻B-pHzEWXսYjGpBC+h}g 68jnH<>x1LuO;]X/0J|:{sTA+Yc'I{ʜ)U]2X.ʊy.XS^$xn[Nq\;x߻+=j]i.F`'_#[͢;1ϋy1P:Fk9=+ fo Vu~QjCtutZvypuc/4>o3<{zj7'N;-gOn(eǼ+]9=8Z%1\KLo6sFZ9vivKZ,.QP֘{1 TM>6s*c_/Ƹ~{5_do,7agQ^Cw [)Ҿ|ޙGj5_aiގ'0^Ј9+g`1r߻Q|z>.䓷 *x V1y6HsMoeֿ] .48S <+fr[-7~\7{??uV2sU*2_Ve-pV%*Fk}\d DUE$K0 :{[IA ׮No42t|9Y5+U7.cNQ?G{%YZ@eiJD--҂0#BV ]Zy QvM"FtUJM^H˘wp ]!ZNWҒwut,D} 37tp= ])+|0(':@rzDW]Z!D Q*Ftute3F+zCWt~ QZEtute#'UDWHWYDW+BFW]+_ ]y9=+,Ȁp ]!ZNWշCWCWs|k_=]IJ׆8cSWn5鰜 NNRBE-\1[X[mɻ9Ursxp}EKS0eO&|Y cv9l4@`r*CUyT|,%dxe2s0>1 E qh>//Rjsb2Cќa+gegW8K/cxv^X^['E;6[˶ u='6vvr}3Yxb\5cK7a ){xtst6|>? 
՚t2|.ý80ǜo(n ,D̈:hYJ&k0@WY)Ty+\L"0NUr63*7f5+f`TPǬkfD9 (y̪:9+*k{{sDo1|, Oj82N/8?*$DcMB-d^g/ZdG6SN/'mTzm t7A\[jHk2[6=)+?݅aN, .:'1NsHV&k!V#)3 (IE#pрBjX:̇{s;6A}N)#1vfeDgFĈĈ_*5&TmZ/T6F[Ld 5Y٪2!8WF3f8k!6mgTEɆ[]I ee{fĽ?f xqV.'W:f%Eg^Tċċ_\ѪNCM,b-^Z!Rwֈ`'(X)!cbg;eCg>Gu6fy>k5钍90WYt;Ю%s-z77ࠣoO/0"ODiNqc0)CN'wEӶ~ZB8k0XUvֲbP˂NL].1?|S@P`Ͱh֮ a}Ζ5PPȄe˽/xFChahQ75YC2W 9/O-1S`rZ.=\;ܗ'ei5O:RYkKZdW5֔yisiCiiBJ`COJ.R e`֙WA UկJT2JXkc%wUʡK%LA%f"s!ٯ;ANer|GxrO n6il|Cq\ 46gT_12,XVt!)!#J&&b3u3Vka~#蹮+WoxxRF >]fTs MG#L:}CgF?KoO'y%\o>9S,"qEOYB;3µ/gDMπRwms?`?zqv/OC|X U0/>lҍ5s~NF<-l'[=AȏjUs;~?|*px̬ x7Wjx]yB>?l kP|Gqk&n\z#f4~/x}uq;رuNKk<*>᫻׹ɛuB\e"4Gm+"TdIٜN{Ť%jPf d JH) 1nwZ: <M͖ӓm'ےf{m]}m JqWN*ٖxNeZےdY#sџ Sii7OP Hxx݊×lϞ|xGc Wu]eYW#ԕB N+|thK]WDDuцӓA6:Zeh]BJ'du?:88> 7j21+X30 R]ET{8Z8JT;D z:MF">1 uErz2J ?M/xOg[Z͈r+'MKJ`hpei"uMeY#4*A0;>J'ҕ#T"JiF+Rx 8p0\tE2A0Q<6QN+6"\FWDBBJ?xʔu5]Yc+NsWH`p+5?%2g]mGWHխǛ j'\hU+.jJi pltEhN]WDiuU0AFWgy,jH]WDirJz9*8* o>QHJVra##t%zP ߥ~fyT`\b΀ W%F'.BsI'#<ʅgx4~TAi' 9,QU!*/_j$!6`3F\#h>& !Vx1D AT}̂$5ځQn Hl`iµ6$?cﳦǣiEgjFB`FW4]U(ʺpzΌNY6"f^uE%hɺڊ7*|RQ8f+CW`,hMH]WD鳮ƨ+@NB`ױZSQar+ڱ:6AURQvQVtCUBș1L<:61 lb҄+=KIDisrV+7pjJC0Z1[mp+u");!j2RTWc+u6RɬJ kgp!pj2]RWNANI!$]d3$ZRQFJ"G8ĖBgς&BbfmwF; R4jEDk %Ԏ͚ފ,#]plt FWDkl"NaʺڊQszάsf5%߻"ʐcԕ1V wltEM+.u]ν1o,0[0ltEqjҘY^0ltEQ?%ʬQ [Si#]Qwq uO`> j< ցwtRY@zH]WD9H/u]WfЛo!u,QNHWqT+uӶݿmC\_%OmzQ6vPЕe)*tJ٩&iwڋJU*U4=j-6]ҕxwt-oJz0oVէE8y>?KYwDƿ}nޫo~"J.n_UYV(B(FXnP?L*cFuGkleR\ZJ 8JaK)v.$Nv6`Fr;fiqp4z2جjZ9j銀g+UMhN]WD]u]nMuaz[ (Ⱥ FB`HWGWD#JF+uNAFWչtE: Q<w^(F"`GWk]Kw1Kg0H> 4֙uE>ƨH̦(``+5lVfO~0Hd]]5Cog"Xv<6 8]vܻ;ʼGZ^k+uSef*#fQӃI8U3v[r9aN s" 2L_ r2nrNR~_B 1c4&+w},r4R<[ t$ (pg3A^?ur++HMn`3~LFk(+ǣi-Y6"\ iqlueDPtEsfV tuetEFZEWDm꺲@YW#ԕg+v|tE^qJQɺ?sz2BZ'\"Ju5J]xxIWךz35幫+fL ^K-i/g5e͛jiy~/]r5Kw߉_TUWۢ>:?7(6+mP.?<%N/.v'%ւf|]O'wLmQbYXVBrXR,M)[Mە'ˢڋ;,?/kR.<Ogˋߐ!hc)_{Ҭ,est'jwdQOa~^yO:+ʷo0S9xU؈\7cxzjД72ܮbpX/ON6oWÿ&?ώG|u~<;=Uh:'Y!t\GWi) T&UZFEqհg뫽/ZJZZuS Ƞi/k͕ 0FmR r8ck]tU0>o?% qMR?/K쏮t?^t0aXM`:ᷳ]rJINbYˣHũ]Wh%U]*8Hm:Uv?`v q} I0̄ao"'zjA`;+ݟ[} 8/.l/'y~KTm~Te]?Ŝ2m^4}X^KGa0m~yݘgW?Y SNۋՃ^m]8rtR0K~oj⪠OXQղ+VO7oeN E):mUW5VЈ5mke_NҹҙZr/-*u 4eәF*ݶn55~i7xmxnKE< .< wyܼ,Kvwd3 T2+ϔZK׬[ZIU0dI1_ 63:;Jm䄈+WPe눺"*vp9GOV |Q{wNj}9Ib5\lF vp>Krxuo40kldAMl3)!'@~A;q 6\T 6CLY8iu0z?(Y_X'^ANz`lq_C\WPLˋ_d+G{ˋ=|qNڻ[si>!WMхBZUdYʕޫfN/N\br$(Z׫n`Qݪ}͢.kUjJ[?TRaTkU-ڮ,;![%Z||MF)W}}R;ֺRח9..WLM6?7wB:%ۦJSAg PuP5] TIUa>8z㋽>ۻx#n>8JRߝ?nJz0kjqU>/&/U_VWbu lFu 7[+ݽ]~X`v֭u ^B~\޾7?U|5o`R&ٍ*NoJ1س pxrSQ&"}V-}"Q6 NBCWdɲ+eDߥk/*U#BMxjƓz??ٔͿ觝}, vd2/;AP$lٺ,G/Ke٥%:Aґl?~B[wh-$pSc!DRBr=>lrwV(&-zJv>8SຂX#.0w*TY9Y2[T3!,VkL iPr&7^!x Qo_S=n:hR ?}=B?[ץl6wSj~6H;ޖn3 #sTϬ]nņ7"X*١;ܴ՚<ߐ1-myl?oGi;='T󏯊)DKHͶJ,&׈Х j1R_WE5[@4>HN("7RBܝ|=7|}ȳ jZȓ aL\]Eծ>I7:3tò Z3tk#)}CPB u5ᦜ: 0,й话ԡٜdAT_;aY1?=! !zFh Lr:gm E;U jJ4W.ϴu6U'Su ]K:^z5GG2&Xɮy>k=!`YBG)EuK!u>k+lԐsݳ8\BRdH>#[:I/qV:iDAr"%YXe-1@`f-5b2+-ĔkDHx^GZ FZ'xeTmv?HJ0*R}ۛS% yf:'{M70qQ:=S& 0ΥO7RTy\]U9KgU0RsӸ=k6(5nY* zҹMQg`J3lE1IYK3͛.Kv4>!v>oK5 E7=),"%J ۳rBOaGܞFWO1<+cr<ԁ.P,Zp]q :,y `Gu_#S.kMyzǰF9t6͡?}u{/s#2("[z5O;ྛܼ 5g l_Ĝ$|@4o<*B ϪK'X\c9:pswRfXr^J"׏Ow9@$[бG 7RU`C B4PyP!029"o{1 D..loPiYL׮8jBG} .ǗɲN'X?A΄ipWvgڅ[hz iFe8x}-S ڿIwlVi5? ʅpsѿ߈To1(m>v9C:7DB+)"Ġ+kQZosv|^oxdv)%TiVn۾ c>!%S؊;(~lm yl eDQo ƣ,>Vf'GQ~ۇƕ凜u/3 .cft~srt}KA- 3楠dDǚ5ֿ4 Yn1e//Ւ>=^VGQdT()S x2?CՖ=.]T9@ıDZ4wa&=qxq< ~2&upDGCaͮ/Z[S ^_fi.~$,HǑ@2^w~&V)$.q!VG@ԝ*NQ־AjtOP1#4+ uXCl ) gŪs$uP L8'ov _xuQPkKy|LE WOD- 9M3eD*BqAi%ѝ_zxbDO|C<3bllRhvRgN"ilP[E MN[:'*a$㇩ ;5e0I8']>z !la$r?u'"Vʆj0XF5]`AP1{rtSe5}%L.$ >;Fh%ywg ’wкLJ! x0TpSeKg4LS8:LJD? 
C1̉hf%#zk븰>N%C!R_\x֑\;vʄez["; 卣ﳊ )IuũCNr<ԝ%8##)b%`";aj'-ȑtLHGLa)]/014$g;|C$1̋QqȦ޻pD"]i[z'Lhyw*MHprGb#Q;L` 5[ gALWՋfa " y~ "<ޜm5DE-ւgq0'Ҍ\Lv"MG|K1X!0eTzny %G|ބBpyvaQ1S)RF[A9MUl(g|_c@sRN=;"A)$~= >"qH$lcxgL#Yqsdi,pLw7ex;?Op`<#"MgAqs{8"* +3`d&?* ]]~>Y|3-gB/5\UΨ;$ B*EcZvŷݱܘ)G 7DyE{$IPhE8ArpGB"%eU<'*BP8غ ]Igxp'oD&*\ 'JXkQO M,`WM˙`& =O{ /.tTV P] @UD!.L!?6gm3>u(PCry6=C~֝U bw*/*Fm7Xj|@2}rҡW-/ &H#kp?*SxpKDgC%W`+g/5xjF y@G/ge~A֛9'61mo/ֺ;r<=Ur":LLHPeD9j Ц\a=hͽ=ڈmk*OD샪e/ AR׌ P":99Ls8$=)V\T^{^TŒj2YSr4L@*SMP@?Tjd;`[=P%T:I+Sum$;!a6_ QlRXBH*&l=![J72o9&_TCnŠ; R!@BFU\;>Zw8\KQ '-i6>Ei?M2C3?9 IYS++ زJZ6iz`3i-3ZEUF䗘 ÔHt;oOBEQK@؋`gK0TaǓ^ب_zy[H{@b@~E/t|ZH4QcI=+C=3]/PɎ!#; _P]k q[hҌ ewOw^uo_vÞLbcK :xazr[w Ν4{ׅgtpX2*XJԿt' lԔ@Y mϩmv8ds.LVU2s&rJTBdȷkD b,nb&Z~2 !{1f?@֊RRB KCTi8=:tWY~}ng;z0rI/PؽCD1tA}rQĽ>]a6z?QG Vnj"^HnE|(gVˆpI@͎h%n`~1X0h.G|̿Ţ[ȝYb*&deϗp{ +<,ou 24m 1߱"j>6OW.lCIŋ^Α!p`*J!9Er^<;mdmY"n4zyc,908=V@^>ݕD매{ xVsEs~MO@uiAz xi] Pe*(e^Z\`?fG{pe7dVfm:[EWlXr8řcpzZ^ -;.qiRm~XQDDC߶ҏXK̏A}UrXuSV*wZa $)5RXYS=^Qa^VLٳWmFo,SO\otu*(ژ쀄f ͟(0:,Je= $ka?^YC,h߿~&y~OX'odCT91yyGDHmfՄo`5s, puD_H!+ :3#msk7;tpƢ6{[ß eA!PV4PDLRxY1$H0ۧ"Nĺsx3c ]/nTkj'*Dx(Y~|jQ6^~wܹb*kr_/iWm?mmO'|8!G0 DC^tc u<ڰ3lsŵAE@#d 嬞qH9 D<t~IE&wwG 'X:? 1pͯ^L$'SjSVqHz=?*$\ĶDcPl34Dpb%=~*__Kqae-c2@qX I3Jԇ**.r{ƍ$aw]5@p8drw{ dq)&izZ,_S-Myd%ٍ-MU8{K+Lf<( *CX]G_ֆw[^vx*}jY|6r!+p&zdAJO@q @M<~*̤,R2R%l4oR wTP[&VN HCJ{2K*MjQP!U״ϝ1eH]t|ShJ;lO:(yc&X]cf!L `1&xpUZ b`NB䱣~H 6MTtgĔaf@G>z biѠ!^hi#dN᠔"jxwtrj27ZHEΞ0Ohx=1mE/7J^ъ aTzТ *p" !@-Z[ .5s$ HѱXh l%n^ jNʚ[\I Bn7}̛gS!HiQYC2k$ SĈ׊K?V-{*x앯f=d7"ft'$;J']Y\ȋ InC0읔l%#ou9rk}F40~v^FO\iv3M6.{7)"Q3fg_I:: ݥ+3 x?\ngi1b$Z!|Բj{Ӑ̓Q1{i(LY2v F\kEʩX:~}`bh>/9"D-k=N*?,ⵖb?V9^k6|K ۅqV޳xEyܳ &&h>R5g|(#X,z;/n5\,s(7G;Lg5`/쌚3Ӹ ~M tUs9;b2Zܩog <t\,Ty |N~̛@QҰEOW |E;fsozxѥNIcOoM ݰ?yi_ IemtiL":Sv뺚p =?jؕ$(t/Tm7myZ^cxszj7?6~%YĻh v+B~ ؉ t]9, BVkU~7ĸ.Ȣ?|?P]S$[]D?9%qW]*8-aev}s=͠odw4:bQg1(Q>pt~4ZƷvBi2J KCRevޗ$O o(b{ȼAvƃa%יJ)3)LgTE.0L3Yz69r%ZS^8"ՅN&p"eȅTn$֩g ``pa2\5=F7|^um:`A,ީ L*i?;m~S;F#A zY_Be,yވѝAnvRF+2H ڏn+h6O%Y!S`-nI۷YЦýdavu]lIIb4lƌw2Wf0kisȭ04PbM6}v?'FL;lҔ,CPysǬe^jDoiijO2W1RNMū?wUZ b`z@Ҍ[_LZ A( *^ؽ!;ԍ0C? э9mK6u: GŵG=P(#3|L. 
HGAF(Fi,/'3>aP:~eU|u7^lJ^Zq: ,I_Ƌ%3;*ݯ_G٧Oj?]9S~7 ݒgo¶ߎrDsG$J-4{%Kq:(v4+:b0^Oɰ7=X'R9wE"M #6CKT^P%_4 U{f(\D W5Y%EFpn:GKvm5㠟0;/`rX Mm/:nsH >sXꆟxzY+WIe7&sFОr“`Ӿ5'^dIW #Ybp'%ӹ2J$ԜR' r?/+}_ 9!4AQ}8W k5.ʗgxǟ#-( :{e}s1vq[oXBuTFX(3 sIzZQc CP)Pv:hs,4RE`/4cW[EPm4-YkLkXZrM̷~fRcQ{H8/#p*7?|gKp uߓKf(Si*zM笗_|Wj2퍧W*tO||7OfoJԜ>wo~oC>k)E#%|6Il>*q\Gٯ8I5|ܿ/5}EUOް7QQi"?ҀwJblιVhЋl C`גuRV25vz0ڹM hZ-a)-M=]yF4*Gў-I^:s1kS$Z|H%3+ Q`n~ z SvBCid{;XT72.Hiª!Vtt<qT^ j 촯6݁ܚNo_sC2]vd'4 1Aa'g픥ݕU!XgUF٨,(\bxcTT@]O"x-T89R2VAz4r:FW(2۔oE*RbuB˯:U-LJiОVWt̆S5#\sLnBBMԴ,D-yc)ZH!Ĩ+.ZKtNtcT̞2 (LY2.DHkMH9)9O!Q9$-*k卬n]9~I] MTN!v%:>=\ +hgX N%V<%"jhRsh0NSBF2{6{S([&q[L4C-!L^kv~֛ Cilaj|39;l܀8J)82Om5g|k <2[YId5g&tw۹7o H|֎՜q+PFv}L##+C*#ۥlE# j&u&7)8jl:IRʨ34Nkx&˹!.VFΤGKCEWv E%bҧFH蝔~BA%a~_"mGsxZVsw̳L+4 Zix񸞸|io_2zɢ\gYQ؂H焂rMS`,-H/ 'V ԫFyo]7wXd;fi؊ښ3 DJeh*0.:֙:Cr5M@`ʜ%5gNΒ,,*uyb>粚n!LEW\Ц cxYyi9ɕ%e%5g1:;Hik28Q[#F}K"8ӧ+F 5 귝cqf\=N .87HnrYPǞ֑ Cz.0B8r]Ƶc߫SjpNԹW\Lq 헥"xMMKd~;2miB:<~Y}uoޯި7W_B9-gRH̨̙Ҕ9Ah  4=EȚEUg@F@i=ҫ~ 1]W3>K+)ۻJ>ӫͺTLi6p+IN(Ny\{Y{Y{Y{s; 2ri{gIʕ.$V9+YDG()Ȝ9[^?{6Šb]CowX`L:Ȳ $۔LJESd'9b}sȃCd{4V7|>]ݐdQea)5mf 2;A/Laz3Ƶ uɠW{ {c[[U/n:O'}|H]ĨSl4\v֏Wme U@X;[ ʲȒ(ֈp#cnP1ױdMm(nn[W760uT=N }'%p,31QD#)5i@KxiLj{)t B[B:p0z| H*+r(Jבq} \e\8F7 »zNFXn&ۃ9󚵴%U6% %JIQa4BwK.ZoE>nozA`=_$l:&eԑuOy}VW@Csq z;q+t L|2!_1~;Lru2a|W£7PSw"j h#lNz3䄲p$x*S+v /ȎgmnOU x-逿p\oۋagb?%#.@mo*Cy8yftJM]S Wj:Q#Y5e;g vYϢ#^6<"rYw%c]r+\ XR\>x;a h<mVdaI[Ub*W¶wefȏX^A֋Y^ do JfFL"W n({D3qL5ۖ3҇{#abKJyVCZ2Uڮ*g+ŵіZT':ULop>fW>Ѡt@ƶQiE"q~yR_ɡg_KwA~"*:dT4%%g,9 7R F-F, G&VndJ6[,?Z=_Nhf{W]6r{d:+͇d>;m8aO9T_1|Fߓa Z #[?%?xYQd&so]8ݹM %=$J/h />Y˥y1eCI>:3jAjMGzP] 'Z#xÓ/gCtj)@Xb&\ϓ׽ I L%+9LcNUpD6u$AZ27EXyC &LbFw7o:1)rO.KD@=iN]'JoYʖN)h 3 llSx)2!hH,~^MBw;!b˳}W9JHC>a$ڒg(P;BWg];>wY Kidc!tnվ 'ܳx<[y2'3Ð7%KS3HetB2p#L9W%:!i$n}هq-\!ۼwq/x]U4T'p3 h0!q̠Xao6Z˗6^k~8O_XbMvP Gfi>u@O]*^(FCn¢rXZiZgS;iLH;(I:<~si@`)b`pV1G(/#L kwIm\s4?|R\RW*g=NJcvK-ѭ$Vmnl1+7YJn1/: &;к5*4g* 'apyLc[D HzIܼ tf*t̨'-Ej)l8~c@θ ( ]GԪU8>]|K%ooBZCzG'j񢍝C5EEX lj]2O+F*QObBJ},2`\'4SԘ F8lT4ۄa*~^(r=_KU բc)G{xcg8\By{hf0lG"D H* flT;g(Dhדgo VZ!ͣ-ڜQ.oQ@6ݷeֱI<"+IBd0`;Ӏz:# ^̘h*j !_=˻P<78;rgR`H: M6U( j*8Sm=2:Vd}.'[v\f HiWSR Vx1, kG_7vm73وA=rq80#vңqᄲ$C+ra d3eb4OU Pr~ 0FvF,E"1N( PNu[\Sz S=CFF~itDNZ_C*q -:l1DIMM(ۤ;Ѕ#ɛn +T=KQF)vT@7@k>uМb+k9P}^-0W(|R£#CSLRdARJhlhY,lbdVʋs to+[Jx\2@]WƯ|_e^sdo -D<辘OuP.")l۬M.);{$R vVn/ZU£A=:ߦW lhv:C4vS1؈H-9G m29r.NmDߨ (hn .dnc7.;xsEڕ8}YgTn܍BKxݻ(3~3$eW>S9: VΣH9"~=&9QPL \cAohC`?3pľ>o]|Qqa ah }F~RJpylQY ,(Rֽ$ؙM_ZLi$(I1Eg?}Mo ӻK~:WVOQ<&^ KUѻSjһŀa6wm}-\-'ृц2"Vu0dr8OEkNP$H J4H:aðoDִmЦޣ ~hPOU hʐkl;bx={h04d6NL$PPMW,6E/"¸]eA)"[a&A /?EL)"=焄eX4_x/W !_UH]"QatHt(j.Ƴ*#aB&YxK32 X[tvn1(܂%i2>W4V$E/7B Tv&϶Va+:𸼲 I[B+‘ V%<nJP'f:=ܫmnЉM&iV)jweٻ6$W4fKٶw0aAiMXl7o$EI%vUIITŖHWTFdƕ_$!qZ߼^0L^b@ArX].ZlG?{ 1m &'[ϦA8I8pR\ FU@xnx?rwiR]?㲝sA9R)UVp[~FW =ؚn2֌Ɂftfv,gdtunCD^?nNjK!Nj^D z֩YQ̌pm{6yPic85xf/Y e ՜G `?s9߾_G8I80IWa6\i:׈\ƒ/.3r ahܠr7}ˋ <#t1tA`NnƋjn>-^Kx~v}ޗV^[SxUOC= <Ӂ:>#Z4 xTipY=-Y G_A>n귯NŮŷuʄ]W\,1[qtjx@ewPV5ӗ&_(Yh)L_ֳ9Vs3G mx(9ĜftoEWW7 `,=^3!YDeyŦ\Nҳ $QQM6icNrgBoz x\Ƌ-|5Qʇ`L'"W?/_!)r""<xƒUxC%^HgKlKU:)D?:E7kVfH pf xxa`-H9d\eZ/]5]Oa*MXBh>DYm4_M}*6l[9eVt0yɴ孯;wItU1ƟU>LFŞy[ȝP?W"Nf3T E‚촮 Z%{LۉܼhbQԼ!ʪ@5NVƳc1?&{m1@.~jv"]fGM4Y]<&-e&bEd֗vQ(@%Bl hX¨kp>=]lW5w[F#uZ+Ƃ)L@(Ye9ېn^lٷwe9Me WTO`S{7ΑžeR#Q.Y& FGqQwGgh>Mצ|E>N&fo:`X2:~dIP{ȹuEOdz\o J"Y-](R*)ͺλSM~'[I[W>(D}D&%ҝm@.nvcHtkcv'Q_^_][ !no9zphNo|@G= | =nHa1ik_z(R)י%o:ZbcoG^~ XG0j+cM1bZ=dRc*!bE=nk2ݲʧJrLt][|J 3-8i5n)㾲-鹩6xHgHQ{yv2l:;'HKẽUyH1k0ܫb%V z݄GIVgQ1 v*H-'UPJ# de qQ2&kΈgNb1-;E6 Kgҋ<5@%t:6lK=5װw ̩93zNsZ@>w?#Ut*%uR:V/梍"RPUXA_Y2NzM4V ^`FjG3xT&nR[P#Z{_]hK|Ż^QMRq`>YesE"NƱwⒸhLy-QE9;FWwWwrGȃckB{]fMKT[I&d[䩍$Ȅp9=d9thVLmI"C1B@`9Q"MVZ[!Z#ulKlK|c 
1>͛z&{;4?|B6vS~ge70Ss0w@ _ԟ.UXN;m-R)RF*M_T\.ZU/QID"^"Á(&T ߖL#pt無tVa w@5Z~t+2 $e^YfA"<ڱǿsvr3~a9\EXI?,P sBFC_ŽavMw#B5qd^X9}q,>n%_uN9d;ɦ-V&#ڴ+[~:nNVu)M$.QJruufap:;[:,%FX98#*d!ZX[V̿eADaŔf$%6Ilw[تdtZ"pSNQmAd lse?«*u@@o+Z:~rpBҢw+b W"aŔIVS\Xݣ(CiC14CpB(F ͕TL=*j19g3&Q!G9T|QH~Q[KBZoaԲY_U߾ĝi8H{۞M#NtlH@p[9T|O%#䡶AZN\8髏S^Zm$1~y ӱ+4N;V/zNօd^}ׅYxkh"Omy|&5\a"U_LcآDf  Sї<%kRp-)[qaɭ1[)-.6m\+:,`i}p_6|P ^3Y?jYX@M%p| 7&sqׇF}ӶjS FtYe968G0^ ۧ+ >AB^ $ G:,z5@RSei̍-B {@>/:d?^ I.}R>!$jOy85>|Q҉Ռr+O0aƩW#gQE*g/z6DD^x=e{D6P vK{۲`b78`XbXb&{_Z2IԸxZFIҖfQ@+fnDžJ-(< @R*DȃӁLDC/1`:)R:'(D_h%Nh6lmvl[!5p "18<̱ș[-sVjA"ت73plh0K02%+]_ub$ȼ ̃_<? (!dk EˆJVi"W'D9>*B#'9bQbe1(f.}K#Φ?Q"H_7qm!xφ` %$ 1z)YL7˨e<5g57օkC8GwsN1IwW3U%h4PUF*"5$a-AjSoiJkK! 8xa9ltҷ\QMp.y7]lx;U(k;ݚ43&7B)Jmϒ_xCLԉV>urq$,i25h݌eD6cf#.ۖdВR>oLN j4 YC|ȿA z=}]I OF -A~ǖ}(l5["ۗ2)}T $sڞ㎵3Hy/O1V`*b9V1E̽5{_ I՟9ϜO_ߟ/.lxg$]9OީM/$ǃ?F;}7cB9]cv8i O K2f܎hŤ8CS10V}>15Nt+̾$zV>b$x$E-m9d-x19Wƈ(+ۧ[&"9&!ɹFo_b?qy`KhSj|I9B?G9sx4]E י\xݩ"zIFmI%; |ܒĴVkẔ pH2VDǚ~ѬX/pbW5?,j9OYJ*lRA跕NFj!/o1.qcpѴ"$6cq'_aY=j9]|+jgF%ن9*,nv; ϽW`!`E0f77iXBvdB\kA+#ZIG_0 %ok(g}d@-0Ɖ^lAq}U@{Q*BN$P$#kʔz[YAXiGc#3lB-EbΣPn%̒)ܶ Ϙ ZƁ]~Mo ͠ AtcOܫ1sB%dM:݉:;{&>-<ߟ~^ޘD8D |.7q1 meHj)Q=|P{Q)[I5ϙ ~.R"!A"A9٠{G;AD!$VɸQΈ6:1kDwX4XFBqv = $v3bm׎ l^s#sgh`'8<$`.AqciVEca =K1_p~96|E:B<YUpBpzMqb .fL@} ɐ5Yje8VS!6e cuC 6ğJlo1It‹iIՖz$5 +T{QZL96+19Fy~2 2$RHMk{g~?ac7ۧO0kco2]>7]- a*1Ѹq갂kOJEZJ%gOwΛ\>ϱW>t;/c Wq؄_S:!O p!0ow5,d{iKTun爹F[q~< 9B`C+-1&c*~͐g@}n6ɉh[K rwf ?qBܼ(go{+vzo[w蟫C|.t{uiEy\uBc.շ)S}K!8bg>CڂQjFwЀ`雅Ǔ7{"$>~bf'b }jdB4~Lۺ+дYp4ij(UsQ1+!¹+4KWyy}8ʍMMh+ŘsJ'<0PXK)O櫋5a'R {%٘QVQ4K励Mௌm\.AQTElo*9'!H2qZ* o jػF&Bcf/Kn|[.<Й?.K ؠM.6$Ζ]Ǒ&@Q/5Qsx5QsxZ9| 0J,VpfOn&;|lC󨗱ұsF۟iP̧>6MNwWц]id(-YCe 7 yɤeeN4e)7%G=,ͨV Cjmng*%zZ6;  ))Ce5@3f&uO#2^sS5VmbMҒi鐭7M5i.r9Qޗ'!U\7Kڜ6FDwBeX˔]uؔ%:Rj)H!.ZvLfcK?RTMjA R^Ky$!z<]]QmN}7sCM\WQ.SD'Kd! 𩌨!f$DF:@B&H*Ⱦ@piES>&;NO'{_FiFiAY!gd.=AWv O?SW~k!Ck]Nw-䯊֡'TtѤX\u dcr%!s{O^ /bSLՌ Jǰ)I!.A ;0Tl;.A%HhTYO;HAX^Th3%p;lfĆ UK.f'É|o'3{1c3 7n<0;åQfuGEh*b}TbϋH, [i֏ST +"'@;Ma8s#x? lHWGN;|ExF ޯZOM`˸Kzn|5nmmQ`Ie\w _ `ߢ Ƚ++2ؑ_fȳ)P5*[;G"ѣuWALv^+[C^B! Qչ!FHb\,]j,l%h3vQ-b<쇾;y,Ζ~/6̑3V~D@~v ۅo_<4̭Ngy+|Z|Y =\]}A11df3qaݎ0[QHGϺ w|flC}w-+2VK4s8'<>uJ:-9(}]:{~7o/p]yZ!An"==:;rn< qWI4Hb<ynfkpHdy?N h+o%`f![z y>r,JaS~a5ye_-[0NgONHmwٸپ#OO8=~HV#fT3jM>@z OwHx@Lg7՞ZcfK&<þ ,y!:)[7tӓ3UWj]US/eFrvYs4a.>2)un1wۓW,=WӅb1_8UZ0"rVwWb^6]^7qu@s쾺:ћwx6q}䩂ďOTf,rE[#N[Fc?>36R@87B^--2Y?J ұfG](=ׁ-.3S/&:\zKdYSEUF#A˭x(XB 7KQ8RBc Xq,AG!__5tT]iM&VK#6B`9S6G D8Uc鶹WQiTٌ%]m dYFsjA:tz:QNJxeS$g%T_È(b-]zZM^z:o]THɩSL." q/t/ "dUy?.iđ  *Ubqu` -l5I -Uη&Z$̭ăVSJB5Vd^"rq!X sE Lߟ+P}m#8nP:EsZ>2cNά>ɥiK.B$pޥ7ԝY"\krMBTCk6%!>n2s1gE^rϳ ^H1OxׂqMRJrT^C+_mGsƣjhG,#8B(~M9GvO^2&yfLL͜čdmH"fL;G-LB1XWvr|$x$WhBM@ BXȊ+iJNʍ B ]_an>삷nWe+ ̝ѯ͋Lx9=/t >㳱7b0:>v:T] g R7}^Dqv>U N0__#wx8E$<ʯ~=~5)w+zT~5߄9ʯn9D;nG*fXoh1Y?:aq8,4l^>ڭN{v{XM3G* -37ШBoa'YݏtU;7Fa m^cg"yY 6}Zwt8 37z{]Vdۼ1Њ~pd;r}c@"e2u+ӊ3})޽Jk?VcDؼd8{ WUڪTU YomıC TWӼ~!\C[=@PlmpG J3^eRzwN٧gDַCo+RY45A}| cTňU"qV"(jQbKX]Z )ۊ {m,jR {_^^2wW @uORD3y;wNP~H w\ӳuŇ=Tܸ{SL M1}B$ī9趍n:%f.:QF{{RW5%[J|lBCJN jTrvk۟'p7D,v~Oj<4(Q;̈ٴ޶]96\Ep1ĸ. 
KB:+qX>BĮuIGV?|کoŊzDz&E"I%z`J6P,6mZnV0kO{L AΨ(Z4Ba$,p/OQktXj)* Wթ18ZڦU6̃"=T5XQ!ƃ'Gɨ"%Nʯ$&tB5مfvB%֛EN-H(딡(e@oJIgv9"PoK4OW9RTgk>Etv-}w KͿRzJW&nN6^i^Oݹ 7@U횸9ykҸlO[Px~޳=K={ן{Fi i[L7Xϓc6+B8gi8pE &TW]S-  p,L+]T*gq4ae~EG<` Uʩ:Q#Co|zVi`=PT%- [GM J"9"izzpnE2EiXFӕ+XPLʨC SlTlЊG +]˪NK-4Xm积fk1*Qټ<]ĨuޛcQ-BƤdiKoR$* A:UgWb DXbȰP%/["}V[}cf α90!g߈~E'?Yxuֵ7: fJzvH*.7|tEG.*`mCrt[v{Ŧn;Gv,)21ڛS/c`i=s#Aco_קNl} cK֙ }V!+PRWmYD5l-BlQ*gynE?6QPQ rBNVؒ֠7HT׼qD?[Q+8灶ˮwA Svh{<ҭ쯂׋3oNva78 Fcxk6UrI%?{?5I2F&q.mdϽoEEɵKV Ն5@b[d9{RmhиZ5%f!䎕 MlG-&V U_j'c6cz㺑_1 ڼ%>I}Y`ٗA0XHv_Ŷ:>-mwlXY(FU7Q8u !56z%d沸DPϝTx1!A0>]o68 b"|;Ŋ& YꑸALdK HgJę z- xFr{NԶ_zH" Aհޱ|6L2i!C GZiPТ손cBf j~o!lr8]W?(5yU8iv3j> 3+mV!.5[LSP u9WLd9X欏_ eM$!|'<9m^ nPxQNv-aN4^:$-9v`q2)u jqby~?i#@E1\ʡf_&ȔhǛ/:9phd;>WP OI}\.y32SȢ3Doz 8n5O5(L\)^GlAdZ:3+, j1sb1>E招uTbt Fq֣}C\sKN~-ebtV@+E( ]ɾ$&Lo֤6^3ǺAߓ3vI^/Dwz}?VHzb s.ߜ]%PgmIINyh(a%Q}.)6-9El`K̶io5ڙ:KRX[Z<pOt-'l~ b묘mI|(LAnR͢+C5}ɡV6D ,(.,Aosf_wn뗛!r=hFLd{5!t!$HEo1AwG5sC 64oB`oBIho+쏌:[6&!Zߒrܬ=apx6 /#>|^RPnϊ'(2Dד2v2J] Xr7{M~}_k~Yr@y68(=i&̪UP{&^FdV̪=Q娷εf%[&kvY!yc9- rsUpE kڛ:>.^nKnrx y>9vDp.x 9RW֠hS&<#Y tewzȳf ȅ}薐#s { 9rVMF!Mw #%EƁm/}ș9Մ}% <a:1N뚐#szpN%_WrnfG=s7t| ތj%K-ZkBn/õs3H% ^]̲xnų5v›E/(Yۣm#˩?.姿 ҲdA.W7a r<{/hSXY~|%nhffk=|k.J9&ͼ]I}Ay*ūvy~*_; su0o=fQ>wg¹gWQs{Ğ؇uYF5 }.h'qm=٘u4VMeq_pKȑ9C)熁B˓?y Xؽ;vs\7ߜs>觹޷$.l׭<ˌB]/uJiO)tz)y_y6BU6n;fG#s0:iFҲБ<&cؔjY^n곾p-QڈYg1}LSL n)EHKx-vܹ.gW7?mlxT.?*dC)d8͜j(1Ըp}ZR[mEdydC}-\uk@rzv D'V0k`} Ni uG62D|b>E%Z\ <%UV BE5Y3J9; ߹O7?P!I1pM]IjE/*la!Q.ZX=-;h9b YlZ&cštgʊsLe]n@Ra&q*O$EUąQ"bp)'ݦtDE1gƒJmxMdQEYt Y~+$%=V(ɑ 93[(,?)FV#`"T|\k|ػz{Sjm=䙔;rx,)%}h63c|G lb2U8LD[N˾G%US1 + sI:S5U?F[gyp[ (Y_D&uyh8HK_;>Q4f/6֪*`TJn#}8#ǝ9_i|_?\ @džG$#i$ |dR@HZ8r+1dGHt[{heK~b93!,|`x^ z g$~sSg0r=詤~" '!3֫8$PU˥ U$Pot0.=-FQ[ڴ+R wzȳ 4ⓉFj΋GȚ-!GDR3)~ZsG "ټhl.LNz*kmF/{K|40${`g&ۣD~D'3 b,RK~l[MVY/E c`r=]hdZ3Ÿlp˘W 5,Kk( nATHuErvpMPhxy9xζu\=#ߩ0p#w|NLk?zKf1ll`.Sr&V>G^ej.ةAX5i$74^GH&}W264{Nd=vcxqmg e,k/#ީ0Gr;lk&˹w]GUv:3?\kM HG@RwScJnFK[\P)KA^~fURK +/,N+Ѫў1KM^OYwZϻ8Mw-y1vJKe-l" kF.K|G.L ,P q )Pi4qEj1hhk;Refj {@bBS #3LB V,mr히kg UjNWrjg-X$׮]Y:v߲ܵ}rQLQ6'4HhOYhitA(.@k>1}}泈6MiW| cߝwOnMqS~}$*IsȳAU?Mu rs񭺿bAY%h!S82sN ŗ@ql;qY[G󏃈>*cI%I H.NϾ FGfU[ Mx4^.;N-6R ;z(QtE\T %8v{pҦm"5zP=U,T4 m xD1XcLKNŎ5_a,9\.&=Z#lkҥT&SR20XJwi:l\ߝP z@XN{",5n䒧[%YM,j>9PзL=L$ԤD_f3-XrSOĐ2ae#xE? Q2L"LCKd:ۨ!zl\wi(iL{2gJ S⦃d+C,W HU)eѮ4F]l܃;Q?"60Uk/d4/UΝoѨAl5JTMcrfha2UZ-ٱ2?Vc|y,P /(&Lȼe|@OKۓEɐ|9g\:;/Jgs}z}TA9䵹*t#( 6jmVY-A2=KOyg=*6q-I(f^K%DxIyO+:_q\}[_yA\ ? hI\^xWq^!PD0E|ְȫg79ùWqX7=^({yGʼn){"XwKsMIm9s&V@)K\~Cdk朼هf\; XipYP !yw>0z*ÀHM5'+ҨeDK{`,wO]rZR31w&ט5fuyu"]cވw(MB9G3<3)R` w)ٿ W'[̘볈Lhq8|hW.6V8jNf09Տl3:"c@?1y$fiE7V{u##փ N#5m8gƝ+7<t\ۘt;8NӷlS,:&zZMb?mXi(]K%`ꏯǮ6GH?~㨛1^72?ru5G4ΊB-J\b W9 bf=S9EM:u hߟ_\Ɔn} VtpJڧ=<MrIQdj2>$rmxhwf~|ݷ<}OLՆ&Ʒ2:`> N?4f\BVB"2#UG?)@Db+ý ke l;# B [,Fh F3r]8}hm5BvܗߦX 7s }nFnЖg)7K"BW'DH9i=ݡդີP÷K9m]ث bVo{}3\t9ty N 2<:ƕH[Y]U踦bۖ̇ynmA =N`^H{aK9>.Li2b?,78^ȃc]J F!@is\ȜcIa܀%oB*YԸp<;j2di_ 2Y?n]z{ޛ=s }0:ֽ뉼  ֘>ש?7?On|IQ+[/0 SشPǔ3:g\^[N*1Rȗ\ P gK RH=A 9\pY-mI+G' /h]f_~;m*}"z#a E.F^Uo=j@ɝ0D`j᪊.s_J7՝~YYz>Tڕw*c̍*lΑm6O@2⛤/.~P9)_g嬇c mܿczsXzg?k8UKe 3d֝v~炏-T-X5y(ӻlxBBQn='g~mXΪsm`o?y^Aigh/lYUQv7mkmz5^o4S#uxguc}̰DyX$S֕~Ճ";ASEZh˱#cLZ:M-U nj=)8Jo==5hڐޒ~R7[zժ2Zln0>NJЬЙŋ7?aZ{Y ëϚjYwqM@ɬFSnEv %+?i5ӏݬ|kxny4i=i 7[̊A:Ht<9_R8CF^oGu[9HkCu{+p~DDdoeP9wLpDy_lGci߁r: BbKZ~۬u=hރoQ .JeZ7PEz./瓋OqT* 2jW- dp>[Y0E{ FNu#y!gUo>L xWP2F<׻;жRa֘ƶXѹ='.cW|iPL+ `&Dlzi1E(Q.Ulϕ, gdHDZa ޕ#"YA,·`H4#俿nInɖEՒ;qn|X,p? 
FD4 N:R[n @4(&d6RarAB96 |=glլfPFm/=GA1J: ))9|V0d΂$G9@cJ9>`* L^ m嚈`\\1!+r >:Rg2TI%Txm_EI(-0(\`XօQƺ;0 Snx5 l,(\tqj(+sA5-OC: m ĩByց&|(F2 X}8-z]Z"M1texP)L(mQpX!n-i#Rc$9t`~#/(6ws5B:k4S9fwIcĢc⅚!v )Pxǃ 2l҃0Hx`!boRuoJUGl;=ir[J-;_ʖ; lN[/ {'ؿ’#.OFp$ FШ6{U.97Yp{e.&.ZJ]E 6Ed'e#.e( !"P&U׋@յ"P7ş"P!(`(m be:F}6 kzlʻJPd=882Ϩ⏉3,xS߆UK[YV)*Rx?IˌOL7;RI $k/ ѨGJ&-Iqyy8Zܐ;ġ'6eTqp1XHMkb'yH%jjRw#ȀHL)]ƬY$8 sce** -Qz]nJ"ӊwl"@pDaU w+ 兹M(R=,:\D$$ZrDuidJ`\Vc{ł$:t <_w'/aՂ@љt_Lnrm4szzSfTŒb)zKB\@ՂJ:1yq6j)#^oݾ~ϡ,"!wZhCd:Y]zz BS~gƓ_ZTwSߗ.+VG1Lo9iW\ "gΑL5 ՘~S%?C\[tWIşNT6j2O'I_N"x%`3zm8Ӛ E{MfmƩV`oLqgEĢiR %aE*'H<ɺ0^H@; iI65_l@Hr㙥c3鸹ýxjzʋfm1_庱kWyzk8 N)?j9 rou1 n־v1ygϛeLK}K$ad>XAI/k+'ἼtM"|1 9֠a1`m2]rx=e 'ֿfbEÙDMEb~nI>q$i-ȃ+C!)ݓ=iuِ2}|KfQޒ(>Ixk@K1l}TE>"rg볈l,"[aMi= 'RH:jů;ؐY3Q"bX-#9Si<7 Ǒ%ZrGMPBFQ\E S,)BsT0 VҀD#K¥c1ߒC x.5DM0cC]Ub,/e HiO>u8j+i|iLq 8>fdY+odq&Ϣ ?, >Ì#'1IG1cx˔vy )wa SJV/nGz= hP UEE[ 1L.`t-_ $O❭g_yiu_4q6IӜ"?M<1('YY3SLݤ%a!_!Q֩9_$gZ7'hRC;MT:H t_UiQfs{T6ϡt7<W{ܺCIP!xxY8U 帓yd_B:U104@6J(SE$O cdhGPP͎$aq& J jĵ{8t' S5,gQ =jYU (0G4D b@pLa Zp,x906&Dj(}Y"6gʷJԥ/acQ&DOD9WfKBcx.{oA:[MIcUg Wb]q-{·cd:uܞF`| =YJuN'Xlkdz24nx5 ,"ee6TĄc9^߬j&}'Jqq[rֆqlk| qޯ5SA"2ϼ"c!gd91\)n4ϯ-w<jcxT,Z̕Gn>8ՆJZu7&-b0U6FcSFR޻FRCKXɰ<~ ϮBX<sR ~pj N;<Yupi%kK$uæ{%noʱ.gR `xֿ,'9RכdYq J׀=&UZ6"U4Saa3EhM( B+ϯ6xp;Ꜽ&ПsXoO^.*G`ő3Ew07~wrs W| /{ۏ/]/.'Zvco v 7'Eavnp"/.MS {o/Fc|& +{~ܢE#YM&N]U.St^xT.dMqLEf~~wP fgy/O'7i1y }7u΋ޕ )l-]Ҝ|(U?}{`2(S0%9;UuW=xo{M̌D4iU"8ONJ·_zs65s-5?h6su-*ߜ_7ٜϛ|U~UE}k?ˀdc+mU(%VW:h.OӰ[m|qݚWo_-v8`fVx zW; c*-$$/GW/.Fo+Jݳf̿$.ϻg@U/CdcU*%܀K~K>q'.ًK꟩K&A U[[C%et𵪿4.LKNg%)o |Y\2]^c*Ҳ %|%輪O\K>q\rŠ_Us<^%Per +k!WsZH,.)J>q6.Ny\FwSF5B]&3} y_2Cޗ̐̐fN2(O*E grQvȃΓ65&eP.uIA9C.[^%zlڔ4/-EU`Z BL\%FNhQ`&ET4꜃v]6m_4H٪_K{P4os h6|]-XMvP[rYh%Y; ۍ& 0D'İör.Y{]@ @F gkoskVjrLBm0%U /ġ[ m3xǵ=tyLt>氾XtS^Z+$6ɱ%w k s$Q2J0+ U5QZݨHKZjmQl"EV$.ȵgWB%P6R8ZigZ+Ւu!- -B'Bz-"0fŔ+ɔ,3YD5 {Md1 H1tÄۓ㛜ˁBX|t/{nCj,{cdq/-S,cY,s:y5)Fъ}&l׳^Hp6\I7lk>:G֏60}tLaMFNBkc"Vg@%y܍Wq{_ ޗMfYoin _}? 9l+gw_WRtx+)6Lӿ=>-8O/2P'N/vs>K/BfBVx1u)9n~:;W6 Y0o~giikR}M{ zOYZDjDKNx'g״ k`Đ7lD`י#b&I%vu6S+jKqWz5 8OBUP-)FIP&#o c3L]{00Ըc]u-:vf(L!R>(Q \iк:ŲoAV+ H5f^m]<{5k!gQ$|bj3E4#1@+Mb+v.+:Hd(%?"ޞQf}-)*IC0tkmtl8Rr0e`EN ed'+A1Af8\K9ML]C\V MX3^ h5s[G3a)Zd(|;ÔF[ΐGiؖ;)QoNh^0Һu{NXka T`uQ%kGP+KiEB`q>*m7w-VW渶)rjTzo7=hrJ6hAsM 类McAXs,3&vm6KN>J'vL*i^+cc۪wNHk ,|kamν4N2ko6|p\;.ԢxR+ΊMRAaKPL'f_nmm ޣ͝6cʤ+Hqˊ'p5 dҦSLY!,JZxwm cZ̾?Aofwm=oxm,)x|n oN;pv~u}'>V,5:X)h;NN $WǧY%6́X ^v=x>6K avkizs$k>?/?W?XScuz{}. (&nB@6Ws屲_(KeoIuzqtY~s'agGYl I8*ޟE WE?:pV`?rF a "uk"g :kp _|¼<ޗ%guY,'/)sĺ0֎YM*I2MF4׹XNN+_CPť>|JVԞmjo(Ҫ"]IK+HŘJe%MJ8WwA:қ vČ?-%}OlSOƨ($oO;;ߒ";O˷z!g,4<ܱS JO1\ R{S0b398vV+O?p֠{Y2r3ȝ0 عv].Q5i4%T;Q FH١qR`mjĚG:Z:"ٺ;foL3QY۠ܧ8P@z6|d6h7 S~kskw`n G|yj^$ggGcm"}P&+51$rp A0^G[k(A>(*f!pm\*#&XqAȥu sE ,@E˱k U‰%AƖ{`4-Z5_xwL5ф~惃F:q7.U%ݼZ*p4RM7>F˿ThlBZnV޳>?gEAJ5#~ن7gͶ5箲^W/?{87i~KcGK4jjivr|E􂀄~ν6s;Ӗ}VXyZ@삳`){!9MΣODY4/ɵb*bpq^=6K_8^\V{vW+w˷0w|(VoU^ى`q/_r80!`,|N~wL@M34G9ölw>9Ӈ_Z/G{q|+ ֖mkmOlui讬N_~n{: 9;>D̑i'Zr~vtVV= I5*ÎH[2ś.ڵxq wfk oxǽgl'[ Ɩޝ6]8z竰}(ڡõV\ؽEeG3cmp O Qck NPh׾F{. @#@>p2*5RJ%< WW^at5EiyE$,h̺R/9};dxorGƩT TyN0]z .tP%EIQvZDBj"Y$Fd)Xh3ڞm<=Fd/ӑWJ+qg(c!%dI٢$Jn&G`0cY|TׯU£oH9n s\p*?}`%$_9ȥaJ#)KL3puNX*#I6 ҥ"oUY<c9yD䠪c1QyX(n$xV~ҴqfRZ kI9vVc)Sq]&䃳pM7kw\Xi3ub]6=(;1%%h>ă[R{"\p"0\$O%t QCi2! 
<CH :%s!3ƘD4QZT4:dSiE/;^О3?h҉{E\'\HJzF+mNf e7pS>plrU>AOIgLZXZXWAٻߦ;KM2x:,dY/6Ǜzl`[|VJu$Ƴ:]L(M~a T.MYYY(hG11OpbsM8ߖqܢ\Oh+^8O(b=Oh?Pp|4Nĵuk}=nP!յ@;, @@춤Ĝ3*XP{:)x(y,X#,eJPǜSf8HBtIsB3!}y[V%kh<%&#w .[:Q\1plv'y9ׄcޥlڼJ-UQ.Te& q8O`bher 3hLR,(\ߞjbNhB_^8iYM<r)(rN5I58K!5 &1}n\ y'!h3ݎJ<8Ҭ>`4=LR̲XĚ$ &<&u[ljF<6hNXH#R4 T 4y@aЗ./hA8g.u BB"%SG;nBB)/RW%K]n嚺ځEK$yE/\DMۀQ`%{Uu!!pm-Sc"dk3b~v_sDʄEOߒݚDn{JFTa H< rmf&Ki-_f!NѴWsFBt%K´=5*\0gU0dQcpș^n lflL׼rԀ!&lyvl_5f VDTTvbwTV÷x'\= k= tp?w'hT.Sb ϡ:>vny$go Yj׌U+w 9C MSG(jc!}r żsx<l+5ԍ}Q/GcK޹Kyup`K3D)>⮴=dx~S\*4re^ti2#(8 doMHkׁt!]ixf^xy82LȡMht{=eqB6S)O["mcIuuB(hC$-ID;=QZ[9'4`z}pk#_=r"ҋpVR=Q/hojkO}2T`Gze3!||ؖ՗TH 9DaXeVJ4KM4Ad )(1*RRhLݲSw5 mOE! tS]hO¥#ҖKBu\.m\d'([kprKmmj|'dRq_?zwlMZ+)~?FJ:`7{婚>%e^m-+u-O橐?$_ eo?|h@UVOEsQNJ|yQ%ݠ(xw C)RN(IriiȌ3qZ^LyAR CREH܊0Eg|/EtOE!fJ!@h|'K!| #>$!UW'nI5wcOpuiO= <Hܞt FCؓ.@n@| (ŠG0R,$b3 >odee離r;Ȣтhᤲ'2= [T/b'lS1OlbTtr 7}D/R@mHX9y&ۇxy>Yg<6]z\Pj;K=0=ԝޫbl=0僂PsNg803l=hlP`&x(CP9P3:<,scz@!,r5퇠5#Ú_0AωDU?}m#ʙh09z1A8F՟2k=Ωxx$ax4bͺL뿲0=#y VviqqaMlmQxqhj4mwF=j !o :_Nʯ?wىw\,~LԤL?#AbվfZ~_3^Ūп9w ϡ'5R %5,:Bk ~Z5k Ĵ'P]`x]Cn0c2$f׸[ ]:CO^=kN<@L`xM`<ʃkkZ\+'ĈaۏfiQ!:<䔳px^EOPҏj~1,] FX]кL Jun qĨsO?jrf7ԮQ[Pv,V-5=+ԺaܑAR$.!Ck&B'I5WRViL*ʋsrTwt9GTvK,+ b#9yh %˨&i,4Uy17$cޣ3'Xq~snO չ<}Vjެ& f+fX앗:]?N:5y=<.ZropBQqAhpt 2XbyP "Z2xC`C7ݔF>Q2ϾpXWo&A慵9+/7?杯оVw׷?4`V&.PTjgS 7>_l_б O^e3ՓlS:S(b~=y7+JS_iq6CDDPOL>0lчlQw]z,^2}oVwc'O"JA!`& ݔ>|c3PBry(O9ŒR:2E;NCN u0[ݾ(e5ncFy Ar4 )פRxڏGɢ[|]oJ)ڧtd^흥[!5ϴnQ< L5gmwXB*FF3H )Ey*59‡3,v tC),3qЁh8e)$Tg ̛<8i a!R1+JAM&"c17?c ,/ IFN&!g;aH. O SfBsc2. %fʤ|FqX\ [$^gsBfLbM=˽eZGn >C(&92bq5Fhjbk (>j?*MjjeyCx4%튌m(/,*4=GRP%0KkmnGE{wdޝvg&ݩ*dr]A-KiPǬrAS$t ~#͖l=1&.;2(t4m9c(^ORRIR$Xj4MFMآ&3^}@[9a)6$2J8 [.u2g9=LLuja`8>Q#Jy(*C? F麴59i)R># *Z!T3N%j/Clw+5Ud6:GbtE",b0nz'nݏWio*z8x9L}بJZuyh!PcpK+)ncϹftIzOu܇YxWK{lcj_<.OV\{_*'﵄:!H i@ںRc`~:Q W աoGCbq uQCCކ[Vkqh VP>ir\{Me0J}jӶG)0|zgR Tt;mVK 6Mզg-v/mZfYXuM6 UF^ 6iM^SJ:cɟfù_|\d]/.^ٔUW m r`q_v˝oPPLj INt񇜏gH?s3ُp+ͽC 3w5 ?5x8ۯ/Wc}R4sPxS?co>'ד-x]oY`|7{Џ1Իjev//i?O?,3`=^`HYQ 𼒑NfB 9%2NH)9M*LT* Kfi%1rE" (iݙ lڸ+hjwI[V:x>X `e`f39 3j>R*%4anNRdvmF݂=xДolrۼ @(ɯ,t ^A0Mo&*zg'ThP9t?a_ +µuqM }q= 2f{pckW#eʍިI4 #BիF8ⰷ…"ht$kT`{BcP/'" X60h*(r;$s@O2ݰ0 N"+oAWޥ-kbt /e+md#R {Ƴbc*ek|%i)|m]t@ld*W*JibLDq%vJ< ;GdsERcRˣ|Z}ep"=eJΔG@Ӫ5$aBfāQ",qpv(=PJ{(ez_;r(eDn& j21L, UΒ̯^\\~4KXKw]»}^Kԇp\^7 ( Ԩ-.n ibf?:MN!~C6SZ1~E͠'B*lcum̭@)]Tkn-k~4 T/#o2ȼMv&Y2(T.`9Lccz;.YXq6Y_` )-R;JKZ@JR4"Rj@J8 nE("X).u;)Pָfxu8O Ic,yvk\!TϾ!rBͳnp4*vH>Q8]gV dS] ¥_WW ׶laFjӉv0`7~v%|=a.rc#.o;fxffyx۩wvVbk@;ml~h~8..)aO3Zqb=bk T 'qOm/Aa|b {l ֱjv)E`mLX\b?&0GM6HAFluw֋~ZQTb8jNF&Wsx[}\ͼ֘|d Gt{Y#hm"o¤rޙqnXCe])ǟ )mV:zat45yIT>#UN=8֯lJCT)89pSs6Nj0x(4w @ݬ6+WݜT'?+WG`!I ϮYQK|]ZaU]fjJ j* w:3N{m߄khN FI`2+#I|ᵣ(ZK*xO lmӛGzGK%bgXR].cX "Z-Hcƣ TqՍ:JͯnQ²uVvV0VY$Ί",hF3<՜zq,[-lu[ DJHӠmA:0A:LN0N1XN:jF~ls 7[eiy9GI9W~ugNp+CIf)nk6u,xw^t~ȓgv()r!\]ϡ;4_u^}Wda2 <<,O ͤT 3"wal _Ȼ2ٰ7. crr9U~gnQ6ؑ䈳8 wŠľw;\Щ&GnтVnMX37"sލsy-*:ty&Tؠݢݚgn16ʕ:g:wqϻI"vŠRZ@Vt4VʙiM4Ʀ$;9{MnzX N;x[N%=^-hUքuL:!)-$kځoĬ~C~6 fpcwHB& ??/_~[~}5'E3ŏ7>3?vL ^OO v'e_n`Byt}~fY~{ЏP=Ȧוu7o6|zogÃu80C7FJFzn7t_+R8"'Rr:-)~fX8$%1rE"6fTMg> lڸ+hjm d Mo-/,P"Ho=P>dh EUl_ϱJ ź0ذM"N.zt:(P2|?[5%X!zJF [,1Q>aQ|Fe0sO3MT-Bu/I˖r1掎9Ppx匔*}&: Y澐Z;ǂ&~JyL^-P$)DR ,gWSӘ C8XfV"S6/pI汔푟(IrD'BrC*3H %PA>z毷T 16]J1d!.' 
/w|x+X=qi~6kuR&49Zh3Jz-D(\jMOu/c'%Bf-Zz@^^E%MM͎h5kp^co'hb[^v[p*_8j';Sr21cXITX&1OS嬑$E&tr+A*K% Ż9Ck/J(ƌ0T6'9qN!8 z8|y\(Ah'kLֱpCB'BBIݦEO -XiѹE(۴!TMB$R 'QSbSkL y43V @ H|me' J $S0!$a8ю1=^&ajc 10h"gm$|K$|D E"B hKTr/S 1Y")8cO=q#.qY!֤0ܢTaiBK⽄>,PIZR>i֧\'?O}A)cJũ2ŊHT[IAIA3RQjϱ W֜=譥 e%8ƃp@''DM[ڦh& +VZ{8팧x }3rħ@a'>X\],C'Tl83Di2]Z7{1f4܈ݺջyDoEQ>ԎŮ:fx7-P+@.Tb>]73+I |c<mq<]$IE 攔xcVhcV_Rb [jbȬ>i((zq˖b4/h@I҆ZhE@5gj5x_(}ȮY[/rQܱ0pf$ȭ1tXc*XMd?*l('~zuE-/nj,5ZQ'a5O&[ZY13ahrfdP3};MF: qI@ ~=8J'+.ǒ*-ӊA:-#(:1*ض#+5't٧f7eew ԯ?~ܾ +HMa4{wzgUQ;k$DJךO[),qp04YM&I` 6DRӔGrƄ^nF,զZU3{?`u1k*AP"QM4[`os s֤^Fn!힩hpʂ#st81Z[m)sbncXvoV~ 4?,^"o0])+;PfT3QX7r$Zd8Sh. eai)3X]YL 9\9Ak"^*-,yz4BJ<6٧cwhm]OM9n@ ֣ʅQuY] *< MġRL^=h F&4sK/bb $Bb5Wg6&,HŴ0Ms*ImDR@JݔWEqnNVU{Vt<U!rx|]vQU&E߈^#7VJ6p$@1(yibf0KHSw 9OtueT.=}%s#a[Qt(0I I? ։(:RrΏfo[ F}bVr*"@!D)*vH i6'jW{rOJIz^i`p"%V3$`le p$Sn#Zzwn6mWw<;R%#qQ;A-Q:BC"eRmWj a[ mm8Mҩ/!.X^9yZH-!]'" l-^&U.dJ -%Y(ղpn}y7-/y2 `҅%me*4&dh',HCd 5%wpmi8ΡVY_kwa_l Y<`ɲf) Xluـ>I tFmxmLaRj Ng@,@9 _[:Ϥ9T l&UX[3@ 6z^)Y) vNhjZ@B; 'Q ̙0X ϳ 9r%x9fn u\(6}dd6x>rg1mio}l6y\pwPV*3Rh,Ͷ_B]2ƝpY 3[" R.qTA_'Bǀh#Ay:di6_f&p";z9^@:G (RQH7RmQm^X;.Î#Fp8tt",pll%XZ^ lmN[5^b(sxP:À2~~3~ 9@zt p$NAtt:$Z BvlҢO6+Y=7DWt=X"4X0b*Yjc !2WI,vR">PgW $%l D6hX~K DX:&ou;cK ,mPljEL2k|ոO9z%F"p\'sYpbZQ ,&S &PY8e1#>msqtqX[H p|vCN!dpW Ewh6JE,2?t!ROn͞%9j(bffn|!d/`?sփk.ڨƭWh DY;OW(בB?`<95֧nӖK](!cYqM_ c$o?zJ\Hj^WztQ3+jTH).\'X25B^rB QS%:3YJU1 )#qT|y5,$AQCĢ!b1J vOt19~QM-e 9&%A<=b1+/yb[<Ӗ9| ,{$}"cnr8}\pm=K{ <ZlX5(Ę_4C@f`--Ej1㜹ߋRǞb J+ 'yZp%מF0,??q?*nV;;)Iz|v*i,nԨVό4ԇ!@>|#($;:3m'Ί܃:E *DFҟƜ"0@E;8sџG+ nKeCr 4Z t؝SvJHKգ K+6Q2g8OȬdH |1lSR-s~5RRw$$BgWؗ }#)R3r~6JIkkzͅ A@0AvM'-V2A6ŕla^H4Nj x ^o.g$2OB{p 1AANv JW@ JX!7T4K8}P22Tvxn7zs*,ϗ+o IÝԛuCPIF1jVߓ>Bl3r |/}nFa4$~D:O[~:rR׎&kOm3?}h7F~UI;԰ǯ~[I%q`K,ojAh]Tbli]oa)smkd"cZ -_OP.K^~xDK,n.@]hgwZ̾rfX\ٸOvzS(B# I$FB|$9"1N zO.ЦSK { .k^S|ύDv2F.I*}}2+\]f2v1 UJFP=5D:q|O.M8("BhM8Dz—LYà{ pKj7>\}&j"] {5F$# 1= TNc` #1p 8 )O! LI?5 SlGMʩ_.})9t4_Dc;i$NyP`Ty`&:p9 '4O'P͔jf ~C$$EbQD'ODAhZ;T,J^SiIvƹGp]YoG+ >.2C K,fƀVB^EM[}dΪ:$+3⋌#+29 $*8*6[BK5\ ki- ;K$eLAM"B EP,/]Ye,3"BN TH?hDDZtٗl Rr v 1% X$ Լ*HN@SXyJ 4wƵr%`58E4a R@6X#""y.;Vˣ8/=%?%JX)9ړ ZgCaV/BD1 Q,0Uu*^~e8DUdY-qlZaZa+oTJ.i8nA,&*jϽgmjZ*/kY˛յphbluQSU׀huz1zńs?Q5 OTB%^eGqcff٦nYRij<&-qgX7<?dz\`:1A݃i`qnA-fhъzvgBOA:G80??mlCB&T-}Nێvv9vKAD W^uݒ n9$䙋hLaLGj7FRyeh ?-h-aW[?Y{7HJ ?[&bJq*hIK6z5q fL5z yj(^ F Sߋ+/d-XCB&T4AU $kZy}KsKBSD;)w(/9؛a\KhHVG-O\ږRoJdJ ϩڕLREQ 08^U&Zs%dx/Qˉs4Jnç=LbAB&T7=Z]&kn<9hVkLhv!!\DdJnRQ̦ FtRh݆"]Z%rH3-2%O1 3_9QӏTsw;^,][T~w5'gԞ>.Du;[ϝBe+\1OA g&>gSçG+%/HH~_G2*`q),"`9T[Q'Ze8# RM2"cC8=V|<`!pSWXqɄFT}f5N[-1%CXlBbɥfjCi ۛFJE``=𮲎=0' J8x ITj [9~AhI0Q5#F33ٍ  >J5:}}M) (UprwHE0S$Y(ez",h-Kw@EqPt8WL.g{. Q̿?O4[bA$hmjKʮl̸0i{4l;+a LbB" 뼦 UU'lN^[A"_BeJNVP]P)ET3Y@Xڅ+˓?-IMQ`92Y-C_6UԮ\~O+JV2 c Bkֻ(++,d@0ޕC4lI*M6 M_.,k"Tow-j%݇5Ǘ}S0z& jz` {8E(NRp?z?\e ; ?bbUz~(@R9 iUjEO(:tx(7H,)ާR Qz)IIUKHIII۟-zY4\jp=H4ɲ v2YZ 4sHnjJ$BܲD VgYBBpNHJ="9("HQA% UHf踊!tC*#8iJ&iZ{iFjXV25CFLZvTrdnt(7qzbauim97a| HRs$g n} G.x{wi*YQw|.؏b?'g{7zqKLT|k70syI4pҊo?mĭ '͒pUK~|}4} QDՓ6f]rm,".zznTfvSp^>TWe2W\\]Ea6-?bQQ^ߙ1L~9_N&z]+LH`~:rIW@J .K# ˫R$-/'1°\Tkz`BR_+sa>/>BCvvqy*s| 2,0{qrͷ<0J3J3%!>,rZ]Zeem _MyA4' T\ N̚ֈ2źq(K 2%;i۽Cͣۗ-?E+CB.`]#":fګ{gs+fus m^^\NX>,/KA5.Nw_vW?XAz1c`/0 z扳:V0"Q=..9A;ib+. t%Y˻l8~zLF@HfV v#֠UG^5hS#.p-o ctlh2BVsrDg1?wŸO!̈ Ukm11S w#mi\}_:S"€&JP L̀UT/\O߿0P,._Ï'ogN/2Ad& pd)KȾiz=\9*9}v·s#$ 9bXaCt^hi%GT,~邢ܯƿs]+\+bпNf`e]ɟf._,EnS"t~ '8pOJP%~R?TIIJ3(ke.; Kn )^ ǽkPrK?wEͻ%._&ogesx~2uWwllJ? 
޵$"eg_ ) gvd.Os@6&LfO5)[jIQ80L]UWW5˟fǿ,gvr?dDRUM~>js;?O3gTǴ_7nQ TEj0x\^;.?+"+hK"0LchO$0(e8sHfRRbi1(;N3d$DRt*Vɗfo 4ƌZx|v1sfYKڞUA4yfO[X—߆+F!m83MW|M.)|?k(P& M4ȦJ[~%R\ōq>=>nLJ*^q(W=KBu96-l@w?z ߾5aJ|gTW#"3^Bq ).J67>_᪉0 V/PNYwNˌCƯ)'kb[nI<,F4re9"ɻI0|&c6xyYQ!Ree=z=bHo>x=>e\THw jGhN RY`2OD ('o}zҨWFn},UP LH9Z K _s.MfՉh. FہSLXs=O9)a7?rJp{#or* zX%ݞ)cJ F\cMI< SĘXΎ0q\0v\!LY,UGZI SQ$$J܊$JhHܷ;{#_j4#E=PU/ap!U= O"FۈFL&i""W Ș xÑ{~[ıL<c! QZ~wNTSEՔ_-dU=JD&U MPOvl?mIH ') H׿P3Op_jE?--X}׺AHAcZQtlq`IR,6 0aݻ|l_)R˨RAXXg?g0 i%\9,V !Ʈl0ͭ.fvx9_ON'^$o׫gǂk>wpD1tBspQgО~F.gT~Th){Ti ע3ș*P%LHp7FdDi8QTh B }.ƴ&<+'G!2;Ve;1FKLuh绥prWCK8iL; <*/w8[~I7TuXهɮBmhgNxܟ֨<{sYS JÞj~.o?jat7~,v@|?N'Op1i)m=A6jǿ[qeݘIMƬU |nx ԋ?&<4I&5TM-E,)29lad1B1\cp̌T8  ϐPX*QƤAIt"mL("XEǘN]CtL$'M3ʡdTlcl{C{㍔M GL0Ir,ᔡ,vu`~qJS^)Kc#xƂ,CT_^Eȗ1q0J$Y̢L*01H,&)IbPiLdpfb=ƪHfJ];s5ŖPLܲݗXok/J N]6*/[ܨHg "ĶI:}g!,@b?Y !>s$e N9YY{)]q3+$0cI;5gTx117bT $W`H00U5*q=!F%9$uVcgy96c:)8G|=}d8҄ҶT1jR~E kFJG rlXռ#g mKWI6V}AE3y{gHOm;βEl^:˶3VPQ1U\:hG-Zu맵πS3$iٕWg.jr8`yggWau8Q!8g$ Zοp: H+)x4G3^i򰽧Q }䡾 @#i/:~] bB| m^Ujv! K;7ݎ7 Ӽ^^pyt%AmU,^>w&UTCOh iTrV߹4) W'ݡcmx՝#C` >c>߇-=1WiF/K@ȡ6auf >J(B 裃d 7G KVyr&R#&^IUIh. U+xxf[uk빲IF6Ք*%*.6Քha@?ErRf]VT 5E-ծUz&#Pk aQUF qӁct㻿j-x'ӻ|%B|0Xy p>3Ek}:vp;t]ގ( $-5`lm1O]dA3nө]ڿXv(/[>;*_uMM `!ײ𵽰.KPmjoDO)n8U;b2"u;cd8pl3ω%#h\=*'e-I |-!Xpcb˭ЩI#|%&MZ6 ЇoY4ouF!f3q N5 lLb121J(eZH0#Tc90B$^5PZk^-T{USTE(Ѭ$ٺNJrv{B;nw"LowW)FּZ;UR=1V\SN7*Bը6^0hn (y+MI2)dyyU5&\5U_QӅ;/n ~Wu"re)I0p23@ bHd$Ah/!{_redʽgc+T#1@|ƅ'[c5q?-B 1lg~{eVfgH7D`G"*Xuqy<_W#g .FڢL\-`L?ǚ_&^%f $F(n>R G 0*yCP1`#xɦAaߟF7J(Vwdm״~9 "3"w29@@teq=M8a^D~;'Bq"~'"5=)BR:stO{ZutjGjGjGjGe-J-J48M8po9*Q1eTDý)+Id .] }PPJ7!?f $d*fiG)$s`D0(,S`Yja$ IL3nIj bqviE%}¬MV9R(|aV:BB"R:@=GU 81B(]PCuA#gvW6VX]qW\η''4& i#`bfnh܉ゑL9}ALW a_bXֶue|0s] ّL{~\_c"@eWFAr}ozx2X`a͞B>Si9Pm5)u_ܻ!?gL*{ u8,4&bRH1#JLk&w^ ]=lGەJpA39 "0%os3]^TwyΞUCn6>'%Q{ƝN h 1$@-šSrm&[OrF9^o;!.(bG@2B))UJ}86ϭnc4; m!q?*{tC-ZƓ]^E Jh‚pj|[¤{=mj~0~\|1kL8B#;{h`pT X22Zڏ" Gj*>/{ `QvzCLH"&1hK#PZi(ig /$G2j(d^[=>p>c3 %){"3~a]zuc{*#2r{*#2*穀ceFTqe-2X1ʘdY+knH luC'&^+,~, . jdh}H$.Ւ7>='8"ꄫ/~ݢW#Jɜnjb>f11Px5pN9ÈkOS`TxǰB9A %J>tF#,Q0k,s̱Nl 52#XxrFi(9RǪko\X]O[͞z]{땠uLVt(%罥Cr4gXcTs(V@b#}%wdM5J5Fq41ܱMW0xwfV}e$VLF)gܾ}e8:Wf;`\b5 agv).Ln1eƁf43 뤦Z ,Pf4w NnFSGqr0]Dzs6wQ |G]8u*@'< 'es:! b* [2E=r&GMj]4|K;x6!ApBHP4 N XYvfU4`GS|E!?0 PXvV:nt3iBrỵ ]¬@( @Y00rCgjAeGjTm75) ~aS4Ib7gM`J^ظ msTC;+Tty%|Ct5;dCA4ٚ PAa bd5KS!LmP$j_w| Gr'&QC+sX`YK |P y&BEN%9(r ւZ(̟ZͿ`O*+ 5X|u0U bB29CTҘ7 ɉ׮X[R#)hnLPZsLRM|uK .cm𰜂; $ga1$Xr~!M`$<<3 Zxe\si1N+鳂Xl RK=ss f.V~D552 aى'%)K앉!DKv(5)˜(+rD2EA$$d};5NPU:`ԵUOAs;Rww14Gt$c:z'"H(vAڣ$0uVV[hS$ۊO\9#Fѣu3j;}ѼͶ?fBxU.E6L4`EVE+pZz-#険}6bIE+2hޕ.">ź|yeofK.C 'xڔ+ܿ,#Ⱦ$|2b$VDM8Cn%_^yo*fsI/*T]xݫ uT(:Fz3ɓZU$b; Q8qx!-]{CQAcǪ" 5!@qx)_fބuv=h]|SkMƓ|?"D+nqxtY..>/7o1C7 dp7'w/q>#晝M@%E|P{yQlV8`R!{4.cRL1ɳ% b&ҮGl85?rXKlʂ$MB#Ǜfs^(bvY\@0xOf'ˌPg1$l2n$ #nmT8N2"Թ\9&ajrbN3 VRUzS-m%aCB=Z*_Kjh(ç1{O:tŤF%6_}{:/嫥zi3h09tW\ #@vqQe<[Ve<oLp ŎQ6u4a/kl93g^6W&M/>ip^3Çk\1ICxgoyZqղ!gt3ɞ?$>>$ p`3ܡ_ǓIf9dfeX rҸB6z4q{n٠lA5$ 60_`T~\.l5ǛVڿ&O Vx=p"c80Y^WTr_>l"QK!{߱RGz]icLSAX#V)z vFkᵲW ˳Dzm NlR S*źETI)<ćя!6)ܝi3xXhLfA7h>nWIqEX<ڻ`hW@(kXͺbO^c!(U>}8InyۍHIzq`\cM 31y#q.Ez3"8V-,*x{O:疋x5SC0K_಺_&~_m[GSS6ՔeacV_蜷.0cw_fٍ[]P.ont}yJ[c bUV?ޭm,F)>h:ƊЛr? #XaD ^;F>Uw{ܺihڌNlEanݕwG]EI4䅫u*ѧr^>:hѡ? N'SxêIlƲ8YzrboRư{Ky#K8(&豮K(eMS4jO#Q4%8_]5!qQv&F+E'"4o'Dzy3n>Yg_|2ϣz-ifP6T%sg8d=Eƽd|+?.j1#nO4GTɴ6cZK28*eۧ*Ʃ8 Bޥ! 
jdBis¨NA8t^NiaФ;ĥToեMMUu} HWFo+ 77&Rʋl2-JSx2žJ;n&9H 'r,҂X֏k:rNTkR5M1H)j:Fcj.WAt]|I lCC^>^bSvp:̈!eVG |]H[RKx_O'TnRY: +$g,~a}MD"KF4`ato\Y]?xG0vf>_^_?BU![H(T6}ײ(g{t/dKѦ?'ͫ8U|xR7`P!Q.'J* 3gs f$rh*ژ4:;/=>2gnhGWmB8D۽hl@F?Ş#Hmk"G>N\ctWZ4Kse9#MRWdt"(in1#(#{~ q}.V{- JCu-)s3mC ˘W2 _"(e"F;td}$zIbd}b<ܪw|IT'?' b'[ ^PU Rkܑ _].& ՏJf=${uZZ`(J}fYHPD:'ɖEDu#+cY5e'@mvD)++N#3n1).1SZXzt] "8O. j'xƯ_fiFVyx0a($wEFHoi1qX },+ 6|h:Лh2YRvA7ѪY܄hʓڂڢ/yߨC$r/~L-{ީc;#W0Vzmuo}.lʾtkk+V@Hﮕ}YPQt`>Oq;+.",a)޻"E8 ٻdWyKl fhV GB)A Tpc9VbϨC$kt[%bw AKz`cKBF]d ~FY+E\._Y+9e&$[ G/-U@ƀ2^C¢/J=$/Jef/JpTBJ;T^x{QB yq^f7[ 6kRz/ [ϖgԪ([XS-+FlWwX5F+\|>n7TS3 Wnw]p+9I42LIS4 A уU()'͔`bHpA2 2B@Z}^*2@i iI|dPU^q!>_, yIH i((ؠ5 &@<@ˌVHy54G:s)C 8 yM|DP+)8y U TM6t%Si_: `w(kS'wBjq&5,(~IX2! FeL$P3zcQ;KSĂ@)y8wT6zKлuieDY%ԊgRXI`K854*|fuZc ZրDSu\)&>-`Qj!8_M%fZ3!B ^7/h\+#Jk͉!`*f> 6wd \ 9BPRpXA~6kx s|6U5rf~) |M_As7` _ȑ T \H KIʣ((ER2?,ߺŤD;q1Xͼ|FӃd(:"jB4E HitrB0ch%r/Db%1e/X? QSzC8+CA`upHF<= $r (^ǰB^o8T ŒS*Fc"Dz/"J ox0 z|>K`zH<ϑx 22sˢ6Y)28a*쩠}[^iዐ7I =XsQ-^wR2K#h9 #϶_`VW^'ʦKc<EpSLtQi&9MI9fD+$V.ĩNW+|NRhcI$}e,+_$:h`.gЇj=7Ϧ0/ /qyOV.-p&魃AgT eʼ)|ȬH:feukV`ŃfpKu6˙T4=dN&iU6U7 M;)wٺd݃Dwk0ֶ߯`;[楐n+eyӫA2MV͓ f ŵ5u2'nW˩C惡Wja?ut_kq W3.?.MIZ Tw  Y$e"Ds'4g+9 (s`@Z TI X7YfKLֽĘ 5?c#fE c2Eg(/)|2Æj<ϘXfpe""/d W`̃^•h~2yLգW >>66췛Mէ‚KC`VjMާf!!N?N!hVVZxM})_̉o~)7JM²<1AX&ֿwu g&e8ufRo VڊX\>%=+󹨹M%_صJ!}ਗ0_:4>‹1jW'<+d^v4:P L_˛ŗdugnFUϻ_n[[%/l1.7&?=8/wgoO`pdg<şFgX}'}svOÜm3m朤ȣsx|q"eߦ`q[p:'oa$HۓpP~U{+چ\7F,6 ­Ij-+no.5RocV[Ocv&{cɶFhs⍾dF_g@TS-Nn7cڅћѯtGk~ȥ{!! zd)"'8j1A w/]rލ}$ mvUC$nmFWP 8'(nsǮ\ўF}cGY&}URPjGt p%|U9dIIK cJR.ax[JhLStK$y_ʒ D-0&y)?ƼMnj.HQ91o9G!ŕar|ֲ<&f=+JYX.rM)W*.EW&c̯rvSZB)YcUT4(-U 4n:9$X4f<oS4&L!O{a[?x!|MFaVP$J*,EyVѬ)].6иiVh@~U./9=ڿ" T'@ ݫ_ߟdžE:A aWiǰRkvjm~bb9fDl + >aǨ3Պ1F!_5Q#/儒 ZXzǠ˦lwrY} {F& 0՝["8]rȾ9L s%h<]N|̿{ID,vɼw{Tۛ0G{yD|yR˼ ezFDDѬ:ISr4}}Mf{$HNCe/@DŽ"dmΩ@ :I=9:Kh!mt'M _Fu.CCgv|u׬_vOPwSxj~[ 7l; `q/n{ yrJJʈ, a91D@dFw&tT\8sjn&{ @^z2nԹZ1GNQ2H7i-$ oRt<86g97 ~:_,>? 7ۜl_7?gO8`}{֬ |=+¯cu9ixc:!00 SbQj!$fZ3!jJP.BHF-0!WJ29ϴj%I+({mWbK"G)I"A$-^yb #<Y!]-k B$H`AB;G@ll$N26Lj  j. D 1gU`qVcgpX˰("\ ){ڝ,~iwq?ؓ]'%U)u]5N a@ mgǍP7g6Lda#K klAKi(;WG]ۙv(tv"jbEŊ]]J\]* .I}ib]Hr9L80uvkkI.֊_] ŹD]]LjSDѵ1|v1 WkS$nS: 38e$Tb! R(#LxJ UK+;IIbV)]]L1QjSŘ㫵 Wk>v1NHk>vҺcj,Wۻ6Feʸldb|yx7w&?PnʏNdLs_c1 %hBsRw +);}ӝ>ލswPiZpteӭ&lT5]v9b6]g߯K:YVKm V-YcI5P|_ldcMh?E^Cy^G{Zhp#I+n:WZ- @*]6D([(=md$}~/&~)%_ Z\m썍foBaB42l|:TPԬS=f`HP\) W1y0`/u#: $~DPe`ޖ-`v ):H~ :0~(V"抈=Fkh!//}/~Nιzf^+ +:v%hC\HS{ ZP_f}+WI؜L|slkA=@7:d!"e ɢ˄1^vmr&`JCFcl[LT -d0>qxMBZ5grc \埁^kB1љ9ZN (AjcA 04`{8星DǏ/,>3km)䘅Tr̲ܣ194An]9=19F5aGcB6y=$nQ1+$=m17 t۳2Ǭm17 b2Ǭm17 1y|9fX星DYm#1kucns̍jbrscnP]]#icns̍jr= m)9&1üIs s cF6昛)=3'6Ɏ/)c6r*5r0͗q8e/OZ]qXe0d߼geO! 
@nrs ϙ: gNhP*[CW^`lĚ 4%Rŀ5ĨY'H *RV@ i8 IEYK|v3xW+&^o;/Ԥ.{l~Bp9e`)l.DHjaH KX$IxQ!.5kYB8O_]c~n5a]}6?ΰ8)E3H㲹$P+RVhqwtɫօTr$/=h> [ ^1NBP(:a8R&hvm܏B  Q0WN *JF"EʺFY s2PL]Vc傼00ƹ0Mh |P;X朡o {%,d`-SeddQa)&k1!Fg&!-Q ًy6KEv3^^Iݠ߃ߺL/x9B-~^ek8&."-d>T땙|i0YS3 #Kq؂0Ul5pmn)opV,v.pӐ<.;8{ Ԟk qܙ$S???O\L!Φ}O&_\Iz\LϓȦnݼ`3n%1{lw7r;_N#KG.(k]Mŷ)}ҹvp7d_c͵@I=k3 yA@X SI'/%L ^[CLgYEżSDܼɑtl\5HFI B0۠01"`FZ."R-XFe1‘+e[<c^R*+6$3$vFTkQ<5O= )3y.Uӿyw<>D}>Uz]2R]yV#v ^ [Ul!Ę15T*q[jJ)%Qd1upJ煲j0YEXW)Ŕh`%4nN+N&FCH8%q86x0F{A<ٹF6~#*&]Z2L n1mV"E=0wn};<նmî xR4{ަ{Wkv* 43~ܑی% ^ͦa<Qps,-׊Sj?f?@пқ`v33` NB}7ٻN mi=P/xːYѻ:`ɮ Vtzr|7_%* 5E]st{ӽkν˿ƳÈSZ=3Q68{KvN* ;_5*ԢrbyGM\/DLVd;Zc`bjUZEIuص:Ֆga:HȨ򑸐vjQjL.ET-CQp+na B "n.*~}HCJ]PCrY 2!fj.`~b:KQx5wW Y#.&:.us~^~&L&ː"&]2tM:=)srvP G%ntHA5׽sp^g(x Kko"|p2*Bᚥ~N6pX9qk2:u*DC|;֏j]^Z@Mف!^+?'vh%q5N2vQ?Z.Bìzhܛ̥jBзgFv.5"UbG\l^9 mpopտ}r[^΃q"} _rc[HJCMJzTDDRY,@T&X$P}P15RMR<%%I-E:`I (d; \XE9!kpĖ[``V fʟR\W֔?o YVJlYiZR-ចERIF%1,Vc"( E EX|E6@#D|UtEuaKgYLڭ<2`ιp+MNGd3:-UrԚf#l1_6!Qbp{ Wo0^9<@Z-<>9zg|J н, rP[[HdU]uEڋ~>{97cG=W%1z,FvIlQ|P|6kR/IvN)>4|]|w].޺c,FHՖoJ&8tEIŞL|2fH+vkj Nc7-5h[Ul)Ț%ގZƴ"/h Bohk[\2"9͵2|뒬拿F-eiNobfKsӹgZOr~H%y(Ne-9{7K)Շ4 |kӭ4jpZvrvS2U}.~1Ě "zW Q1[>Og#rgn\Gt$Gzd8pDJvL1e!wV~fQ}e$v1 $UXZ,_T˽p,Uwg O[v]gk(ʥ8~EH??+4k,KB8F߳F@vϛ^,ikc|Iwr0 {~ßWbz,Yӳ.atp 0xN Y9!J)RL:(,3F`s_`boOY= xYraQU@mZ3KC`Wb9ft¸@ޕc")31頗;/ ,.ndˑ3$Y%dTҍ^óm+KAJIo HRL`oJ鰡O.HcEdF9#a!x+`f9:'Tې#5o"e܂)U Bn׫2ܬ2F|0̒! TWUH_6leLEǢ'2D XR}&, ;bi:sGq!"j$7j:[˗|"UŅŔ~ntإ(o6y"We[ʅ-o=NxaLb5/iQ!*vH]\/2=׵)& آ7eT7 %[R1+&i8B7MVތժG;5CݛR-yԛ:R'M 7@]&eM6'Z$c"Hg\v%cvA0,ˎFsPތ"f!0ikؙl3N:lziHֻʫgmì|~$ cܟ@O00bI)Ed&¹ pjlR޷ЖBE S@ kC)ZB 2hǕP:#*/d } z)oμ{<+ *i;`hg6#8Gsϸ,cBf a,!n3OW)/X>yOןH:Kb;^7MxsxARfMva}LNV권n˼~`7<+[&eN37Dzv > mjNfJ :4}eVr*'<7C;#|}IbzOꘅŭCGg*igkj;w0 QgͪAk x, f9}-y A-"LȎf c0,&%Ӏ ]Hi\pT5܎Lb=GO$OBTvE$=A[3Ub e֥[vS=woK9ttɅ"$s89DV9ɍ[gJXƔʰ]-(&޿x7E2Du *9a yLP|^(H9H3iq7@`~DcIiƞr(Xv\Ÿ*=W1*=e y"J87 Iȫ͡9Ur,L2zX)b٦/jtlӗ-ǔrչޛc/.sɉB/TyDL0EtuxюML*n`7SaA)pY<1܇@1S{P6D).)+Q),72%4I5)2v&@8`5PSlhʻ哇wܔ#)Sna"C0X9taKc@gcG }0M.j2Zu!6*#bHb÷8+!M[O&jSeuDBX8ZW[]XѦQǨDQ,r: RRpx1 f&(G,VFZ Fuc0+1O_A!, h' xf\tZ0̾%nZESʀ#%X*q^4ȁJ `],É(RG׌o1Yl0(9*jS̊DDTTX +]ebpF߈%(Q b9=<.P)gۊ6!Ì rJjq<&|qч% pC_ )̗ZGGI%%|Zx`),Glڤx S§Ot~k* [5j8wLR`Ɂ>gI'5b^=yw?8aq8q :h+ Rn Mhؤ`t$HbL RXP^sNGh++7)N@qK=8o#1i…8{d1=c3E}yg0QC<4m) @>b!(@\` B @cYiEbSBB@4cqA  "~j<V<[:*`;8?_6@Uv,z$!yvX5@SDRQ;bw!}-v!Z@Yt9v^` &X*i`XTG3 <ù26Q3-\NR<(x„r8Xb"C% )3#ᄈ^7uyj )Ou{&z␳SXopnj ]ӫ#bTkubMswq=vu,?/Hb~3vV=` \DraÇ͗k{~!4 2N_Ӵ n5smE`"d!MFe(HC8KR^_yE̴HmTZ-Sjr vX+ >}|WF$c~z-8sv* ?~ڪ=Y*m $T z/4QݔkxZ/#%~_FJ!} IO_5F6*N8d*Orr in9{ֹ˙gPMQX## Jkk >"U.e'iF5F|Dll}Tnn26ds+#F:nv׭?V>xp.L,AV<Ѐ$,b4Fa9fgșL1gͰמrL1WI'`-e<=5X̛,ѝYGY/G1oڮ/>@HUnMa.V6먠= ۛ./K`D& ڔntB O]">u+^k:5S> 93ț\)D9+7$Ds.s/mEtghuT U@@pC' hɯ,οV/p~z/wޮ:Tki7vU]WAbѪl"/ŭH5:Hs~Y2bx},,N{~oDq)zS5ģ NqѢ1.ϴ%,75ϺH)86Y;LO 5pXBL.΀.6CهO<R*ZZӲwR ¿,,c L,';Y0-e4@bnqlǼ}_rdnot/YU%쳏#\ϑ<2 @6Lkww;dP,ݔ@d81Ji2a;6:qR^r2ǸuR01r) G5ui9huC/_2큵AR.aAa@Tک. 4ꌑQ[;>8XD6 JvG"5WBOz,2IAUw>۽/6S%ga=i0`ε39*{MZCCN!Ai{rFj^vs{1us۬S%W*=HI*5HjDQHJxFݍFgiĬ!GF =j Y$wb/ʎK rݰ2+$ʅFr2 2,p.ZiKF ]7R<ۜÙ6Fh ca߭BW@;Q]f:No/Y6m܎ގ~?Qng;Gw1Ͽ_Wnb(w tvV6k},~+(nWrWxKTU5mڏL*)0#>yCpGUtC TQҕJCR/x˦,.6ԟدܷA̵Ac]L/cߔb:ng٣W`Fr N0~WKr0):3S}+]| fts}}BY-_|\ºPŷmϱlo 4"Z* 9-҄&4ĥ '4aω8!JpVTF [J3#P^ °X _1;QS- L f-|9'F]f )#0fE1SXQ2KĆb 3e23 47H-yxŸ]hV@F?=!P5RO1鵦hVZkXƦή<>W(.mv\E.bŃWۀOJ4ŹV"a[.58W\(s7arD/܁TBl@p`5JBЈ 6S1f (RrC^MB4mecfcƵh l)RQ! 8FD20bq@ E v,հL*$2e bT([4#MfM!jIDLzڹw c&CeY6Xh A)Et<kWg+%ѹ!pL +wuOv:OSפJVۤ:F?= _)>,yK2`@|V*;nj_\u 4s,Ag}wd:58 qFاonEJ6Ŗq֛t6_T =3=xÚ$`(MqרZ l'`9 I/ 14&pP.V; KAzDX`L@)Ax; {o-r G0tn;GT$f/mYE@;03^3{Pl!6bo{˗ Z/0zw BBe}Oeu. 
ge.8 gde۔N_UOo/꫺b"`/r.i%ݏwU1֯,1K^^{O׳juz?3 .GF#򉮐6LJOYɱ$Hju $_gs{fυpHR%0%cnϛD_̿Eq-}{N1+_\Ƽ]$LUC "hTJeN@70_3&Bt3ð;Q']E> x*w8JA En~K1ݽqG㤵7>y:v h뽐w%Î,%48'uiLU߻dϟW.'y< c3pt л&e2O hHd1F&.t;(,5)y@P·+}#3^B2ǫ.Gd@ӛN˝6ko[kP\0z<_X$Qj P{罊Rh.xǨgٿD2;˾%a 3nHPyPgZSɈK`$X}ԅaRe%o{ ʮEEa{.=gAӱIo62%=m'SJqgzIj.M&aȪDgEhe܊F6;5%f@exsL;;l<%LC#QyTгvʌݸ̂.i| URCf=q?;o~DǨ&OTsRbH)|u ׭rٸe=4"a?q KO\~L7}1$-RM.(T4E ^00d8%dZ81Z ;QokG#u᫯>}u|b~Z-á1 Q)kO8qp*B l+R:OZZ- tNe:b'oo=6,yE2m6C]Huj nJqB;z!]>|ֻTPYFȢ4P̊/5 !Ef% ڔE(5NCvMx(9 =- )?z$.^cX0na)j4Ƅ9,/z$Ȅ26L6&c 9%8ž葥! CdGjSvӘ)9kwOTB`:t_C,ӪU%(Ps] }IG`kLs'v@.`D!7*+wN< ,ZeH-5 "6rs|4(Y}qfkal֊ m|=xYbh0o}ؼe#f|4LWE$>QÒVSwM$-! x>=7pDZf5maAa|y l.7&э7JBµymׁ~ЂJո0Xs[0hJONo 3&A TMDY˭-%f j!W(a!QXnၲ!*~lqo%EѕះeL=Yr}a!yߦ"T 04ԅRs& |nS-YUAB.ގ~?l0YoZf?FNj%_wQ veDŽK7x~og ͏9LqCv( ybnuM/xy⣟J{!Mu!\Et[nLcn|9z}',#iG0f 8$WFېgf]k)HzWݍBh%YmNT[#D@%*eÉRV郟[Tr`޻|ֱ_TxφnN{`|5p]3iԓๆS&]f,f6 q]Xx)bXނvެ*Q ]*?鹺#z\!gr%y "] *LT9L &ѿ?1†c֑k"{kqWа1Ϫ@ E&Le,aL$UN()p+UTd 9ن|.7a(?Q>%?_cn ːO%5ׇK+3 HD)2y" Az/z< IdKd%Րhq6 -G=DBp"G]'-bsJӊsYJb ~u 炓'0wz^}!2Oq**}msOiV)(wCݺiom<[E|xp:l-BX?,~Tj'N0>(14OqQ?JGܵ^ϝ~7\,hYp;x$W0y,~qng_6|V&.l%j~O-kmW 53|XVTBmv?FC v,n@ܻ!ֹ!QSƾ^Da.@PM`":ҁL׌12P:uIn.BUډqZP9m9 ޶ h 㱔Gdf^PJ$GpYs*>"(Rfېe>|㑭 5!]o孌W >Mp+(Vfe~3,\fpD Yc2okV>letokxOOƸKsZ+|/6;1kqn>gDv:)t>B­&C)٫iF\z|M m.;-[y}b拵7^󩄺EfIN7_|Vb2),4M\40М<#9V1#p(DwL0 Y`KKM0FyX# ]h 蛧pwt]{ӪK4*-Z -f"` +)(I)+촋q!i6fddle+CM:;t-릖2BOh)E ChM 7 zӝӥGh@zyD?>ݟ͜sx[|>i_C߾彜ό=i4'pq!\yI_gV;[l] 7ggl?w Ni_tzz}; ~Nu+s[eS4 H;cø:&$e&N(c 0E1*0Z,>碣W*9)YGKI,ʖ녙C%Z βI"I%ciHNQD֪ )[ZPbXxm0Bƌ5@v~>*+QV ã" *DK6pb ɘXڳ3`Z,1uTf?R}ґ,Õ d6I@FIe ijM̙=L((~BIg硞D_.,+ syćaFbsM :8IDNYsW ڨt2ڊdNb fxSѮ*Q"LRrFeА%Ҩ%ޮ5 3Br[~}@dq[0g6 `'6ؐJVG&}d_\SNNˬ9Wα=9Z5-*㙭.B =ASPȡ+34^KRDq:`vo9q}[[9`7Uj%-L~fV.Ιkf\٬IU,4Oibph[c;+pU^ie[P:=۰SPRs7}@([]/~c >?bw3}{*D 8~nM렳]}?聸Ok8>PrѮ>hvgT_nazVn&A[H$SXoKu\Zq?IqU8ZV7!+]y)},#`RU (,X5ABEI'̖U_^X'K| -AGoUJZeHrI@dpJ䘝,6 *Ȕ6h+|4fzBv6V&^0]qԿ`Z8F\fII!3b'(I?^4@.AƨM% H~/_v1bBc_KBArRf.94هRȔ 5&F dO#c$d" T2ywzAvk?OƑNCHP5h Է7z ?T5R={`k'MՇQm|Y{O9.j/^T;}'GNg7N^ܰ<_N""Mcic#z$\?/QRjEƶj#o^~ñ#"symZ m]yZ*C?Ә~1gD];[+_x|&$It~HF኏~!?A9:/cz)1xuCV+{|w"̈́\$}^EZzզϘhCJas[nb|EVyg+|~qόeU}I~~A zk6(Qy>3V<>T*1}?ef_,dv>5$hm~-\ޜ ߾ =| !ֽ?t Z‡g|qGq WնĀ5b(o_>v |Y$^ "!pۺ2iix\vy;d]l'r &E rpW4 ׻/w{kـD7FPFo6<%Kan7,0VLnB;r/:w>Kʎ$M:kH7ptn8Skla/G/" 0-MרQh)޺Bݺh½f XYce?Yݝ/zJA@;4)Qx,H~ @#R4xٽ; 7Kѱ@N4msRӄҲ[Ia /𩎇`JN4ʺ?sr")s)"3Qk/m &Q[4 UpS *)W+^Y+:Zx?ZkoSC.3_$kyywڊh~vP$\gVT0,xLV+A] V2SV,+4jbUF^3vݘuKFS1, ?e&m3M ^F7:c2Sz`fJ[GaJw;`V)@xonoHYo-5& Rb̸xZG{-378c!ѴyU7׾]|-my^*͎ȇ5T(*ywGÈv5DH nW6D_ƽw,(߀1C uX\'"*.KDPLP'lpAצNT/J.P.˱(&JY_^f5,S:Y}[]|!\Ո }͏蝏;HW?bco1[H omh\(A5Vnq<c0ًoGDNfӌk-rI|~j#}4L #a5 zӂz7at!ݿ0{k8%eJΡS< Sb0‚Pe;־@HkwFKaݹ',kBm9I>_Ls{PU>]%8.ye Xr] sN,^=ļ/̶50s]+YwVsQOJ{Ὣ|ye~c|Jk!imtlcX rkG^L|}sƔXPxYEdzOYS~eM&l,kNݱ˺2Ou=o.À)3_;oEPmS_O•zbu]"lFlxڈ28TguCmo/Tj V2u}. k1܅D%F;9!Ťx6{?b'@DP|yR}HVt GƩ4;(iFLe]Mrܠb!):(dE'1ʜqCݦv VJZGXPB,Šd* ;z 9$ÞD?&`u&vP9b.֌Հ8l<@X3>kw Nigm@t3;+1mӀOǴbb{TIeĠ <ɾ짇ךiʿ7ifvEI;[ǀe۬!f/vmE0뵠?OnϮ 2LFpxDs;q?uw!E*ȆEkI]RN×ly ]v) ЮOFIIqm^Dz:Y [ +Sw`;ݞMOم7;yV`U؄䝖%KD"=hY}͕Zzc7-ZIK1E0"BPGwHF&<$3FF!-B(%3$[I_RpO:9eS* y[(*o*}CJI  )KFvm^P )}N  tLNHޒrkI lTuJi`LSN D<BƓA%ll3 [LXQĸ2}Y2 SN'V3_kf6јX""4z#tIerct?,>xq( +t=ʁR`U Dhwh^4ْ=?wN9浄f\:hL9aʂCZhg韇1eEk"s~1z˛j {VN-#efi?7na]nu?ҁlZ}'pq c,f: /w7V/t]PZ>WзNӟLuKB ).^y}IцLEs1a5DPJS`@WS՘"Ĥ D pa b)H;srfH+"HJnjphq-ƒ!] &*A`%75A ͬHK.+hSb .IEPf_2TyŬ}4.,wL\]Ӳ G 2|{ᓵS2[xªϯժb)v?+^\_; a7ouPζpeŔáFq㥔x%iO8zJtEFɀ-,Zς=Woc7RP $ckXRp8C4pP zG$0tn:YS{` d-.#p+A;x̒Tg;ZM8ڳtnpF{xq@С.-U g`ti*GC? 
\T}-REA\S6@ȕ|nVK #Xq5aZui9i(`ź0\;!ȐJjU Sx<84B=i(u1̦Ez¤F?_."BYMDYž)<6xVr0q|]K^4It2v:M ^;Æ"8e"䜭Eiu%5zkGq$>pQ_.ԔQp5"/ŎT8BEZF\4ۛ$.c "i}9XaX,+\QMQaqL#3LG+H#QҨTfԄ+?JR(oC\j,Bv; nHDD_Pk# `~ղraҸ?qhM[-r SZOd'~l=Hqo"k'zV ,*.ʁV`{v0}UÍZj5 ;X[U5S)~*tksڋ|tӃsq4谐Qc'/ HG 0 $t^HR V  G{*Uܠu r:wU)Y0)'WӋ3_ijs5 sw\ړaя}[,P~G7w?9`e>},&r|V'@q(r4'kQ(Bx\7常qgkY]g"u|-{׭=ŚZc=*k;fRʱbR &ˍ07/'w9_Zμ5y8l_w>, ;VLP!!u:!I} 91kkP X쟒}ٺP !afRƳKr9MRYkZK-8TCr1kUk)t#Afb)Od.$䍋hklQ}ƬI-Gf-#źRn}"[E4GXɡ:țMjѺb#:cXmzZ n]HцLE^4d} @U{OUyöOLj]t?`hD q㢏 LB MA5x?8 Ak\5 [xKzξDdsdj?hChyrpI{(MWv"JE W a miJKL.>9nZZF=<\srhG=/ۧUs܇L 3H}؛ doϯ̵X$1]\p/×9Yd-_.#? )^ {~V5J~}'pGZeD&Kx*XLP>tY w< Ԫ !qw'/|nB#[^;eLz9ۆ5ia_6!Õq3^)k*tFnf.xH3fԃ@`R1u{Q~x鑵 cmX:fG ^V1<xx\[! |[:ܫ (ϷۏR;E0F]q<j !/4E";`4aFP/ dM4ivjX_G?f݁X]^D,NѨR*"P:Z-,j3;0oUl:EB_ڷ^'w2Mk$QnN8)S=xkHa:ڱį_b1]%"ԒBk"rk"T @:TP pB΃_.{iPԹ2udiK'TuJflﯧ-J~:eQ@D{$ wp9Z sm6[xP#Tv\k9(PV?bw't54f_:H) FvJ%eX|Kf 'I1P2ěӊxbwW< HbGcH7P:XK?I<JMĜ ;4_ώiku$8 X)?_I%ݨFkFI)3N26PږN g# D@s3x*pePYoe8?fY/]EvT/ILP OU Gu?v`j߆W5 GfvlP(pf"8w'OUے޳RbNJ ի ?ᗾ[ s"[J+/K5bN Se1t;1Ā0YgU87dF'DjkA#*ߞzSyuy OEUkQMx/oU|W'/BTPcgcXEwm~Iʼ_LfE`/QUdYj˲,ɗE{KU%Xzt5++8`o(IMDψiJ.lfD&rƅ.,:b΅ƿ6nD_-phlcd7qTfܖ4N)qFe3e "JYHA-U. {[XGw }kX*ê'+8Y^4ՠoY@X\n|KSx|׎#M_ScP|'%Z5H@KjsöPADIƇL0s̈~$iS4Ճ*r+~?lpf0!2A {+ _+ު7N01ֳ+#HШI ;XZq3ˬqL3KVdv"݌OښT봶F?^Zjrڰpd[q2꧓xx=%`TRL_X4{$7P]j[r89]ՠ "H9Ih%%J|lߟsX#;;q3y#LJOI]rJC4`hBm@qN^}NJ^[.0vTԖk?Z)k|~RȬ5ͧµ6?z9qP'0_erIp>-ᡠ5ה 6JiZ8Bbir2ߚf%SC_SUVzTK{3%;@ ѡ6v/?;/IMLw>kƌTAj`V1W8=aŁ?` +Xbע<埪4UŚik\}ڏhV;/?޻E%FaN+u7&}j?6eJÕ)__$q+;ߢA{J|[fj3J5huBh#gxNkllޮtcηhО߿F }H5huBh#iEƓI [;wEjԐpطbJI~~>U|e~,[>_]?e8^}] bܗt(vyMȽx4kNk!jf}^I_1.- ˊ@*7fpMѝ\n7L3a}>uOOn@[fxr| +~-Oׄ!5$2z}Og v:SW,_2Y]Zu2}{E6l򮪮Z\w|Y.K5C+J g%TMRO^Ogx,WJAMt OO^hRўfLܯ؎iU7KUi{ߗb3֋]bq4rKYcqAjN ,1enZ$q"㠨Š:5ϵWZy29/ W !qbb[|'R`F)7].(24^}pخDzy5YhXJٿZ|K ;̢|`!;Uddpq%d@[`ח<~:wQHR4KE*ݞ눯; Xs a[>>]ubEv@5֎F&+3 p#-ߗfFH8dۭ=dU>f; FDaZ i3NX.ZN3DȗL D'H-0ea=vu˩ NXNie<\KNMcrԈke"$܂Nk!S`'NjwD]/ϨgeԜ9i.άB\ eBв6M_!R#'Nܑw9Ey Ky ),)38:2Ɔn.M&qn . > ;šD $545g~>v rkm.FB`F{npFG0rmk2B|^`I[4l7p[dw@c80=]>]QˠSJ]:ft3q0 LE\Lw;6aN{b:w.Y}yT&؅|p)Rb#;yȑ.Ԩ{EjMSWj9=Ճ. %6ޤ|`J.^:3Q=a 6_U'(L[}Lv5]{E_^~Wxwo5?z pFIWlϛ'^3|uj5tkGk؈Fw }KT+\ȹHLrIm ,vcuF㵾LqF Y*ՏvT0~1wH0"xQn+7(r* s &SIX܍JWDE`R ZI,-&!C)m-Y=/re4NI Qh2Y?Z'3 ]8#%=__ȻeEcɝ@R QF8ZBPN:2Fp^O'6"3Bm" *7(Bhc"ׯݹe6yO (BIPs"1-@M q,8-<73 z;뉐 p& {SZT6˻1k|F5KJ[sc" Ç*,=c*Xm\5JJ`Z̧nZO7uFf8J7]T/#GNSRk8UPB9&Q)a%c _Eb0E7 ӌeEU\fH86JSn>sbƗFy|I4FKil7u1-lQQbw]•K96%:2FV[OV :_gѝgO߫)NiOYoQ.nxn1dTt$‰܎HESs 'IgYڻm v:lSC+NkX{m7h2V|+A[q6^M+ӲdVP!L)wu\,Fwm68(!2; 'v@l0m#my$@%~JUa"h%Ţ>Խb50( nQ;Q[?}1 {5 [x.޿)M^&%ڧZqPl ;2ƱEӜŷ4JQGL$6,pg+g nsa~SHnYγhä&3E-[frZqXܢ@1V:QJ!@CGؐ~ ?AnJEkjgSiRy!ΈC Zi&\ۋs=Ɩ.n`'N5J[z96Bw9[f>G5SeI]yp&#^".۳;6*sH|Q,}Qs.0 (M5CoN(]{SDz**p:~nn]bHB.N%yB’`$XeJʶVݙuOOw=\+]a JГ)0eKХZV, /Ø_Cfag~)#6 hf*Z!zČ#6&û *z].VhIu=zHH{) ?]`ju3U*D0v{,*^Մ/ڔ`K5n>7 H1?}R"!IoqHONՂVv!ʚ̐d:L*0a_D6W`)>B\eXb+)_om>esd3^6֞kJ40k A[6 -/-N\ּMϽj]*~VL4{+3;y <+Z>5M5 f{ke` P )g2c0A[ìN=󬞥ccN1XxavixhKL>?\tY)Vˬ0'J*dBpD  f,C.D )cHQb\hE f "C6&Zg~Ktf72=ixU]:WHeĭ+Ҩa0dX0ޘlvd3DчXۢmDcqW"ߦh)f~7Lpfix* )KuUo\rlp==(eIZȥHZZ^ck3VujHy}1E=z>K dͮi-;b:^JW<55zOe+_[,[t[;rB1aץKӒ*zq`#D?[sBџ =؈g-$K"osdq,˟Wڝ"8iοv7wNCw8AjZ@\ E^S Fijݵ1ɚ | j`FL;$?=cPT1}EŸBzh{~/@E\lhF1L&^:m./3{:m'C1%r92^L+ys~.E)!%$FAiEQGP}81b`Zkk$.cMpQjC@ɠKZTo5z6Pjs{xhʍ/>a> fx? 
1#wLOTWNk eܣ.kQ:~vI0/{VIyE-ʳoF"vnUJs*S ֟{y{y'e\T` 6H0>*8T=38RwԴZά?6TIVm g&Re)9FH[` \{JR5$ՊE(, 1/XDV gU4;6ڮŽF5 .%Î桙!]/07 %&i rv4~Xs f.Z/  E-WT<Ѥxxe@P;-q1n3@ 7"94^"2 `pjCQIb哙Fk 3Bp3mn$S9ǁ89aPE)V:z.<6AXD8d$B$#v¬jCfGցkQo[\+FUV!DbVc8 {y͔S2~VXRW{gt94bLR8=B*CÒ .A9 ݠ U[0c Spz?ީP5'xSY :#=\D#k2%s4L&"Pf2C`WZ#$"]@)"=DG[]ƌS.(41+e (:cEVI u+(kN3-`O' w㒋EXK*n8؄[E.4?9~:'G9y<92,TJVr@:92?$i\$X$ZtJ(F3"28vx6,"}H &Rba+IAXzX)T# {Ǫ,wDA @`k814cR& SFP,q>pN _ņ _+#b Щsq`|,2K"t,2K"șΣE61W0~r Q+?V)m-IOkW%Ju>ɽjp8!\N95t1̱p1{P`% 6H_v)kAO|z\ʲ0!mdL0iCfA3W}Н?$]nLyBo{6VsBS.Y·1`fΛ柦sZ>j/|fo|cs}c{gnml>fϿ?~& khh~js6H ZO{I_Q{xv{0p'Gѯw"wzc#o}m^ % skԋO'apGCx|W3$y“Cf^uL(͍I3In(]^?_3MgÔ}4휷,n5֏#xCem)DsO_ŶO1b_7ώax}.[ 6|م&j((P'V"vpk^?ň~)u|~m [}w3+[!打WW^2â_v_{mۦ:|?A (n0' r'(f8u<ܓI?\}Ei[p׾qa5.$>o}xRMv;w:.=N\uzGIzڊoz0 $AH=, 3/.^cc ׋ź]slۧ?vl_.;zz#O,} ip>*]tz/8I#[~丸+z K[ svxXwn3C/}xZh0cR8ઓ㑸84h(qJu[l!/o`x1ܨ@ Fb`Pe8wa\Vn~#.5`B?],[i U-Ǚz[1E'qkE*2 8jacd+d+=cƠ6^ZrVXׅ7=Z,\3^OkL ,R'a#LX )"& @%ȓ( %a%ǜڸD,,ap,VBXYOBXޞ>ɞ5Wv£ V$" 0K[f(,E$L"I",q)FM4Kq Nȟ1AJ Ea!C64"5sS:ecԑ!w#Y0aN SwT$ ڰz֭VPމ wo_o﮿`z1z`,DCzcSz֟ؓ.8M~+bJмBcP%Lv$@Vlm DQ5$ A[r ڒmU GSHu{v|˧a- S&v|Ɋܞ91Y%Òco%H{\")?ktMx0B;/~E5w}/Թ?g0Ay-fȮoJEJ٠]rIò}Q4G/*bh9SF3l tTH V;9%+w YY\gqM[Rssag\xF\P\fg1(87X<%%+^̃mUv_S5O[3ؚē_`[V+~娓JbEK!Wk)%ՌuJ&B0% >U5&dIYe޹h× !IPC=ccL1'\F+F4-KrBjN-&+7X93{|N1S:3{<3{֐5$!1֐dڕ La#.!ДkB( 9*_VlBp-VMmj"5'/]_ʴwIS1&XM)jRũuJe"TuPvɶVŴdLviu_Zoo_uR~l秵}.?Ћ~巒Į͙\M yv7~pO]5}-qm.z>8',qo2 a/Zhff$Zuo5]@T (N4$ͺ)I*ԸikƯ5g8NQTA/d]A86rlrT5B?X0%+r 9H>N|Z=c͐ ]Ҕ])B+IW~fSɷ/ylnlFw3 ␔޳!:|~ޗ&lC LQlS>l?y*ķ0:x{[=ww5.zMFMc)Q!{؈Ns㍍pncs5dk}ic`|ia4峾ja!}oaӸf|Pb.u 2D QvxX)ީiJ_r$ʩ>[qʇOnNe;J@0Q͹NTnj<':7É:Q J HNt{-[*_<ʣ>TDwƐ!{nĺf?ʋNYS|n@=Coxn}[pN?R;^p7o3<=ܺ wJGW*5}un3<{x>.NmfMkxk@L2swU5or? ~?n8tOWiM.0q{+ 䇉o?MHWgwx?prm.tcD6[r>D<Cx|4;@0p?k̚<|u}B8taYxNDAa;bA:M7r|'Zɜ7{Qs'WO4}25Nޜo~e {uCw{,:2~EkIk_o=x?kEVe}zJNEG맫EÃ̗~=>kUiV蜥ԞMUù8bۻ> !XyX,Wz+}Nr;˕Jrgҳ\ɕ~x^62.VZ^-C[͐17xt鬼<=ysoqiU1a''nap(ICP;l[C.߼M%UaP1Wɕ(6[У)%4/Ac"(L*I锅Z3cNRKF‘k)JWSxb_e_bI񇗏nr#Fyy0/AB_jbh+&C y qIrP1RE@.E][pmFLZ *Ԁ =j{ŠH:uXɊ N*_6D$)IvRRJm^1pe R^PڷRc: =rIC"NKvT& gࠠ M6؊i6JrQAE ٢XoPvn^B'JK|Po*)kIa჻bwL&![]P`AoΦ0͔=]R~#nՀ,f4\-r]ӌl{,qA r~.bIḷ ]w4X5(% O= jILMl4Ӣ Cr*ACDӊ6y͹!IkPr=~+URDFACi)r"rha0Du5#nGU=Fzj<]*v2YWd*lO7AyM"?^*I˃{l|4RrMZ{kܤ4*ʚ*]U i3DJԬqY 3Ybg73ù, _~o'P1'f+l\Ɛ"FIsc|SՒч~`B8ЁAM*N%^>(1jMTWղb$kCj3y0ykb/9BZJc[VJ1U>~/K:E_D=Eo)oelLbNp4ӇpHκ?&խjK<7|\^ꃎӾvqŗ/h hfܫfgaupf?-~ŇWn>mwΰG7<ΝŀƏWuqy}s~]{&8 ŻI]^ƞ W 3(rE?N}iG)!+0IA?4}=FIO3HUlFokO; 4Z㩶]Z 43}azv`6vh+Ј]ўwΌ{;%wG'~' jxuMIIpU1/x[Y FcfGdH ̓mǞI+ !; =4\މIr7>t&hF8j2 -h`9&6n7|L~3 `ri4F 98%Ǩt4Dz:#ЖLjfyJVQTcЛƶ#TfgPrT,$g-*@B3sbF\6Jt6!r'PJ3,;|zunrrAs}"JAQKD-b%4MD-:<#3ؠrl,B4ێ%nb#9ȝ܍I4}nOػFr;:Bk,"{Q(a ULz}5AQimǞD5]IMa" ]ܱ oI&ÄmaF7? Jd؉e :_f@RH)t=ryA&ol;L j!LSEr?~9B}`$# vY 0zL.9^R{@;%!h*g{ێ<380$b]þpH.RDAs4s"4IhF$=]M+3 Y:_ufvemǞ%HDH(s'קdW=w ~&ʭ{sc֛[[#Ľ Dro\n }<܅Zvl! ҒJm`;IA Ɩ*~!̩xkW) }q:#yBS*BI5Ƈg70@i,Љ }kTHw8X}оjZg!mk0;٣ Zm(T!7SWޛ+Ѽktӎ=80_HrD6U322Zpoy'#q mҌi`hdŚg̩:ɖxwz9C/O4GH .ǁgX~zu|]A`vhh'vR;  f(FS3oEsB.𖋱\MDG9*%Sj5!ѯ7˻|i7ď%| @Y!SѵA!BߣD42U9 U tU6^ٻFr%H|J\, v=S_-%OxWVwWb2,YY(SF 2*cL8qעyzx_VA T.OrE/V$ڗLX]$%RW) R|n1`AeYbL`#ZA E5R.afsÕ.2hG>LTۈN<)UH*$%-k#h%F$>)ة^K|陜nmԖt!6Ls{EF)9a.6@d/}D$H$O%` ЦFn2m$2B&TˁDbL` .i;2+;9ӹA0oi$rYIGА*mYM+eVMHIvR'AvRgrG3 4J%f#I0:rzfyISl_geeqWUs1:Cŝٳ19Њ=ԇ/b5nnjA"5U[*].KZ8FhB 4x-CLɳ}v EH+ z Zy.X8~IT)0|ǶtghSOIzUܰ~>TX7ںOSnT2zbqi +6KXs\Z `0VNHTfS8< 5f!$biRHҐ]mectOl2Wb8|*횼uKJ6ܪ~Vm$$s(Q9OU3{辬ݾxzY;0bǟ.7_} S )Ӎ[]| ]!{"K{8;vVǮ6 Yy<13@ƥ܍mO?Ǐd_=-*29rA}鏟2$H~򆥛k_ƻTyq-.r!4- oYO Zhy~P,WdJ'U:Y']uRZt?QpOeYVm]Xep%K'yjG+.8zu_t}Nj. 
T*]{=z8|P{r\޵5PB.dXs}rɥD+Ӫ{o@8w0L-7W}}7 {}IyW+פtj 9D^!cV+e$Vt5u+Z+ 2@sq93V0-\!J%pRuKOs.x ϖA\hc,&΁vsR3{2)J9Դk",Y->r|9}=ܺ_ } lZu浣QyDvށt]ǛPDO%?%qw~IVmo".nI|/2V,]VNdRIv_RҥUHs3](ɧK|?^w!2zSw>w>ܜލej,=4\AODթ}Wz;wҳߍ8cK9ii6ksPn4nhn̷LnX LPt6t Rjdlۀi]}Iq,:$G3/ՂvkwU{y>dZW.͵Y5kJK Ԥ{[+i4QʆJJTZ)P;, #^U2IueJI6ڪ*ȯ"JmUyDGV #2e!/ژXV[eeV:U5b,HW * 1^U9Gz݆2؂{CӏcJ- $Fԝ/D0~8hJAr 锣Ό{V*iZAG"mc v{27xCoXas_.,lyǴ٤Ytr,DMBKũDGO% )6+\Jii T;Ѧ'r[=#O31;'=Ok9lV\͑7gT&H̾=B]4"{hΙ|npjy.T+_f¢`d€s>0wޚ7g:3~(qҗpB(GjFkS F"}NX}> r*k޳4xxO+"]`<_ֺ?wBv3Ǜ|cG85_lj-RP0/@YlWgaFæϓ̮7w)s` :+O)mFlԴbC2e|' _zVB[1%gi%x oO1S| hy=~B0f_,>Y;: lP?+WyvNnE쉬aR%qk:Ȅ{!U:dcĽ$5? i\6O6ɿMKw_n=ρpð ܈6yCY辛U7:21x5 9Kڦ6Fϒbl@65`0(Ӻֶw-ra0[;2<J2u~*M?i%soZQ{*jREUWt%8%P "z%%Hery$ꜳ۞g/nDX1fa(yjw2DaWec#} Sڎ1& 1O{Y\ޛ+Vz qi0ŀ>Vc%1p%&M֣K}'wL=SNQ5M1[BftIDMD}y~1J1R181m "ʹT9AGe%t%Vp* &yuv}˙V{cb8OfxJUaaɓ<#3?q@(=ئO(}6vR=JZOI8Xf}q[s`܉;ȓv&!Ʉv6GPhub Яɔeӽ\z6ѠGf_>֜B82N=͙bnShr!p>(!'ы\Ò)998 A{Rq0Ͱ%}8j>T+ԺhةVw* USF(ZDtUF -YpZdo)* ZEVEM;r)JNNV]_~aP`a3IpBN' ]s6*LtjhV⳻CaJmV Lb`)^VNqC+bQ BUBt0J 38;j3i B{w[EoB*Ѭ?äWEҨ1G??n*CtV/LUT]{Zyvݧ &Vcx1MO/4x1H;&X~!iF,!{H6ChñtKKO7L!(F,}|gN>zR,؟?`/*?NjwL)ZZiVچ> U"5FG+M3`3@gGNܵL^]ဝ]9b:7$l=;3TsFͨK%a@sw|mـ͒2l![&zW2uh+iF߀φb3؆Cn&买~vwM#=?dl.3q᜚I9)c`cfgI}%0?̲Ѷ݇ t˶%3o_ֆd-aPl82^-[=| kUBPX%g7[1YInO=,/{fSa)Bd$'F2=[ŕ<TD3y] oƲ+p>`Z~7IsMۃXL(9u;K%K6eE\o1l)PD\C-;:&gґ@/_LBn2<iIDu ^;CKX-#Oml/m'=.θ  ^'坲 Z<Ѹ hCZ-:žeDX/2H=G5`6'p!pOJ/-:M .MsY7l3~jPMWX}XԂ:GDA>zJNsA쇍$BlE%MO'քn#ByQ^R,Oa Dn#%KiJr!h:Uo'9&E.'TxYY~niHX A!)x+>&{(~:#(Ci;q;f Ev(j ŝ_=\M!dBoK uǟpֺ ~%vaWjԞ#sP=jX|5<6' ydQul&;?"߄\wd*8x*fXKt]<%aXܜھg*(*&)EU.n!rt(\yݑcnm+owf7"{4Zؤ7%t-(2=W融Yξv5lnֽ:?P뷑\2يeɼ\W4șsWne]HVod'aPU[u}vmdB2?.1'cܱ3Ca2ql@v*l[0G3"D6 7?uܨQC3|y } )ޠ#чBH1􆗶?A̟||J`b=REJ?蔀 c>euRwnwq$n6C=T[/2-kK| Ltvx8cy mMghv!׋2R 6M{U6|ei%x"oQ~5o)cih*%j8voEQ-pPUB4A +e= n̕HSJAa)mIo[\0^ʽ9+ .S {dəguM>YIFUU!6 Z6*^{Wr.c(-bL7uyCc .>4'IbL}F,ܭ3_" tG;W[7c#6~sp+p'VNAF%_q<P{s3܌uZ +Q;$uh`/QBAS7<\dDVqLNg+5_9ֻpͿϓU/{q@B(f샿EB"*U'u䡩#P>#ۃxG3\KGCL!)y?hۀ"ǐ7P3A1| b{!mlx(D}'.Dސ+J-'9}-seY{6oqq~m #QB1v_.m(mB2~W=~=etJ6etB\&RB5cCF-`VMF=B:vlVũJHL::]2v$J<y~m+Еu_Z}u%%eP!5lpkIVD~JƗ[+ 8h}WD} fI0Lk 5Afyj*a)&^5#wNev?n^b[OO\u[;p {Xb諱z-^}ue;5^=-l\.waqNbssv_#Y Gܿ+&:U ~N U sKc1Dk{6tB](yxhuؘB9GRfm{36$\Z9:}׏c.N N_LDKd&2QNY W$iXP&"ttlZd4\_ h>LFNy?URNd$z&\k%@[@շsL0tap} EXmVoϳeߙ^yw/ڷd&}48ywQLr`Jd^h6׋'τ$hb>T Y!̹.z˴yi=N Y6~3_#JӺh|'>k`2O&/ڟu: 8\r`4 `2Lt.x?I.`_1p}8 =hьĸ8Bp rUHѠxh1h)v #l7>,$doȵVtzn1aɢ3uߤlb&ӑ3~*{H~IrΤ?q\Ïqffyv<;\N/:BxG}VBaΨ7I \~?̻+? [ЅnoZX#7ǃs[Е&n۸cءՑ$pTٓZIϘQ ,ˤǟg?B;}go9&7_Ξ,O4Ҡ{{4ǦU0yګ8ius-ȡ_Z7={ zf|zo_~94H)ZhQxmA+XIo} [7RfJV-y?N[򬾥0Hns{N|v꥗g%Ñ{Z]F_?'`xa,ɚ 4R˽G;#൶féDxvu(`n <;=\WJq8Z:ȮtV=;Iw闳[2-/zƿ{ٯ_M <ȄUgfeփ {}<QMbXI6ZʉݑŽxC$ia|u^ f_xi #+M}t4 p~By8e8tOur=<3O~y⻽ ﳊ>x8E?UP? ~N2Gʅ ᧣1 s_N@/ڝ87;'M=-SYͦACKƾxt&Ircw|6h;g*2þRƢ1^;3Oz?7+•2cB$"GQZSw^麭2c/tfMbolƆolƆolƆolcU7kw@pFq\ER㠀pB~̱X4`=XekMvlvzϦ@Zx4_3 ~dzN2];}^x@95awT ,M=vx"dwdYdyN'?s hC =taylݻwﯾl;D2twcs&|`1؅fz aH07p'4?7q.ń0!uy6o<=yd',UpUAs`Sq.c&~gV<+6{rJ=FIN`%m^07!$޿|\ dCK/81:o }P@ Yޕ}r8#0g458}30W8f?|_Ow739 #Q"HHʈ+8/Hz#"χKA$@G~qÿ*>!"aWPBW2pp(">eoN @Ffh΄˕|b ~1a4=[R<}w sk)Ρzì4o6׷v_ξ}ow4Zn& +} O ҷB[[x}vƼg ]|||$#?睠a8‡V!G)<$uB IW32 ;(jqt(Ŋ*&wd',n(EC)ʏ 4{d@-!B$鴃2!8섰(ʊSpd8"[BA*VQ;VcxܰU4tVr$ C!*ʑ q P&h0 c M \0Tp5{bE NQD[r]NRLcKWzoWa+~iNHP o2&-Y[,0m{3(=oπgąӘSCL@kV`(_sTT)$dJ`0/@G:&?_BwPjj߆5S}*P (pw'҇P1^?o~ٛclQ\~bſk2P׮P)Qcȡ 2+wly> ~`x=fci1kdhߝ^/&M~Vv8H=NJ"pYʨuU^4:GT2L)Fl?]YoI+^EgD vv;vREJ*EUEc,3"Z]i}0Q,(IuTaJ_.er?C̮\Iq?$W٧Y?8g=)wb~r}2'T?y{k;\sӽY?}p.X5J7~,O0ȭڪy]Њ^&?O2PrVv]Y'k e^~Sw5\^]峼|σW NEӦWUMMo25rIi=vu/; }׼^W5V"I]redvh-k^ެ2e/Y%嬎!" 
^璌eb=)-(91D-:Z9ȜE/52tBih0 , 5Z}[kLF 13%cA{RR(xNi} 4e0 :`8ѵMl,O3< y%LHZ] R>j)rA^9n3xdI:%v^!D9Q9dƕoMX<7! Quژ)\mϱ2J|B)Pr֥r,˜+f=dڊ&@@H-SЃ’J!ӝ"W CU!\ Pgșc}`rȯ<ŜY /P D\2BzDZg6P[OUe*GU8GYӸ0wji&ŦQL39hRxD1 00fE \h45!G"֌%FQAطCQ|2Ayfd% :Kr!R3J_FoĬbۈAR5ɕM)T^]P81D$(POa*z(A 9PVJ!wDAw݊w ^-CdǍ0dHQV) MeAy,+b6^DQcV*ByCcUH-%:NSE eq 7ng>k5n7)l>+҉4PT3gA`Gg#`?=t*ݶմ6$"yrF/GꎰV]XeÑ5sYL.u1fOɍipg}dR. 31k])KW+P[7&e.I.ߡedS9as>_3~XFq,X Q4&m6)L250RrU. [/sBjrJyhΓ B+afNJI6L @S 'STV(HP\ɬՉ$':OaStt `ƥd9SNd/ZX|P$gJ@wvSf)P]J'?P#1CcCyJܒg@f4$`f<*7:-1P*ɥ&˶aAĂH Q@S*0G"晄TDysL6x<61xf̒3 R YP^'&d<A҈ۊᆬfN֘ۇz.W'~.ӋsvjWUӳxZ#+mPx/_\=py-`Y7 V>c9^'(0_Pُ?|sB9bm,WA|z"Z(=Р$[^|;v:hh٧5_.Vy'G'p=;' [F- 16P6){qEz5ar^ONY85mTksd ƒ`:[m'5֜S9fXE͆"moOU,[B@|=s4G#V{uSd17ihf g FYC¢D" k; }xgwg5Hl&HǗb^0ngmaȴz%0,55yP2Fluz0js=3/5FԿ,m Q{+x[UB(L`nV+saBzrZNn$f/+ek[ƪ&;HEALU>(߈ LDL\_+Q5)mQ,+!zߒ8#"X (mUI*P.: .jڱ} Uh542 4PѱOY*Vk&mLJʜE,;=+Ste[y!J8;VpXHvrN.)3ik[_ u';>/a-k6W S m"+ޘ+0Թw<7/k<Ǭh@cd^ s:!-8YH:X1 n:ԢSiXZekDH2!-yhPDD ljP@wbV6-M71Qd;rG!x3 K2=o2fݯsÉnlV)ڂ~Wj3ҫگbFkY^ ޶FII1ucPijHHBrp/K KRֺ/Ҫ[,MZ)r Kn-FlH_KQSjӔxlZt4SN]4ey(Of-,v=1;.>>R}+Ƣ3p\rX88!akLjn(&g},Ȅ¢JrČPڃ9ap rܼR%;Vxk/^:x5O $[gp:;gSs}yyʫky;,IϗN `~}OՍ'NH'/dI2 ɑH[LZIHNlwH"Lf5@HT$&0{!BS:?ULQO\yb b%֔ I, y][=I"@)8sg\sYό JHHIW V$ fW6ҥVLI%p8=:6$>^Uk/ŇOfukp75QNSIǻݏꤪ}8dYNp>G9d)VeeyLۇma!B։:㨉NZٺI(`AK#kYZ4gG.UyAWhJO;P?US:o]a'N2D-&RSQdX4ѣ [,H<;)FGH8R 97tS oŏo9{n E7mwl. "隆xuc1Z#XsXX?eXJx؉vlJmjCޚFOU0Hary.ݴh8[*[SbY!U9 ] *%erʡs[˗c4fj M%??;aG :Y~op7BqCr/>hF8H|Na+eQ|ߋf7g?\;[~]OkO%=9DOU4%#(#/gZ2.Ƞd U8T~>h]cD&[Ne5kjMAOS;:/E,=''^ŕfS39 `' o) nn,?< o,9_Avb{yr* >9=4r[:/dPe[TɄ8-k rRHo~,XI'"9FL!-?O"U\05v~K1QsǽeX\7t n}?#(v;+mH()C^;6|`^f@Xdj=")xYFZ*fF}GR".6BQѠTYҤȅ6Lg!?)˔aJaJVS#|(F B\{l1(=,FBUf"T0[-5Xb}|;|=Wn_3h:0@JgOA(g}f圃mLhv;Ɠ/9jTL!.gp9) 83EDNLrVda HOUsRjwv1e- UA+딵DYo!o6 ,)fPBu-#c@͜欷 ? ]?(HJ۟,&d9NNpo{՝acyԣ`yq%}SuㄜN$_B a efRpOY5$=foKegϟh++G) Y4?]4#OB"Ypyv.X |vUC YދZ _W犳dHߑ:ڙ4]k`+)/Rb! M{S:׈^I׋H;W(gmЭ$ޓXEWC Xٳt &ϻi48l/V{ikFU\e.1vT?qs6PqnPaLBH8A&0@G)1 rk1KF+\X=I+e/`d%;Ă2 T9G 9XgXȈaES G5NI$n-5zYsJw簴Ƃ2,P*"l?ND)L"sG*ۂ66N_~bph7^>UTq:`8*-p@$Gc&iY wNb0D| &%$zMdG / bX.aJ;w_IIoe?bw\Hi.-G!*.(aoɷ,~Qy`Z5,F"EW~ ~AM#3@S̝kt?aO|"pmi%`O7_/%&ZJ<]iPRe`RToK꼻A2S~p7Z$NIEIO,3ȂoWBϵ\Z]du 5LVװjuUs-9(qZ rDS` XG9ð2ja\˦5ʵl:bZ+˽*/ {#dU.>8I̺=OzI=Ly&3ѥOeJs/o<9,ũUYL/(C֞g*h"``(TRLȝ) tG|AM LHj Szf_-py UW,˓pGkhHr:WWlf1hp` %7QhL,bfQHRm$.vV@)^ 6A/^n8whOKI$ &`==Lo˯IxpGƟPN{4ۛfru7Wեj1wC }[~`^>gm//2_^\bF'")$cGJd&f8XgZQ*xuR-+A&'`!8 .`w"PibF rV;Z?ۯy$O$– 40j #k l4J:lm |6&8q3CT(: X"cbk#9VSdu4J<>I cFV﷛נX*Y=|w_jZs9LR$e 0_`y &&-pr%86! QkIP%)TNJ}<ƓF9 `SM2F/G2I%akY9VMҲ bzɤPɥ&L-r2]΂b.%Dn8`X֕gt7*Қ78Pޱĭ1eV.z .t璧jҰI/gW 4Uia@!M{'  TԖHP'iR";S\ϵ(D nvSm\Of)20E,.P<=SNŵ۵:tʻ M:r}26(G6\ڻmPFtömLr%h!XGy``J .@ p"ZJpYS JpL#.ZlRxF,^m}0 sUW΋hOpz?p^1"mYW^ЂS+&W(U⠚LqXK*|u"X'+V-^g'owrJVd$ b1k0ńpo-/( )ВJ[GLurR[@Lg ]7<8\tZJdݸŎ4 )AB͖d=U5"Ui\k; PHU,|\5=$HȣΖk. L>̝U-'嗂mM%e6'68⥐ԧ]oHTiJj >7 wwZӿS\!糙C<:WyN Z[X[e̳qt`.GqCŝթ1F DI0FT³ 1Ya0%#ÅyٱV83.Nu¤uncjFDſ-sg&rYnX|fgua7?$D壭Y~ JFHH*ѝ C'jzֺZᘢxΗZB#COЂ v:֋:/B%VLH haha5ZUm}Hd,=~۱a۱aVoǪ2dDB̥”2\T (C谣h:Q#n:2a_V#iӘj~h@$J hP&JFeUL3YtYf`n.*T.=`g%J] +-sZF.j)U#}^XŬߤ}ҥRwѴENXi!Gss7?p=f%]*.YxPp`Q_h )oZ\kx s[frg$Y%]wJK<r8ef$<0:D+*~Z<5YW]M%Aϧ.>zf?i\zgM䉬TlS":k@5gϺ}'.jY܊sq8Y֪uHǙKu\:o^8Ή)plcY5X# <4\@*A gYѬANy eNsM#"A9A®1tq;be#$%=4 /aJlx}u%ZI~q=#s7'>zq*4MJ5x}_nŋ/FI|XEDIZr7K[ŚJǞd 5RJP$/`a̢LƵ0#/!L \y!^r6v٧WɢB VݟZ.ث:e.G|88+1 lE%M3FKk[M8ls$I0vނ?2jOhXG_~w@f9NpJ8 ™‚Vrvb^0F?.T:Bz~1z0ss}2I&T:= ܙ{ =6Q$b^mbܝ v,-9{#-b рPm1YˋdI.whATU9l  a^bYDDZA.:=qл`Qv)l\yj1~N8H^gWHE]PQK=Ow>E |=z:X?l#9L2Ix4iAѥP=L3Ȃ! 
Q .3qbձɿ[-O1ȹ wߏB 1[8i&FyL*!anW*6',TI+Vϟ&#t%ɮ.>Yaf쳙ޖBkv[ ƛ>/?8Seu !tw}#d; wukz>/qSulwpF$bBׯ粧##ZtU+ {`թ<<̶T`ʩRYI iB() G)&:1Fjǁ2¢$ȃVAs!;FL +}@j_YbrK95-]8WrIf`/3(L tv'X~En˯nJdu@fnRWŪbXP H^v2xq`T ?6t~Oy߈%M[߃R%J[+EzyKp0Sm.}5pT)iBқCQI#Lɯ.%ŅfSyn ;J BUUPO|VVF~[Kn^]k̋}D 1R^ި O;r'՟׿jbT S"m=w8芘 KQKRKwL+ ]J,}Zɂ@=y=2 iNd'ғTz_zp+~BIdo'Jib;SQקnbBGң6Od!9Du4+=Ի py9_S{Lv!̐Ee'ΒCXٝCCO-5`u>Eji9Ԕg_PZPMw PNJ\CvJX[x4Pg+?IT>&F`{DZ۝IjBt|CМ}`,PLjƁKGڢ3܁"ND!`>&NZ($P0,Al*zMrLڴ~ۃB0Y鏿P(QO G%'c8 iFr-~;FQTP:ZF> aG!4ϙƥ?w;::i@{w:hMεls9`+qgL%LA)E 65t@?URMϼckOcX)/9idzb26N"M K DGgv .dN$*G1dwN̠1ǎDr}pC郳 = UtcL=̉wN<ۤ>3n(y>yW;Z99)Ir$ l␎sd&o?/vcNr_ؘqrilھx>9=ږ|7_ zW: mVLj㫼w"}>ocǩJW{?sQ' 1n']/61FD50S,6ZD~>rqI^DL.fw'&qdo1q? +J BN|Q?gOWy4/_4`+8 =#J5g1FBgr?GP*))} PIy% ,pMxP qfKH#LU1A48 J$tv iF3 "zWҡ^['(KZK$Z@%,3R(+ˑU=p+T4Z=]_hOBwYYȼ}>H.om!xu$w3z4X8R8 *&߰7_k}U6~Ƿ|zuu6"`$̗{zl N?B^V߹77%Ssϋ`-dh}O2(x?{>{O<h2d(Q/HuA TK+(= PxZ e.(K mO5PT ~y@ SL6ˏJ$hWԒ/ɤ98\${ƍ"*Lm">&r\\'ȀR^xu6xtp2'퓨Kꨑk-jc/gϻ 䏗ya^Ekזs_r &11u>_^nC6^`}2Ǎ~y{F s:[| J|{#FA` g?"8kș W__}MH]\{~|?2D꘷ٍ>Z$ ՠzxC)ņ|f҆X=p1Xb2=BwrF ÙV"{pܘeӝ4>`xmBE˳l,ףgi2ʳDŒ0J\wP}9kV7H(QVxTF@yf1vc6czS1QUvժuxbU>pT^+2v醙ǭ!+>w_i2_/j[Dg39`&ڦ) TfD|Qqr F_8IKdt s eyxze.|9+[ hG% 7Ɂ`-aB⅌ lFdOfiA-[+i*(Z;'\#adN%jX5BZM; SG{ J+p!phiAGG+_Z):NMh#:ϵ;}ٳ˘'Rwlz)ϨE҆g4\R6d|~#4ix&J>tII3w[}d)hyǨ!zp]c$}#ZSU[ƩiLRڄeZi,LknY4x,B Vgc5ݰ0iYH%&߮FOUxOfR˭}dGn$OqRqǘi2"Jlp=z;]OT˩őTz}e) #|\Y:bY6[QRV f9ù( .n%h Û^=fS/*H_qm87ӈx!.4ZoOAl =Ϩb3\H[hA-A-AfpQ[ 1Z)"gڅ?O4]yr+FTVqrKĐZ Y/CKnșD33 F?.UtTI,KB%AF$+TJ5 J1Uˡ213&8gq9]H?W*j`҄!7;M*H Zf^մf{w-aKV:Bȕ.;BDlM6`.u:H>Z QidֆY$&5#Pv׿ђ?'qi:b#@g_G. Q9}Tʞ].\ CGϫt@g4p:#$ާPsYG 3 3+#7~ !J̫a#X;ǠG:HE:sg{9:ia4-Ka,P[b(=r>}ˁjZpY00n ,KiKe;4W!HjNVS{EIk)EqҺR=' g4<`trvJ[ Pjr2 zU5[e 4:Zدƚ٣5{sA# :C0tVPʽF$6:a*!(4QB[Cx KDpIq@)/Ut-! 6шon{^69T E^6%)+!6FT9NV"Jn9W"j)R*,oVS]aǻʅdM5n7=Kq?x2nᆵ,_9VfY1ok ncw6a/[­ƣ:C_ǟw?/W1rNުE]i|۶!# E4G5Ǫ4z -}GvI;T-o=Tօ|"Z$!T;f\7y}ބg& ?:fojPkP!V_fQ>t̠Zӝur(: y5nUfƴ7]ə12Im̔1k2ʌiQ)k7gy|pm_QNZ>bl|Z?˦ǃsiK0I56&g/ p@R@+Qx! /3C kK3=տ*}F]ݪ+aá6ud~]t ɖk dŇ-I{+O^wqP\BVm  a*VQ4ቅiz!wֱ$֍>Oa_\-edEKE#~ǜ!k?:(CǑ嬼 2زGO9⥑qRl_Ѿ?5a= yohPLdՖko}蚙A?2̆r1';vet ~ϔ j9 I٢滋Z\DVXV t1{ތ!7ljK+ЂNȵUZc+-z9y{5?ě"4SMAv'}Jњmʯ7{at$Jh*VւT:n|sc ͚/Zz0Q?޼W7tߴfx-/u>~1VG ̪m@?HӖpAtI.sK|$0+WL~9jPP}ӝ$yWHMwz2À#JٍLSz9\Qa61{|9$DmƋQERm|uQ[!eVl9G޼F2גi`^3!C ?׏#1,<(yyF{5"tWWz~bIڠB) p7oʠ{JNz1cbC%6 FڞTtWXY!ӲY!Ӣ72}k5Lg\PônNF>wU%\d t%ww7U_|l[ku6.]ŝ O誳A#KBB|{7剽^yħU+skh? #bkmهPJ++_)аu3D{ T+&t@bcXkpOPi/1< 7{ _R E_!%R@H cNèjj?9 &4>V.>cB7S .seN5̩Tɻۘ(dsϟ'_\ >Ǔ m?t15KYU\n||Z|gQCȮ9hO(q:DEIYtz :5!PA"Iin%p5iI=/{|ǥ{0gfrң^ NE.ɦ*\<0S(Lh(ASA: 'fInk܏juAUJbT-"4еWA E`9|1B =p5ArSHb "Pi D$.%a,t=+R;/qmҐgc,X&_75[=D%Q+k]LL?)y2-MCh]fѺ̢uE.ZN9k68a2v_VӧdtaՀZIi2?.>/T+U _i|54CP`"\ROMJ!gt|OL黧c1O`;[D{V}p8xzg_!tY߽xBRwwU9޹}Gy Qt W+85pzP_U 5cplDK23$L{;?1h(ΓSZ!ve9_~W+m`&iƆ̺c6sҺ1[c sTt[.؄pF5:*o}xZTmۥJ K޿Rvm}^ٝbfÊ%Z5أqpTrP*NsFo z߼g֞(tAlj[ znq\cZƚ#pEuĎ~к|G ^)5*׆o|G,S\]+hJ˖ݗ { UR OBc^ ܁v?N~+mxpx0pڎ)R񅂓G՘svI5&b)h#ed˰A=NoNɐKp` J oxvۀVɍخ)::# L.x!a%bH}Y\+͚=6(1 Npt&(CkBQsKGa W!2㓂 Z(mT 4DNQRZ3[uh VbӼ偨&ڄC q*UvNJ$j, IDs?cM(FP}1>|E-,5E8mce)la(t!gzpgg1шnb3j@BULT -BkdrηYH!Д^x@xxΓq)hZQCyU 7k­nDY_=mmy4!i<|Ew8fA*Sozp`o-2^=  Xv֍_JmJ=KzyB/x&8?Lmכm$!i,ʫ3o¯߂g[fMYU˚k}`|Ze7lͣ{ނR-zߴhVéJosz B@)ߵF@;$Q',Ю4Du@֬l%;wxKvN޻[OFG%;p2;/n׍].w?V(s*^ڝ%⃢Cj$=3YFmH#&In;gQhcZZM)fAIŠ&eՕL C9[8%ڻޤȴvރ(&l3z})%w']Kq"uO٭v9~/kqTOn0Ӄ4uJ >1aٟ4=0a"͒]0>? 
]?6T<޵Z"f{e=V\}eԆ.sFN2BPAf22keNIȜ1 C3F9T];(z(jI!LzC64"b*Ɵ?ޔVG zh) )$S6RmP'Y:9*|`5IAz#|o߁4h"Ԋ`)*F&*0+ DХȄsܠґ0H40wm _vG oz]ku9 }HK(j,lVel9 ?;jG|5mϾQԖZfK|vPs s+s%xsw[ݘegS[['k*'̀.?^^M'#YOyKϮsԐVF_E;_cG,>q1otQc0#]]ȵDFZs5^c͋hbD$ W0U QH*_ZSS[mɚ@S37{AOY|A-d L&2OcԧI haޥQu_s p]ݡ1qEHG_xi88-}n4CTGtg 瑞&w6 jjףQp-6kyLt=oRĆ/ /~zY<,-p 0c:&s/ ٙZ];ls!j8l$N]˜"$W@*}zn;тHf-Fx98)a1U>U%‚ɯ4iB5.GR.so=d"R%cu ˌu\i2QyDǕn cC/ Ic@eOHPR+KI \ 癍@])R jC5QGWHj') &eR`4gJTFFxN'A&" #h(1J$nBP陋N(XfH'Oޣ[Iw«CԚ]y,MxD z,YCxT>+"jR"IRC 5ѝ-pʋ8DG"r1h[q\ѐP<^LmH:8d t BT`(eQ'I%*wAPbҨbK8: }.0({&PNA"%@0RKHʇ3-Kh *⹆Q!gR:xezH_!e+h`fa;xw^vSjMIk,EvGY̪"9M@IʨȌDTt@-} L yVW#Uͅ?R#=muBTQC$$H=b5ss^LB&BL?:9g\wK55:AdL$%M{| G?>Fɕػ;a#W.dq5+.7!0)[4Sk{Lu= ^wl(#[>PZupIg0}Y51D\]}m1T$W+BFڪeBljk]a4JGb2|yyٓlDg._zT2b/=eF?)YS#fǡ~HjF^MV]!VΛ$p>Pϸ$DhA@<\,aP&{ Y1<ąxE]Qwg ޻f.0&軗%6肿QC QCXq\j9t֋z_?˕)ݸ4_LйԽ9ѝIZpJ"#tUH1.&%c9m2XnlXs  =dj5- JAV* LSiLΧ(XF,8 ׂUhme<0V'nP詩AC`=IL[M$"pa2խN1SuYhiEz\\衚ZE̊7; [.bFh= @~IZr{~&3 }kyi -S5pRXBg*ধ6YYֺ4ai_@ dPf1s5l Ҩu݊k*z8"gq&G>_9?h [)h§͏wn!WJ HD h:Py[9_ Ej\sC6mSˆ<eCm- a:>.FO70*T MGG'r cOJi%PN&&Pc/nxs7[)Fנ 0#~-zo5j98j gͥ#Ou!"k\$iK8 4zqH`U;:Bɠ}qY TBC.bFG>`3ƫ[j\ӭԸ`pƢ$4TJR×Rmx.#$E%3o4>,ű4ő?<8\tf$aRJﺘ\Pm!>W*QTP c>y!3Ql 7G])v۳I>+jLճY qD OJ3nspi|z!ԌtbYNǽLS;? eNk `䉕zJ y"W_'±^5Y5Ą&̫)s}I4聋>\v#2L:!lJPhʳ<ڮhvo5owNSn~Cd w 4iٴ'!Z,-Q?Ϸo5Wd0v_Z8 c/~]뺶_׵nگ7C'.BID$: ȳI`!1(SRjH5 gYqeZo|S8慕t5Y6"0ƅ"Lb44{?cNz2M(|M?2@ Ba,Z,k!vF = aj\tڗ|T/+սnV R}~r. )$F*x$Z`M5_]R?0d_R_>tM6/&p%}]RTJ~IhYmr9\y/K"#w41HWIOV{U灏z#]={քKE8QjG^={(1nJ>_ycP8$HŨ;h2^ pr8 !; 2SŒj)%IY[k% xbs!@eFHG*F!E1h $DzᨪnM A3 WAv8tIM7 ݠS{2MA{~siʤ'wuRnH:)Eު+tg6 Pޙuf~zF(T'}2b>`v]: =#*2U4C_шoR>:΋89([_cθ8ù.Fi'3㤹nvng7)$ Dߎ\0>[t%r2(` ӖKHz(u>xP7dzA( =vT}F%q#\?YcMQZծwAscw'@ʙJEk<Οj|\7\B.vOnjU}7 STо@F _ۇFF㙁;)nάmB*N;)=N "KlLNӺS''v8n{JqEWw*zp?#1w3na)C2RU !3A*z︔F>`+g]l 9@+3wC#U޻];`2Ug?ƽ% 7QOl9|}|sONMۏv?bgonoܛ~sV› >^f2L˴FR *}zp:;`cy2腙34v8yAhŚY/FL Rt^!5_B$ g-R.!{ϒF!CV})]m͘ţHF!Ġ26,''yP)ϙL-mMu ȱtiS*MF*b3Qzbږk[vYoݻOH%-c%9F1ȁݭ3gʶg hXW4:;s7 =E $ڝà*b J<B5X @ 1h }mx_<]Lh#{BuRwk_A\f:h"x3wSd˫釕f\ҁ7/Nnݾ/\@0>~Qx#+#H#6Z)N9$8J-7AUFԳj\ *P-5G[ #%{!߸g12ƙ#:=1wX{-'Ax*~{b21*0!dU5f+?!AgaTd@$AT蟤J{)+!p|aԲO9Ci@x3}kbe1s{Yv1cIB0.zv論ԢEͅ(l$ozw@0 ( t@UjH=0rf)hRsɥZ.;'PMJ)\߅̥qX`xw$lД3k@;8$jXwz.0䈗g_>,Y[sfX 6 CbB9u'sm;<v^Y ҰP6pjeoNxbBS>0O/5w@d`(;9vh_S˖Q1[)vI n`fۻ|2uچfi$KmD:r?CN)'d lThA'/9VOe|[Vſ"\TpŞ>yL׼_*sU*u]JxݬJ\%* q̀R($S0Bu[]L[ JM-wZf;Y`[ ߷" hJ4@ƿ/!#mV͹ŬI%,1NxxOg:u^s!og7uYFع+G\2* Jj}hP=c X+*=GYdP$#lҙ")c5ʹY|i5 ըC 5&HF$D112f'd[Pt #G#h?SyM'L9|" I8'[ u{k0%$]i WԙT CUٻ77\[rۧHx)R[ޓGqc)J7ԱdFHR= :z.YeB:rˢ6QtUN4PR~f$B`zԥ6?uUw7+RA1(\\Ԁ޸ 3 1X0i^YPksJ#Uٌc2rַ!pDi6HiӾI ~Z00 P[74i1IwhUۻ1Q h&y0nI&)) as(9{+bq 6c堒 -A@"$l I /Di*EHL aZ&+4Tʓ׾9X `>7mK)gh+ŹhE2kp !Q4)9)"B/M Fm/1)Dl͌kؽ` 䚉 ?ZI{Bm+N'_y8[4ms弇0[$N~~-0&~' (4`MaONv1v1Jj LK6dB\?~r.j]{VVZ LH̔ #9!ߪZcBG@FAX%¼56n?̗tOn{`]9W?7՜Tz˅;RwS}SfvU^ȿ|lwEJmBˆCOvSU1gJT|j"Nͬf,)KyrWD_thAV`Q7Ӫjk*zz05q*h \34ϲ**5xU??{۸ݝ]S!H. 
A:dV"nIvOb%٦u" H)v&H+"7_K֑0v3B Cj7l)5^a Pjf5<vfͣ{M7#:Ai0@By8J-=(f6[K 9b,cA<b8X1Jw|U[l:Ӧ|V9 o"mz-xt ǝ&߲EU:WGWp#JFf|"k:G(wyrb ~}G`p9իsc7 baGsqAUNn0JR A6Q^[pߵ=w )T$;%A*z2jRJA9[(fV˾HX)Pج }&`Pجİ}EF{XZr!Af؆m0:[3)SႱƚp0!*(֦nM-FwBmp:.)4깥NS(R"+)yX) # շ<<,F , h}1(G(а)Y~;MfHը]6= V_U`5{!+N1O/4̣cq%|/yݗwd KvɽY錰>CN)GΓ?EKprNHOp1)z_jBj)Q6jGvUvS_UzK=RI׻dHJƯ%T*!JĺV!3$GVp#gJ9('Pz 31NH)mՔSOHqDL\7b2C1^`3c \2zs[߮q^dm{m.~Iru9r#szݩďcuJ\(i=Nwr]-?kcqmZ'OMEOb\>;lxZ6/~+-/_l_[j9նDoyϪϥZ>Vղ}jzyO%H.q2̳ } e=42r߿2`_JMqLa,RUJěX;g$@ep'v+I(>|ڢ+[DSEi%zlk~um2%Co.ojy~m-5*wDW^elɹX_f7_,򘍀>BͤycA4<ԪjU`iF WycjyGjm{48d8 JDvL^H{![ a%1r^Dz EaQҋom+ ;V[eGvKU )@H%@ CP)i$I 1JIL(&hJ xt X!\t#rXჍZBTh>t:V+Cm)#J,7'-`?Vz_YUWYx15M__J/ty62DOS'r9БEuI٭#+$^5" )Q- e"\w@Q'J}=Z>Eb Bq7ޒ5c'ETbV-I$iPڧK"'QNdՈY17BW";m_Iqh+ ܞ%fUc 5>RZ>E8]-fe`C, ѐ HTT 0BZLܶ{G-cj֙Ssc{y7[W"޳H`B;G۶7Wb?A-=o}Fp$Ȫן>|gG#W4?ݭ+H5p3@lZ6TD2Ԃ?e6zWaՋbq7Kʈb">HsaR`Ӿ~[ZMO[m">*9JRQϵ">*pAV=` mԜm뉚Ojox8jv3V@dU`M:ET8l+1[bTPpͬ }H F1/&ɦ91N}%CW)4g-HW{9626M ;7Ӭ Iscփ@!iiJoZu{D$*cQ"1.L&ёoabr!"T ͸:a#m[20\(;RUV~?\Q`d&B:fbYO-IէGO s |bd8FW&].ʃ7̈oHz"a`>;;YQ?L1_g7&kf07%ܹ{a\&^:,?=ܣhjeK7.q`7v~nU8";-D'Gח<Jfn%=e7.ۉM/2[9}G!E1H`ŘO^RT 󊟹4]OGtX23[]OޮVv~6@5"s)rcP9.)Eu'Nw9CJ(O@*H!=̺4\-3$?|(޵B [x1dIVD`J8*pZ`n:=Kȃd݂Ubnsqq=1՜uiכsVlOReBׯA#Z|wHJzI5RЀFJ~JM&=-W}5[#/ y?Ӽ,O. P 1Kκ+'y2ZtU`9E9IH&zƼ,YeT|ǞH]_hc|u趶Ww3=L.2{q>$ yeWBM ̈́;+ْS\!0jp?niv4V aVXz7O U'nezZ9:¯0Wv[Qm?7!}j1NrUg0f4'j˰`~[ɌtTfT΀VoeCQQ-I٨E_R nǤdz7@."@3eA5/ߗ^͕2CREk-t^R:r%[R0IPIDgzŐ8@V|iUIK\4=R\s+ :]ܐvQxZw7g|veW| '2%P< V,J.eY9Eiga_lOuf@rH9u`TW/wmS{T@H{: ,t\q}\BQ͙2IebQƓoU3Bړ9%*ܣEi^q r'*lyy&L@ER,ʹR8)2#~$fSagՉKD\ɮFhodS mHԜEp]_ PM5Fg9.AQZÉl7ׯ_Mj{~L#+<]Tlp܆+)AWy=;zFԩeaԹ?CAPT9߽+i0ƝyjNlQ"LT)6}REygn Ji1k@Q@TךpN oUQ Ơ$ + `[*3i_'*ZW7F?A$גTD 2(R(&QU 0dt OԜJ3A~$iUjtM@M1Dd1WDWvO-~-" }|> vI%דcZj$*8 TAw[oU᧊+i9cl]u뼠͙YE \2IǠiYen#L6Ɵ⾻f)\bԥڷPt|0pE4h2C;c9 iW3mcE$v4J:|ҩ3b[]8<A2i0r1p,H@AzRm9Ռr2F A>͔\xukuN!luQ{T+XR"# V"$Ȁ[ z3xcEN(K4J\M8fY #dl7<6Z6a+<`)Ⱊ =x3,C{?&?]ǫ,ugv#a'DdVCeM,87,+`A3&)*x O@ oQǟSh3y z#X\SF9o1 ^( $3MM.Ik]60:ȉگ3+*cAU^ˀ$8c$hnįa!)4&8x"TaA{x{f^a{(%h^YA&.~]2Y"Z"Dl/@c!+VeVjLAY{{I9-IV >LG3wKU5FcbnO ^3)%j$4WK{ j'l榱Y8ۇbj5Bz.V`x5aHo)h^%[A|%ˑj5Ef%w=T› '~[AibD,o|/:OY dҎtb۫ahn4)vxXꖊkԦ!h`F:X®Z nTŽ!uصKTF~)UBF `Bm&Q8AxJVیRp:ď >E&K$ڞ`x~LGz'x5Bzw Bi&_|%Z D6y [LEyew1ƈ0swX%J~rt;rD)a#jeuWy̢+-((J\UU~p#DppOp0;wVI`uNoCWn띅nV%^;^^74Bt۶|kh;lYȑbL N1'`}sSIhn#P5#^Ju-w _:DD?NQв4:G fj~su=zRE_쥘eNF~lyIl/F7d9BW'_ʼ͊5żiB'tMPwaWJds֚*<;7jKm4@sLC_,\AclEj}*(0]؈6>~S:o8f))5S(SOE?M[FuȳTv!HC"`gf%MQQk㢒<g|i[? w"%}:ʜ6c,gL!ꝳ2)=zHQ6~ss)#JRc*xcbU63,^\jrs5ۇbjtlϦע0@ :b7ꡮYUw\򛍎_]wɯ,VP :w W-)7ѪIjܫ~N;ྺhoh2?JB0iS)r|ٛfŁ7hN >iso/siZ(BO6 i oW.0RD`JWGxxoM25aNyBU)GΜN"Z)R'Aȇ-VcOL䧀UygO 733kt~|oSG ߀D9SIauLB mh~e7 G齟MHYpv-j´J+iP2 CA,k&0')9$3LCy2'lFPf LhiCmcY#r=Dz@듸iq:HSMp_h%EwHJ2-R")VSX\>;>㸅# *>OZ?E$. YBLc &YvB˸h,^pzB`~ b2e8Sal(P[H8Ki2- !axj\nǖG AvFocn0;$7"-2sθߘqz8bm}$T9j}dUcӅjsf@({q^ KG(^?xZpt*],FU9hV̬)GRp $"ۤj8[]%TH0lTJÿ(nf]U6WqIд8xnzcaAշjD6q?y8˵_3\|rX1s7 VpDՇdzۧ[ +1^P ͤZ,^eU|^]sL s{걀$QksF%CQG<5Ep#8" U[kC#I)$ŜƁ>#$jdӚ8WdNHQ1zOU&R"v/PFayX@]NZoPJTl.mڃB F^ljR67'>x x¨)K /ẀȴdHlj`6K$Jƪe -FiEXGyf bY!%7R*ARpSΣꢨuJbs߿%?QtAY9+:H}9ha0fθ%IniiqQ6>˔bH+,c¨Oy찒qDLX噣A iᨎ?Yk&SRal?5ɧhx G~d'~9H/=p/MVj|?3'%kꃷäsw!n~{cpC( ݿQ:aOj9#ST؂YzÍH0Df6*kLlΎNgC:7:ËN/P e.Ws郇¤s~co~部>iR:,~urhN3yYu{^|>r3sSʹw9':遼T$oe8<# /އ~tL!C?@uM28>vӻ$AUvؿL*mo_'EqP.= a_G8&>~ so`26YX\#i6ṳ^}$ߤ-όW e ;̿eJ.NK"+ZX3cZ;ݲ )r ipXP@9Tϴ 45Lb~/gKH2{`sQӋL9S |z!^b '; F*Xv-ZлU{>gA|x Oq)V?p-\n=#fXzZ;ݬ.ػN&7&'le#a,͗tI[`vnиsn6c6/ ?nPlҦV5Qߣ{˴}77>^ط_ǿ~xz,O=pߍn>Qa.9ֿM~黻/3]wwѢ_K.Bq:I6+OMFnrs# qy7-j1tKiZ^hMGbq ,O9ZFH?[]r Q}B). 
V|.uR95%v| P {l5Bלر׸jLm.28M^zL1&*&6%4*GYѪ9E֔ 1cm S;HfM*oꑊ"k,1ĤD] N9X\l{` 3e`Fb׊"I5YTcľOXXc!I3A&TX\b)8#Nv|q)~ >G9+ 5LfB[n:+r]oܵ\ B׽(h+Wl_v'2R*e/7Ǵ 7JquTܛn n6ce5ZӦRZ2ԩab27FjGvAU$PWRB7T&N/ DJu( 8]_%Τsu2[WMEzuP Ǜ#.ҰD =p'( #Vn +k}]\ ˸UHUCb!3QkH:%:>i-nZ"3jˍ3]*YCLٜ[s =Ů)ִ:1CMT ˷ͻF#Gbbky%q)`q(f%1\͹`TB]8RcFk|F p~kFSȱhH{bV8y^7tsU.M9=HـL)=/B6g )'Mn1)MP>j:o.z&0"TsMUx4AyPT`J/G\S\c߹ng躙w.[60ƚ.msi- JjSŽ#t&Hn[g cE5ۖ|L zFxшZ:p-ܑ-NUfu'-Al_ȀpB߭ ޝr뻗~#ǒ8P68BoXa#b9:c=HXe>+&e[ i /"$ĞXlmZ[&?buBG`dQ ^rӷT*~%e}˻/?ȶV--tQFa: &7τ\(|xmNLa7OePr375P [~:^_wٌVf3Vv&`j%y1EfP,Iip$4Z.$b.Rk̞sF-RO.Ww9W6CU&g(E [<\ SR"kvBzFE"U1"Q#S"5.J"aoTVNJSF9ĀUe%1qQ$Y]j /|v೐ {ia)KkW:Xa=rUk4u.fu {Ig7m%uUd{uޖfk2ѿơ}r1n. Xp=~SF¡Yr1BQ¼n6|ZxmVOsI+motQY89Vv^2lw33Ж`&ϔ݌ib -@mb j$[R}SV@FD}&0kS8f鳉Zyl \>+Q|*& &C!R6U\n-ܻ5dmߣ*Y;Ӡ,l⫐LZzA-f-M"b[Xz`ea<1F81 n a>c"^c3<\Tǟ3L%R"㹣1 d"0B(퐇$O&"iEDI&trըcEt#NC9"iȧq136RȼXGjRXdoˣ J2Z]}Tz8[ [ >dc"ce)!c)Ls(Yae A51FQ8'3A+4f21~~mxo;>}pJOˌϸ=eů ;Oɛ$#$5Y" \@!DD ִ{Q&wE.^cFGVFNGc`O-҂jb?eu [Tt9"1Š䁄BKȶOC+»R%5:Rpo{NoNޝ#۟V_R0? d N~>d!br5G9撚-˯Ձg؋٣0hͧXJPdW<`JPL HkJ 8"r&#KyA-}|ZWh0s~ 1wrm=Snn?{W8\k^n|1Uwr+>i;^34$[LQE6 qZ=Swa΀'pf#0W9̴c"be\a7B. j(ݏ 3Ycqfq>g⧴Z O;;c)I81X\j4cE]rM%O=ƀm~;l$׫rE6Oo֏3ާU/ 5 ?nzvɶ#H+kD%SbLcs9hqN /(-r`>E8bivFK0xmW"uQ4HAGx 8LZE6aFgV&o_|? ׋bRRK7L$SA%^hVQSBlL0=Ct7щwοn Jkו2qUć^h_r/[9l퍏_+c%"|Icv1 oߘR2D'ehLۊi)ׄYD!%Pؘߔ)xL_Z~!e-8Crr%/><]yc|Gp3/cٛcdKmy׎kSrSrSrSM AYlV8qYI8iɰTY 8jkB+ɼi׵ҴwqH{ȇO?zI3uYԲ#\v/M28/d\I$ث@1F`őT툊RR\dAkDScy"K&pb0r*Ix}z̙~߳{'[Av❓sk.f_اytZ;=5ӏӄ c>'T?bod^xoBIG`DǛ-\쳘mj31>-qLU[to35L4 ؘVk;QeklMlŹP~3 `kl㎛J#=b+Ij4pw89y1D}ԶU}/^\6/jeJ8(}F>]}_J/%Ӱ.#p'FfX^U\b&BLX e6Ji{( Cu"fU#~0UB=GVH!cpQIr>i?L@c 賀N%7;_<Ԟ#ص,QfYgYEj?0*fCsX>h'EkW9)fd 6 ' OG$ϜA R" & "jMN9֤hK7$Iɀ* >azl1 o%0.*A#6  \$#' ]d^r2@B5hqN>,$(68J(eLZ)DT+P0 m2]yfteԜޝhV<ӌgaɁFR&`x|L:͛O=0-6)6魎.l .BF5MW69֣ŹrdaALk bI\tb"Jw@ 2Eq.6 #1/:,CRsbM(ָtrv051p\=|h![++5P ٿٸo']HC / ^^ۺ>_bEmϛb_ &\Sj-s @g1@bZEGSD(iC. |Ь̨QrhVӻ=oLm_eplL(4# ǯ(tJIڐy@A=d1}mO@ORIB`B`:}o_5+d!ݢp>c -l}sZJL&Qqs ZAUC夭*KA S+}$!6'-OagP׭>F5蛥>{$O;ԗ{jQZ9ݧN>0Άf|gY0itqnU |pY ^ՒXJ:s|:nIH*Z$ Fs13mnl Ul"&BHg/pXE2yQf=5}PJN87؋jyrI8=JyXo>m oFf LPss&D硸^=b\Ds N!gT-zv@}Hߢ D>dcT&xѿ)E% 60 f'Ԛ[S#V Jpv)4#R/8QpxL@ˠ!)!c6ZF5onItDClJj̓ZmhvHrLr#%vj|qi-hS]17Wt\qA"Rʆ$v\䮛j F.5w6b];[ĥ!7t n8j4CwpL.tn8wl4^p,9Ḇw K˜"Ah!Th-"2&Up!pi8[p5U͑vc4xC4RiV.yDY*m$W ?4:k؉BD>\M{ݘv&,v3:,%Z.6`&:5[ _0ZiZ?)~G:X\F\Z.1z5V&9Kzh8֎ {tg~guLe3-hɉcɷj^e7%@ajd |gr-x*1`|^.!sֳXgOtH3Ҽ.x5|VV %n;_lAH) ?+ ~Q,SyQ,TaxmP ]ZO Q6p'@91.i}w䌽7`j Yߪ2կ'f+- Ͽ㯇yCR*(w]4l-.U^WUMe {qh}_kFGHIHtU^ Jsl ͛ڭIww|l|.hŅҊQrIL9i.HAȸ~^+%#ͅ84,ҝ&`( `+Mw]' \iz1dI^CCkc!*ALT Btr,(<Ĕ \ [\sN'tcYr4'1Qqh(HH2m\2N)ɇcS`޿կ ;ԯ{أ_ `d@}on0@yl%1K4C`+0*TQa6*Wjk8O6ᨐ $ї'X"l$>Z7q;XNzyAID^~"SL\P/ss7 %Hח1ʥ/k8)X-fǻi 4} 7b伉C3Uws^o_ڦ~}6TI-]g Уz)c dɵ1A~9Ȃ ~IJU d$m^T-Ϙ1.xWM_..k[1}^73{ Tlr=y&Č{*#=&ӓSq&*9!nȰ,K4SPQ1T W )Mf@kz R}l蘶[d3{*wSQy6ymxe4P<}Xk8[RҡrEpуm\12dU"VJSnLrOVj%ة?€Y1'zMP;-X0EZ,q9IJȌ1${ rLM͊Wlh6zra퐞[ NEw".vJ- D.F8RUD.5ӎ>s+H>8AǘP|ϭ=Iΰp?੦#\8ş' PL32ps5Ͼ}\>k)^]9,_=yY zCuКղL4j`˖yeh'GdJAI8u*iPDz0)׉Eu"%\JX oȻuUUF5՛_޲BE8)uxiM,1T8!8Iu|gZJŨHIG6/K^^4N9d'LJl_TfS_YZ/7[Opm\x|^=7 /MSI?џ3gE>@6Um;AWO_t>IGY p(Sp"&i'V́AS+ QsG:c)g+bGG.I^M`%yktU3,heC=\RChVs$yu ]7ŘQ-ػ6dW Q]Yݳ/Y=}KBVSCJ= ˣ隯.]U]]e%\ - <<[f3xTFAL0ﳼ(*-'>DX! b} #G Pf0FpyqXPOWF=35pCԓBrKMPE Wj%fNpۖb[ddJJ!5!SewiZ̽d=E$f@P!g%!R&)$~hŇcSvן| mBc\\ ^S$$$0G$}n@߂c#B;E)Znͼ"1}(te71m]V2vn]$p MKyCdb.e17uSP"怳dR 1󼇾yBiVϺRVJoi&* e(me~J޾jWO:<dq^i ͙, :k"7I6{kboS\-7 L)5mm2J#h/ -tkFNcy{MݨW$w(vi,I4ե:CDZP;/kT夘5l0&!O5-;e7]MZWeVla!, N2"-$%HN+G l=0Y -0Y [0G,{y>&?]Rp1툈gvxw"w.REwҧŭhiZ\-20Sq^=2A%ҥ2Y/OdP4!Ͻ=C=ڻACJ UC2?]< o+F-S+9)Bϑ;Pg,>Mo0CoBQ q b$׷7)̅r^T?,j|?) 
Jx~w΋m&o'?5T2>,}}<,W>Dv <3LH峠h= D5Y;x48h"[k H}>͝tcpzN w -L?)&-M8"K m.&OJ*NQQӯt6ABMLoD j9Ek#rG]  E7#Z[e QZ ,z3B}T ?Pe"rMd:+~/o΋bƌ90.y᚜_iT9^; BKʦu:rr!H.ЫSZtYiuD|֦L wFa0ȴp=|,]{Iym`V=h4lmnF`\Lnz^ϙQ }˟#CRɐnYCq;/bzV?2 շ΃Ƶ -ǗQ2۵l?8S3Jul^} ?ҿZPJO_eby{ӑR鶕"NV8Ry1EZ6ZOo=kegfO2C[2Ã|̘ΛI@[\5˕Ĝw⃂^gnȃw@^.lOR ֤g<6=ix3@:z<֌ոS"bSY%/ e3F?׆3HZzFOEs2Ƨ59謗!5!3ۀs4 kyV2cR= y-ĊN Ès&_nLǖfHF1Btsr#4x23+hp JUƃO>F-i}[C+Ht~t;_jnD4h>cH$rcǃm[~t\یP3A$ʼ1p Q_hM}!9kiIٗ>:>^.HSu ~|R⁺oj~IܪX0L16[u5pFN~Lkʘ( `:Usn;-zXl&x s~osOΥB(WD1-䞓Nܡ9TuYndM:zܜۿrjˌ::qm&rXf@c40i ar3) L t8}>;œA8MI>^FKH, BSZۂ .S{V*UҵIwu+p<8=!/^W$CO:p¼}&x3 QFm=je?J}##i]x?D{s7 8s0 +`Cݧ/,ݳHBglpj5@9LVuy{oDcP:ioF b?2hTO[[O:VNnE=HOL뗿7Žt>t:"wo/"HX3ɄP> @ŷDN<@BpZx ՔpI1T4\HJ5rm=玚`Icև!t=C V[Js׷7h7gwo _>#T̛rFP *QnlxGz #m!(=)t1i ZrT;?vƌVo-o+zm˃B˦VB]c۞Z˷/_~3E)*xz!t+o~d=*ȱS"ڴrE` wgTxsV[Gf/AGO~B6%|$=i8o+:{/]P1Y,#YchQTJ}1~@[* T_ͫ;)066-"IY%ט z2{:%Ѓi|]mcH\;Z`9cv it7V[]S0Gw᭩j}N_:ܴ}f}93 86rQkI `< z ht~{~]a|~j駳Mz3uiAh$VCrdyJOWob\tejUE$j.HMNԇ‰P8QNT|Y#8 FA 2)#TSG: HRl.|5]x1*?Ut:bރfv7mʀi]~kOk ]aBN8 F{>Hs26QY{:jy|;Ie"H@? #)Y8iL f(hܭ< +*X抎2Hk K$D6 H]Ž fH0I<" MJK 2WISЭdzM\E̽;&z){64Ӎc TPo _B*>f"'qL%P~@ XNy36RM?`YeEXVK̗&Y;tJu{k mHEOcOZH*ҿ{S ;ڽ1GrsN=ӽr# 7{nn)hI"7,{Ǣt9s^0'-]9[Ba\C`{tdM,0%J¬`N2[bpIFgf9"n;1鴂=E׭tfwFp;$+ ^@u l á @[YveA5Dnqڵt \]"4EmXgWw'\p!B˃ G߃@DD?/|w'ƕkrlMG_?ξ='4g? 8*]sORʭz@̓ yG%wv=.|>XfiJ6;0`JWgO|[DKF8؍|؃gúIdvWӼp߅/S~+o ᐼU8z;C5N(<FNc8E;`$|h={#Xz;Nu&~s>xmtQX/ 7_m7bg|'!.rnD)a,^&K VL*c2K2J h]mo#7+{̇dl6/l6i _V'7Y$˭W/j "O*vBI)][:s"ẉ̂R"A_q1@a8 i' PsJ]&6uf̜vbW!7ÊqPVVR\3k>%=yLOՄOӆOJUp/,GrU+N8 3y8r+s)'[j0Up4UܤϓOWU>1%pu߭ڀ%&-/&Q.{f*=?݌?o71B>([<`j:~xi7֍ /w/_̚2~nTFXVg ~+tƓ‚< n?]4qL_y\|> M6Ԅc^s<&L3NZɍ.q(kQ4ЪN&RNw.̏BhF>[y4nQ O2f$ţ޺*#>IkoJW0ZFm8GS 7+flx& $+Dƍ%㻨4$'d2}Gr+?RlH<J/+m_we>͆XO'1u9xMPԧ縁͏+iF3{4:KՇqGt6 I1Э ȕd|>5MKs~¾߱g0,$+300F َ>nϊRbZJo-G3I#;9g?'ҁ%oW Ju^h>Mm/D4kc(fRơٗՔtk|Ԥ8hNmJ|\UU_=CǦI/S3Q kbRW$ pVZARf@˩[kZPpZnC5PŹM&Z Maw 2J/B[dw{Av{7DkpM14Y"(Nj]]P-5y.e\KvQ-n|toZezK#)'F?ݹHij夜a҇5^}Ov7DzE ulfܿv5EUh$_*\eo ϟ9^;^[ {jg碙) ݟ^tQ/$P0̈́^$(i9a 0}OG,Fȓ&5sP<gsЎGIѮvhv%:jB'KZa(i[9N$iw wlg{ @vI0wD~Qt ?ˣkp WQTvW)j_IT Q$!Mz.&s?ݹT= V~YRw*kRQNŖ *d SǤJŖ"kK+lTq9~=D;֊ohF`5Y\;E P{#K;~57$n+bJCWD`V1:U3Z=3M,qULxUqypu:$Bj"8iZXnEJ45`S rSU㰸AjJH.4 ;ƻ}$ж3.'<DF*VRʯ (P) ęIKVmnz%ʔULPIŵ\c7 N[-!jg8π(jϪRNERG,ZV.@(rK qMQ*ӗ TIhV5/:5khv'"OZk)92j <`JSg  ץPU9ÜE #ϓ9a:y#~_ b8OW'8!_8>:OGq#6*wnZi2E>}~2i7vK]㺚>#Lc׋o}[^?NϞT-ZWUNӯΧh9tu&(QUѺ eu;b:-Jm$Z&f;T6̩H9}ѕ%?c5I5 =ѕќ7uT+Y9kA#/Aj /lee"+$mDM(>=Zj^=צ '\N3Ar<ϵQ۪K6) n\ӈ@ĵ#ԽZ6NOJGW!xI76 }9߰`N4{CxޘKWQBdIw_ UsUrIo]nծn'Y]Hf  /V8@eM,ŠD W4T_/ُ^w;uw}Cd0Sez/r?a ~z+Ċ7bMbTcT F)b9~,'4s: f4#/tpv5ݪ b58o3aQçBvŬr :_QLEl"̈)Ⱦqi8`+_ychK`_2@Eh'2*@s7R3 #غ-q&s{?fphו G:#&s ])yaB~v2=2ֹ }[M@_TGK-DLWg~.Ds[/i68 wgn\|0?'3e[we+4oSi+ߚӇSV @@&d5ZX~)-q%ԁR`p[IzG}|+LՋ}3 絆{u[5O-ܥrUs}B ^~/z=Kr=ӝkrMsQtTj]uշ}#'uƀsC<эwUB%0DUI t汴T\~Mu9A S&XDHb}S&.xrk -=acKN ENw@560nk}FJ_(F!8[Ww{Nś/ZՌuHApIR5␍_7~ĆYIsYls%o-E:[.7_Niÿǹ9<)K2o0[ՁhKi)\ڕJ' ] >\0Vm^ϱ֢$%E,(bIDQ/HA+XcVR@ň&@QRJ ­gAD3*4tŁZ%YhiSA+2I85F8K8E)7q8ja&L`kuvFҤmҿM^Jۧe,ΐ|x%'.м8bS\,qo6/gwg+]\xގI~o|o7G&bҘ7g_<`i#8]|lv 鼑ok*6迮 l>=X1Wd? !xgBf"+JPrhX5~ % 4kݛ7(eESi %$ֺ7j%[TG%EEu>_Oee$e#ʟ{$ZPTWiO$} !HF;ِڔ̙xxiS~{83R~ 9wGazxr`^ ЕobS~SR?}<-MK4UU/争zJ:`D;թ+{~9q5pԉ/'uo'MK\v{WIJ|QIѭu$%~JMpQv,x)ʌ[&@ 2?-%0)g:M%£P)R)W ,@d \ZRVaI(3ђʍ8=g V`+' 㕭4hR9qp8.@')k\ڈh%8brh}-loݷ)z _hh|(㔎Ecq'4]z&y-Y?=}kwa՟5ʂТwgdgns-n_;v2q1G7;w ~~#hCNp쀃y?u)J4f?$ 7e*C aj.+]`D{Zp'Xa0(Q cvmvZ-4 R#\ &d{ DH6OfE [XBMC$*A=>H~=~z/~rǚĝ-8Zi2xWn7-bd~U#3=1R̀_߅Ԍ {g) wa$5bM,ܐnx'6#}ut|&XfE,]ѺcOJ&v:懯˵e_ɂy[{cOﳓѾ BciWbȇL'[SEadשT Q0? [(bx'uyk-TBZW@d "Z]^[j -" \mAPZ*i`uR/4ZJV,x[i^Sf6edS&Hu{0\:+4dqAB9%l`U}<)-F.' 
\=fqtu4M aЄ"R8Dnx  Wt\.e"ׇ7rGrhP#z a}ѻk@:qR3=z BHpԶB b~"2`Ds!hCj'n*Y%IwuՂ҉ġA͠R .z|u-S ORU{vFRtQ]}9FP'w_'~۲cpWo>fW!MSXLX9>apKbI2=L+*$n=fW/敤J.N73'q3 8Smf3ix-ՔL=2&-[rL[y!gg#cʵήO&MdddRɖ}nl"K# j7 ?j^(Ѭbm.__ ک i ń{E>_2o{ L()Qh 6t)vN% je,”NRC+_02&] 2go_A:%kI Т]^Z4_ZhFp\v]}"RE#kWJ#JΓkii$eISՂK@VVgBW?q¼@=wx2b}A53+ex'nWyI;߻c(ENXJPiVĖZ1-=J\xYUYFpMDКUxI7o ѻY!e̛2Fiq))X,Z,T{ﭦ%Uΐ54k-d(<<T&ZtoކX 'q!ppNڒ)3+!3*-0_-NԸ+\\'5Kn'\" Ǔ%DEԠa$0FR„'`ѐ7H< ZM\gbd@ TDq9C*kӄ>Z(Xn,3wxx@+٢3۳8"g*sDyJ1Ar2xs&ULGc!O,_gV|5ߓ:;FaܢY~]-+DzJx[1f5'ikA HX&ί^ԣ9zSddtT$ۈo[t[z.)$zPTaJcҡأƀ ɞu;+6\:8604 `V]88(yjDc\jsY"kh`fx0.:( >n+&"^1CizA΋ j>z\# V;,_7UJ&R}arLѠ3ZL=p@~ּF0}ty+0]cbCn½ VVJ+/JJ)8e,0[,H mc`cۥtgOˡZԪǐ$N<u,dTS]6}ߊ2HDv mܽc=\'+)'U|~ ߪM|Uc#Lxmץێ[E,aR^7hOgrFz{Rt}KģP|7.$ ՛d[,SV.{jE ?"Mx[,eB)$DS1O):Z#&ڨW7Wp%@&vV]Ӄ; 6␡CNhm%'ͯ* Gfs|Oҥ3r CoJ4h ;i䴤PTvuXӪ7k bA~ct? `XK Kyʈ g)'A"Lqޥ1Lݛjqm] B J]]Vdk GB6K$] 9ǖD+βP)oYEp>`8RxFJ{RDlB#lG"jY>KtƢ7!7|FnLNLSS{oIѢK~~2A/{k?ͻFs Ϩ~)F]5πZz[HMy3ǀ;~2tía)Kt0&#&b82Qoi6e۫iTN}1+ux5 Z)kk8*'b4Mx9"jF:+Z,VpB"_Ն!w`o 3f >JuCrCkJL-jqؕd}s=ҡ_ b-<0`u'!T-"h]N80hmpq=w+ۑ7k.3ܰUm]ͪSYu58SwҎ٩;,^Άmzt䗟~~h\ Q%-*ŋO7Y*AXϚi 7~ngt尷Do1pw_~:K3 0N,LpVN0Dܙmj)wyE^-/Z$b{Q >/V/7KzrGG<㨘)m8 plhZQqANcBS(=y01{d?n2d\IUhIa s+I[Ĕfqƒ62TΤO{R~+qME9" 'y󧾊eRܨ0?F [$gg5H/>/3Ok{Źl^VL;%COkPkSsCFw6NQ@G&2mS9B9ns?h=77OaʮG&bd'ki?VX!>k])d6Y=ِ)+nQ`܎h®͈Ы?MDt4?Vl[]+L@F࿣xZ $tt5HzP#fwR #z'6|A6X*WVp%hQ )D)%MQ1O$_! ED!Dt8t+3HdlCҏs?Q)R-aF0  ѯ??b|ɆOwwſ3wf6v<nӟyX"d#UE6̒ӣ1ARDZH]~7s{!8VTc)Fǫ(Su?7| ~nLka,N> #ϑr&-rA&g2NCu.\ROqQ^%[TMFΧ7}<_TA7 Ub2Ban`2KbAgƩkz+ߋw=l$wqFk홡\H%E079" PAim&% d.#X &x 8[w`Xg8s+bV?`'A xJpr3o`-E<̰ȃ}9`Ijrxy`Yvr2 ԰[JD8 ΩąWkc[lhR|bZ2r-`;B܄<{p 3,s`B$Uic硴ӁZ5 SiΫ۴ 3mSy)R`򚶩XaXJ0MhaHi٭e8LhHh-/g"g?ߜ6ZrZkPk&o5NBdZyDbQ0b~ ln~ui|zO';?|g%]bj狷E>V-[(߶].V≷Q3.6F)P񌸱i#:#V*{Κ+ 5@3Q9a#R !}Q;)"Qd) WFj:ö^h/F@7#cevw.v]7iZ@Fz3dޥ%ὣ!C 뺬wgIe-4;sYH,XaUq"`{N J oQ/rh㻇{PqnҀ,!8 RWX9D3<>yη2^eaLi+z)")IͅXPq40s|5J-3\ J*HO0fFHE(%*`@K2D2C 9fk؁}~lU[dy,Plzm|oEO98N>8q=2 IJ=-7 --dx~D1cߓ<~Qj|N ""C$+~,\|) ~Yo_;3\RyXBd#UE>PcrkL,FQB/_cK[Le*Mқ;3>s>Yב$j{״Lpf~W^=wLò2<ΘF&%B!/Kf(ϬDqB#T *a,Ւ8 Lj *x Abg(X%tr s4㘗X fXkP9C[|"k[0Ԏ^jo,r+3q{zicv:m㦁%TH*޳||W!$:_1hK È3b?]kѩu ڠ}\\Lf D`ֳb~Di1&h$f̂l@pN"J PjuH=ur"G9 \B\VBpt/QD`b*ڂ.W΢XDVVVtuMYkUl; !; ї RɼFž 3!>41_&ŗL2^ Y4WûħJ3>Q<^|b^R0&7;Ps]y}ݛ7J҃S|lGA)(Es5jr WP"5Hl% #n&[quvYYߵ)/.a,;IW\"5zDv'3;?&_DMSϯ01X-;DLy}R}ӮiH|xct.xV8Kd5r?f/GnƓ;F|ߣ'hو[DJ骣;tO& QM-t4W?7.׳Ȯ"nѻȮ>rՆwsխ{b58Beri=b{gshmG dP2햶 =KHRy n0*y{&5Q$d;xERQHԼ/k A9݃Y<0̍c`A~pgnAGÅ.Vu `|~!":!RfN;iV }[ZU͛ЪIE1i/K$5޽M>Δ62Vqeh;KC|jXJs]Gs\9\w޳~Ԝ )rh}hTHkLաz("f52P(+E$ jeUk9DwiuCL)b_ 1N xuwVB[^Jݮy䓄vw! 
0CMJ bLn bHw+9kIn9kT!-mIbi-rw Bb3fv3`j`YN[Ψu'.[nKdJQRM&[.8uΎOy]wv:t[>vm?`NaQk3ӡwˆl-*Қml-,lu'T_1u8ke39p3>O H"'?6j>iX8U2:άԍR[k7JJ.QMO$M']M`!%XOwikv P F#cZxPC_n3u&P\7eo~'mQ7bÁ,Gi>iKT]?;IGϧ F57~1Np$pLr]Z(p$NwN SLpSwɡӑ^Z6wgxpryL|f(*brmc~tb _GOINMEj2ں9]xC.Tb>#il1WA:`Ùz\LEp4'oWp(usG.npx˞Yw mMZ*wchQErn 4SV!b3e "T\Q̂1*4!VkO59, 3cac℀>80)bg\I!Hԡq~Mus<ӝanl?ۻ/]>B3 _~~M>bHU 1x `> _v "@5zd:[Ψ[CP_i%c>3$X(vtB; x5k)%0^a4ĶLa̩"BEw|Bw렔rB(3J{ZԁTv;j)k)tjrW,B7k\NjWyt?\[&,BZ("bPn'I~!oL=TPh$a XSz7` kSD7j&8R2tr.5mTH'h> (qNJ=3$ (5 MsR ޵ ?]]57y۹a0Z ]̽~|{H/.ʎ[*Λ &W}߾с?nG'ӷIҷ wR Jc!aw8.:ƅˆ\HȌb8&+ҿ]1 Lʽ E1Fch $KI 0Z1Į& ۴bFl`(l&]6~2*`aoaE%X>^b&[-xfE6nFO먛Ĺ<֣s-Fb< ט8X\\DuKP*Ʒ &-`?W~ K*<3 iE2[3$M@oj]5Rd쪑ɛcIbB,tyRЦÞTYOIeux{9]Eʎj<>͡jdP+Bm?5spp:G R _pݍ 8^?$jrRgnpG9B+v'!BX>E{yD[ B `SzPW!^6?eF4fڸiv/[ܠ*vI6 :}[}D\)վ;ڊ.~ K'F2hde{J 3*MTb I,KREETUg!LUM-m78د6ءs(/sh #骻abEǿE /ɴx]9|z;m\п؟&sJKD#^y+AuT(:GwYC.9 Cɰ]9Sh/ݧwNjբ9O!m=,$/js< >7WKdnՁ e[kē]~mb8=wED6LW,T+գzK@elt]\D5\`)WÆs0RiZXBWR#,u_WA&TA~řZ^}I1]hct9ַ 1TUM"Y r.?(By.Hʹ`iWHf|]IEɌ(%.Q{N_[>K_u`|N^) `F*a_m8CcDqTb_oMV*ܗ~g|6]m2m `?]2h(;jbk4RΡgDV {CiA>B#Ҧh=e{OQm9~<2l${f~%HŘT{vuG(us}Hs1jSscAene7svMk<9pOd?,Q=u.frq7H. .pxw)UOXG&_d*&ekt5/!~\7&NIʘNYW6z<{J5n8=IL[, @kq:Rcdd}N4Q;xtܣ͞ƒ˗z}S*۽}D?#Uhk^$ *kmoLb7ҥ!Sϱe"(H Q|+d"YE0N$s 4 Њ%hmB `d"4Id' %$ME ,Ujl‰1JbCRi& -miU$rT;$IJ%%.=\!Aʍ>SI|"w%IG X,Q,T[XO0%A*i% rLl|B×yKyA8Sn`ԫN\YMWE:v(O1ZGPY8lY7`Z#Eg0AKO\Vk IJ\NxFH!;!`xcĚL-)(SB)J$sK2f@0"&JcYsBN7wfFl0,,%S7y\;!8F,5DK#2*yVi(d;/ ÔH}@ #L93əj}4KXD-E$H?dI5)@Y@A0mַ#\Ppif.0S4,$RQ.`嗢\o<?¬K׃zzC?ev.i?t0ִ>N/K-DZ}\$բ;͈}"]Oa2BxG!y,yt:n{ѥJҽk ѿ:+`?93m8@w~wvΘ--JC[_Gׯyp;?6?:~x,a,Ͱ*tNmq.n'%v.Q ]iBIw6Jt`iBIr) Cl\B@ Xć'lY `)XPU~T*!NsJA kgQT$ϵN5lH989"&Iz#umV*\$jl rPZxWm Ǐnx ,S:zyLD@O&}=x7˞ xl۠i) juW^63BVJ>Ci0xYw1a\Л]0drٝx̺u.;;vNmwig4k/FcgqWM-g}ǜ07atٯhZCOU1/)IU3XUWς a:?AYF?  9(Krg>U$!\Dcd l[=ڍbGnwnGd)ƹD[G\PU!!\DcdJ}{t1p%҆Hg~?n&WI} ]5Vpt`44Q\[~g_鰯?׸N@îzl&fI9.xm9#}b2Ŏۯ(SwBr-ja}J̱$u5W;3jA)X-icչKo+qV27lvbKia~}Aw׷iU(կXFL1i.-9aT=І*ǚCd>?p/.Br/_cq߲4S-lɭ'-%s p;ylzzq%R\sZӍ3p;Uժ(HI֪5^}O67HhxEr΄FBDMIi4sjEuXk=6Z '>㹋ZB{Ұ4_&]ܪ !IՓ] $YE ,uԦvDhȎI :^xe;kA V{T UՍJB苮¶?yCP~US-Ñn/V)$u;+ftTS&.%*0*h8> ?]]Յ:Mgdnfdnw2x~a𛕋m+8_\BɃvXH;o@N0 oa8>+n;bC!j IdY K x~ [X3>͊x][o9+_{&Yb]`3rohV=?ՒlE-QLVYX (@I~p崱!gAU si`R0|д=0ڝ0l8:@=1l(aj]3 Ja>LZDŽ1pZJ3pv"Tw71" XGͨt[ZjҲaqYp VV}.w Ja={ WbZy!m$ZEx6Cpap;X$__܏?D;~FȎ},B,hZ H**DhWnNv!h]S/K{+yKT%P(o sJ ;IrR5tk |ロ{GVo7h_<8.hرN?d'WZ }UU&]x;giӪ|'/Zn Cw Kku/]S񢭲)R,3ԹHL:`.XY2BG4s84qNhi=Pu7=iɾh=ݖN6(DKi4~pRPfemiipzfQ̡JF J-N:2Zb#cյZܗ*E"Gb8=2eD#JGa%/~ ujW{<̻W߆REuz paâzGKQWu~Ug\/PExE+=xtsR1+Wxuu幐$R Вh%€Dzň:Vi8ln?ΦHje"3lDڕY% 422YJj(Xi!X eDhJ[CPVOvKWxTX/_>20Jktm{QE' dRk8Ua?Pz~G;"ZcQ:ytRdo*yU OWuz2CcУwsQJ+4HJXl!i)4&6W1 oy MO-l$t;Ԭ\1dVhDjNX_gQXs)ߛpA/m+@SUYo$@So--5RIAlI Nb,JVGWXbP,X2fA3JNaMϖsjq r(A fy11bk f82M-m} .{ vKa쵒v4%v|l6ZYZ#s<.CE?f&\R!6]ޙ*P 0L_2Ț y`T 3DA6E[Zv~trEU0l9? \咠Ӫ:Ǭ9Ysj>H-;U(~>Xx7%O IpS),+A{&:f7:@D{e,۠%R =p7{;񣟆юo__0?}fn`/SZy5[M8N&ڐ ۃDKN9CKqlx)n53gP҆$HCE;].vߌw;Nו^0v4܏g٬ RD 8NæSMWg'=/M x@:  nI :.сA pVkKʝ+hZXGJMME9zjC#>%/YdΛpř NS#QhqH04p#R^恉XS#X. 
Ɉq(kޛ,CbeIZ ɂ`åucYF4G$'ujI8 /t| M4Pm6Ғ"\θH{qW81\HĎ.P8FgtJЛ7w,AMUv}TxiңV %)+]UlED]*;,SR]PBN?;.׊G6}]`IpO"qCD (nyU,6-7Du+IѣdZzÅ-1-;ЈfѭK(kbhDqc `(9!yNDIEr^h\DsY9 ~I`N(#F%1m)'AFĭWUbxOPYbq)SֻvNlFxwBJZh\VnnUñ-e,Ci@/T7I eၗ#Kչ(ԍ!dgYzıRYqY2LZ{neii&UD#: c CKnWyqd\w faI)n1#SPT; *,߁ėT7}q(%8 HM#hq4M%:TKbv2Q&Ұ(?\ҝަB& Ӣ'i-δA?UHb~zx'4g$gˠHwG^aRN} IYnO/i_&:^tE_'7ǫv@ržj¢e}$f+# (% dd1BF[AҒ k,ǖ 9wfȓpxCB*륒%-KUp•UY,WRYj4:k2E+qY*7]=?L%GVʞ]*ҊFI:[<3h9]4IYaM#LF~4ô}`5hɴIrɹ VdGRp:VgV|z^<;8RU3V &|sI,zinvgo_0\Xk)_Uɧ= P3@%?g;omۏi)DgH?fPM馷5Kڶtb]]r3뉪MmWA^{jڝfwF/QIj|-O8b5[j—?E;ysy( C~m>ܻ <þs4r갋Q/?6L/uiGd{$g;bO2_Of2#*-s(P]stf#>e?Ǥ4mr*o%CNFf5<;h蛅[y*B_[f<߄]=8ƛivYS炩G }z4x*PR1%XHǤ@z KPBծ9cب]ٯQгGiļF5<;h蛅z޷اφS2PopLgCȷIwS[TӯӻHIC/6UNyGs ϸ)35|~aP@K{:᪦7qĐ e;Il#T"BݜLݰT^{pa(uZIo2(xL~UK9($d"Z#p֟TvC NzZKaKo"&'eOf2O I 1;aӽo:dQ) }gokİ3 ]#h^Z0F^H֯ G L|)QqmHUVsNfwbGg.O&zag(.ġ9gm8k!z\Op pϳjO#U[_gAXO<yX+a鯿e3yP1xIPNFض9¯.Lr=*<I<\kDLa £3ҽYnFsJc~t܌5[-+ "索gӊ7 NQ1̅} en|@C;[/(1 $.:ڗauSp m^G*cb\=m1趮 #AӁj4td@k(J41.GQ P4? kgAdK@6Bn$At,qw$tjŚA Aմ+60DJ$2ĺ $A1l49"n6qTKzLI8ZT̡'X$i─֥HCw ZDRN}5}2Rb㩝GAS1I@e1c):Im (ZQCSp1[sjxh ekWr 8ٛ-8i03S!%8[!mz>"PdMZbu׮I#;tKͧSB 9vƚv[״+CRǭ&dg\PU]n|M=8et9Rm&oD/6Z]͸z^vmKy}ϚsfOۋ &q! *X5šUcYK:ɾ-~jWPZQLFX42xEEC"2 N˯^eDT)xjTj2!E 0 A>Uf4^v(U_-&Hp]ԍ+.$o1c9]#.i@,៟h4_y$*IFjJf2 ,C_nJ Pbo+èxBbAn-**F衲\2:p)!1w:Dp&Y|vUV4f9|2i#2:kZڕ4L?Wm4K'e:SY!hўw<!(Aŷ6B.twՅ`NfoQr31C+Ew¼5OB-]>#ug>|;fhͱ ΂(1bZ97{w*5Κtr.y˥#DZrݛu^՗kyհZ^5>y՗WT}1Iժl).Fs آh7tVv/Q[;h I(v.:].:r3Z(Qk)?1co7K`I,g Q pU*UUkisٮ"š 1iAWs%Spd.˚u (^V@iB丢LaK<Ѭ mQ)1xdxieL(IċgԚ4dLxrlfXRu;0,*]`C[Aٕ~]-^ka"&]$+WWr?s.TdK*?>}|zƵz~_~76TDXTL &?$/﨩~Ï-ߞ]2Tɭ],>ysRRy {|XGM m@jV@N!C۔K+ i;N*zLGрxui0P@_ܖ fLl3! (fd͎)z}wUx\ڇ0:$yc,;0ٯb4l2Uߥ:|[ㇱl L*,T do>Zم_{x-b&PPp[A; O^ll%.~9vx[QN`&HiMM%7IDe1fӷ|E+3UX:E@:Ÿ-zk){qxu(1 kܥKZ@Z҂eD avÀg[?6CZZ! atojG z6+Eƣ" Nĩ=J!x-t2@$uer"q#!4h)Ͳ$ D^/|3ߊ'G5͆otp킩6yx >8ʅb޵5m$˞.T'[I6{슓KTt$+RIR$HxrY" b_e,rM"]ttL>"-quaRr-@|z'QBb``}^}YP; K"juP>䅄$8 U@4vKnV?+ytVmY39ctok`}UzaJB-Oce"l]_?{ DlIWrq𲭤Ҕ/TUU aGۇՂbeP8YzV#q *pWJ1k^>+֓ڞa6}=TMF|7)XJЪmR?FFsͶw7ksE:D.urO!Ǹ!'4!,xVnzxtk/ gboP):`lּy0F@яЅRi+杕zb]5έ3reYaRi-6UƍZ|ZW*p#dR+;2&1F}t @-8ٛ—l%hr /us454qtN@wT ZKѵ؄3xI@nE} D ^$p7g9 ;Ͼ|oNN΢u* z3Nh$m_կ|CMR;;!3TFJ($8O#(:Y*Me*NpW8aG"F KcgFjDŽV̼"`uP̸̓qH!7|t^XT^fh;JHRXF%z[ƍ Vܩ\DNm: 3Ib+f"pu,b+औcfj: cu*0/G%i| 6gUB bIIi -Z.W'{zqD֢ N0#V(IE)9^0ͣlpukO|RY-2{ ?כ_=w3gYūs<,'J? 
v$ې:ן䐰C ~ n'9C]]!e')߹鷣hwnx72ڐ!ۃS)g8Y7\Q{81@i‘IA, ݋Q"vQkԉh P2\;Črf#AK@D$ȅ Th=i4'$'>1axnǒ`$Ť;^w0&`u?'w.\v49{5ֳg}?9'#/ #p@ʁG]/##I/zs3ƒFJ^DY'r!GG|߈/CQm( }ToA$I҅A},Zʹ0<-W N[IS)[W%5BxH1@??U1Θ&(j2AH77U(t&64C1"D)C'>&s1<RyS$Lބ*H޴F3Z*oi?V*@ %B *yh 'ESy{h)1^/)oZxYLaUnA.w$-Fg{Fʤ6 3&:`ɤ|M4q⽯[ R_nW7*1jĜÅ [ og8%(0(op (g μV@' TT= -a&yvyI"qjUcTbԿg :dk>a8/Z#LjJ,N?J!cmMᩕ&cXޅl>E 5z@RB5ڂLi,u`IMR2))Гj `)iW& 1j,aTOv.QIzx!+=+OopX%|]qs*L1@W{ɷ4 fkX@xO\G5c!PG9j"Mm&*΋i?{X{ p?7Lx֌fhֻQO>Ƴ*ٗ۫;{хx'tM*U~XN*>-G%%onT2xYW|f1jl]V0ɥ;BN8k]eeaV!j1x0x!Лe;;o#ZУ磋Br榏S?WpV+3;QI{~\[+rA  Am(V pyX+kf)|w1oFhiN ӐAC I&|V<1\| Twm6PbfTTk)4LmN-_oMʴԚuٰc3.RЪ},4 J^]ަwRFIh²O:q.l7?z >0>Q!I2 ^~0(@Ma`gi ɿ.3{fM1ۇj$ 5wޫ nBz0(S]_9,d1 -uJ{ )c#Ir~rZׇo&~ |8|n`|3o_OΖS 3cc(Otno$ tR\L{'<[ϣ~R/g>~391nCLzCsh5Y9;oۑpm!SG:OPbPGtڎD0zE.US_Aݺ.mdJ-1?.gBZ=McM8ڄu3AԔ)pK }XrnܔX.Np;WׂdT))e{3k Cu/-lUG uYzLlCkZ/w ʛ aЯ@)@ΚgMs=7"ӫ0I !VO2.gMD&UɃ<1;=A:hp[볝~O1Nfn v0m}ћxfVY9]m OX~ qǗQy 1G1^:/V\v/:Ѓ:>|+1@hN%WP"(S; 9J;cĊY?/ү3wF9<槓 /YL1*V2 L L L ,2p~w;J(4P_ݒ[1d 2¬\KnLjEyFii<3"mpt6'8$oM?卺o=l&Joɚow̭XWRpd6"R&Pb &1j9ypAtɮ&qBաƵ̀U !ڙ,#T;tJ#'Qh9L[` ƤZ]^Z#>Au&_3X\!=M ^|Vv{_ZPn.C|r[O,UoAjOaq+*hp?lLRJnvv/_.5X}TjkPDI$E8dm "_7~`9!W_z9nSī a㛟uulP|P~m$W:Z_o$,:f0 h]/cxGFHb7E[ُ'l3kDn["a6Q-^>L/N2>2:[Z:68?8zl;{w3ВJ2 .|g9vT<۽ؚr?W`NEӧWH<u{z`{Nye_iC8t aE,qb,q| !-%d49'F80fGͰ+M8}jd(Tn0Eӧ]nf0LȜK-usS.8̂-ϴEZt#&]1ldG ]QW|t 6 &8{hTˤHZG8IUjYy{,М)i <)ģX*`p M)u'MÄ1$ @7&nDAtDGq}* kMʕe6si'2\[ڔQN=nSM┣)b'fb*'<)63A %s PdN)M5ĩ&P+.(,;4\f^5H]b{wxbPX5f\Kqi:O=SnOg~ON_w"1xiRQ{8*;NϞܲ\_htOR#ZMw y+ӿ_BUV܆{֬/nU"apGh O)FF[I7J=h(b1a9H3j@S%n@|,SKa&yNODK Nx:< B`V!{o߿hY仳Ϗff=v 19ve rg{=Ȯj6yOpDr92{tY_AқHV`/ڌQ%?P;R cRRcHeq(rTH. _qd:LO>`Φq( h } Ddd̗R,vLYev)=E7uVE<8c O,iY|Mqw;ս/u)%2udn#TM%?~0/3T_<> -˙`1Ovl> :[< #/YkBj *; ~!C=lteixMҲcbw*i?\j'h"vUWVyjmѓ4 %Nd6aȰD;N0e4ϴ!i.yfQpה azW8OANcj',,1dcP:YJf=rUB&J#𐇢ed٨V3#B;"\=.m~!Rȡ YuP\dcҳށĨ/ H 9u%h8z4y~e1M5XUnkN}ö ] )AiofaUC.@fNa+\_%Bf_"zd(2K`_rr!JNWmb}vpu 8>"uofN0Cf7-83EF#c3YkTy""3 /EkDRP]d+Bkm'/fe7gޢI7~R v|Z5Irr~Ϟ>{s6OS vI:Jr Yf%RR`N#4ː.3NoEN>t"tnXDMPv=/iwI,zp2j+|z." +}ZWvb yT8/0LǨ>*Yy}[. $S8 ^d 7yz^͟WwWu+W &Z; Qd@}^Fŵya.4 H(oC 9VIAm#ǰ?P 2!8s\r;ߍCŒ#d8Α%p1œ6HU˔Z*grjrQIJJ)}[N)A ]TaT$,CZ8$ދAJ]R,׹ `X*R\L(J8#{ؾeȧ fs^VKUO>0K}q=ɞמ>/߽|XT Ҁ0fe0 ?#]aX6(}wgmO;f˦xw6|l*/'1kUO3q[ ̲ 3t &f|͝e|2Йo'O/￞1E+y^.nfep% }~# )Fc %Tg%'>a^O|ɢZLtcxUTbj>&Өhn#KޕY.S_fKXщE&EGt:~dM<6ϜZ;D ҔL:e..c3*9riN@Sk4jXqa6"r+h{Uz\HFOb+x 4,`fg 3\ ֫GoMu֫Z-*;7iySAKö,v=E _mA>QH b`7|Va**eN-fBiiۤdb/(Bs6;¹"xhտ])c]E]ۇõ1T5x;"f4"vP,Ͼ!P9nA^t]b!,S},bEI`0dcMCjܟ0N(Q#is')z5 rh+;>R0긾KҜ⨤K-e6lL9.0`و)Qg?/2h+բ$oiJJ|_O~$']uTд\&׷јi2)nzB0D@ɷ:nhԬ\k}:Rps)ݛ$O.r9xlo[1Lhr"ERh -C͵Iu p'<坢%jOmq*tacYsf8eYeb)3X@5Zb1ˠaAKȒa/e_+y򄺒M6/X61!\+iY(1`c9`Um̌e!)X~B ü\E錁uu P)@ pש^= ThV4dJRĠl!xP&8c Kc _8QLAIu@B0#pˌDƤ""uq$Su|j&S,+XLXUK1z[nfbU*I|~ Pzy P+k[<%؟DW.9 d}sKs W{G4X¶6pfl~%߳b@~O#P{zf4:Eevjipz[P,Xݻi2F́ {<ƁA%|x1Dlw*=Fg0-3#>icf=|Qvܷ= >#~Qž? 
M<7AC_z_W2QuϲZ|[|,ףzrBLCS5\b4em+ݏ# v]Vo;ŇFTwyŰ| q]1$մh;k-@?6cǂ,7\!7^}(+#OlQe>]1ݺ]m͙6(i3H3bv{tBUz@- 勝}iCTg>i7'V-X}إ f1 `ˈ-̕V,s/(`QdHA4Cq(rsZs_ h-D M&q4B,O >hvp-ѩiZg3@`ꝩw, f@ELTL\Tj8*Zk#T~J>@1` "_!:R#9~hB+J$Gg!ڲYΊRDÑ CREDe_fX&Un]0HFt8qtUN(xT(e|7T1ďD3F99'U%Qa-YpeBy$]rn$ڭ9wX ZHEI*dݐg\w{ ځ\1rf݅,6OCSC}8yH > !hrwNnmACdoa ; 9զKǟIw~@+ή__;=ys3\xs[ݫSiuJo`0oQ̨]<o`F-\hH"ΡI!1s9xBt]ujyp&1 3{-!CjsK)R0uV+28S˃MΘ~po=uqdPono 8WV ?vj^|6,w) |mPlm6CU~t" 㠤~_VCYW}tA +7q5*gj]_;0!J)-br"%a,]gշDGS IpԆJV+\kl~bM; /_a|:= )c" ,:H΢<$kQh NaA"bRTB>ֺSN\t\Vjs-"H3JL+B?Ί$#A3Y@ xJ< E@f6:ǵ_痏>nԻSOn{]>`\a3k08BM2ˍ8nҀSLsVgG싢#'6܁ /L,/FӔHsQpNԩ O\1ANJVPգ+s ;(6׊G%8D 讥h9pV*8+᫋wzR#06 s]:=NZU{q=ԨMxmC.$F߶rd h>Hyv=^VDЋzNюQ}$I*RN  dF[׉=Wj~KPQr,D!XsidP9~V; 32C8ccQ~ą#>#L;vDi)3X9 ڕh|q`l zu{k17?|Hknw'BrtCNA'-2F'S-jc:+{[Pԗ}pJLwl{S>ղX/mJ5/ǃmT_ ^yQgtCmJx70^ e^Ӭ (hɞXjaH՗jn>{CI ^,@8#mds@RӾB|t`C JKKKԴ.QSKǔ)ui48G%/@9cE`Ls*Rg$6zp#*L%Qs6pBSM[= R!nkt_TL_AXjpAa\2}RL.rE GƖd3p1  C Ȅ n. -2e<(F*d8p <0e['- iJ&lh Ci6H|zjTj|6wWw.^}ܬ̟xeH S()*]]_S^g|!4d:RZ<\Δh tf,IB$R D31`d'- j*mlyTV9ټz'$~K^! g291FQݕdL7Nm !?n?έF\嫯d¨ 5F_^'^)( <9roNW?}@F٧l<[,_e}]qCt"ֶ&@:pG{]*+0,KG^qY"t v%z]D[jHh>7k%Vhzib>r߂F$u/>qsz Ribj"S[8^u/~RJs! U+[I 8hU+ӯEYCIB1ZJGqL ~@PթVGby& 5BUhՇETdzR(s&T/>͵ Wc$nB;[xy U {( Ѝ0 ade[7ږU@uMhV>*[*"9kTZ!cʄkςxeJKZ5BhE4\)@ZyqNZy |"E?ukiyss "fE0Ž{{tRn,Fvc-EX7ccKK^1zvP=UGEGӼDQX+2 c -Jogg77ֽ\EeM_=:;VͫDEt2 N44A&ל ZA_L~z.>,Ux GG1dȍIuR}QY]s4nAeã3ǸF"- $T y{ L۽onf5MMJEL;;Ehj!v.E%~=BC$](duK'nZK#FlOޭkAArBr`b^Eoh ~twW;zOj-nEq67uq |۱BaD^: 1A-5afRu\iK{Ti&q{_ϊ6- xτm!ˊ8c )JtK[R3;V鱰ŒPPB؂yy33f'eYg$^)_2YWڝ$0gC!.uXqٌ;b U`:d#&:R ;L.A|7)E>K;+wVXf%XJPdf\K-qLQ%$sPr`i*eSTr"3uf ((5t=b,m9oKqwr-*+1Q&!C)K1REqF1+ BXjL4 R "+`S@bg7 &BA; UqBر&5w~ &N vWqѧy 3>9hߩ~1ʫy:I̎<(Qv]x{{+bq@eHe+m8ʔQ9~)k3̫ˏ xEd1%Sf`-[De\ZΖbcI݀iS*4T3P"gȌu`)fOʲ\6o Kq{X3* dքI4G͑5{GK8-{%yVOV ? 3H͈бJڮw ؘ(9 1=9]`}~8V A@ @d(vyDՆ; 'ɂ n% ^Py[㯣v۵d87&y)l#(9i %T \(șRrtR#dӕԬ[P@a=(PԨ>bt6WZ"xDX ;t^i)VnRJh)hARn8fT6I%7V\Τ1@yqAYgy{KIHTAbUݎ-UP>Gv;7dbSN3zIMGzkk^jkq NAc/{9D{i/c&4& )UH%҇c.C'y˿\=~Ђ g gc\)oI>A\yoVkDG ScD3~D+Z·׾?f zƨ}l+cƔ;ߚj1b;A?$>2FĐctdph{ަ;'y|NλTo͌Ϸ42nm>;C-h_? ʹ޹XOFK88"=5312ウy6p&@Tws퀄ǰ4&՝޹XKEό`'.~~ӿ۹SDFq pzq0v  uFNg8.88n>ڿMpHC}w`m:bic8 >F [6<ȜmxES1ULy{Oh6;gm[V/A.ml5x .N'{*D{M^@v12I6˿ YRXg ;ŚmcdR%sis )8#0&4ln4][+ $ ǪvT0030_v7P<7J _1veHQ@NQf+H6[7zfE!(gLTJem#GjjGx22V"CxXwsqi,P c4!T(̘ ev-:Ï"vfz8D-Qwph 3/fyAu>H0h߃e$ ЈBT.֐aE2(K6wP 0*-UPf֫rA֢lS0C&9WHdsoGp| A}iHŽH[ipHpQrur΂q܂ -``,*M4}plsp?U|ȢD~t?78!VsHߧ,uz|:m@ǚiEc-vkCB^T%KP f\HEb(K "$3/'ʜ|E9XUDa5lNԜ\E@Q{{/#+xBWdPtԙ.ͻsu27^o8`@a%SH,VgT}A/{/u,QkI*[Fv!fX!~b]n$P.ףV+@+;!G0ep,b?R ~e ᷅(:V)J2#\(LL1DQ&(<@}T"i0,s[!S0U}i"FB5mnLjmmbDZi1B} Pvv;KGopJ`O}>| _FN)=Q0RcT=E8ֹ5M[g!ddmeYzF%KUDŽw.֓qTZ(c4*{a߱Xz1nqyRWnu?ʶ_3R` Zv-t-svӵ؝Kp\0u.v ?[ͳAnNq<^vD0:pTh~)8u,ݺ]g':OkYggGCv2nCrNwoc7<Ơ2ɝTaOՈ PAO<Ȣa/5^eS6x6>JcËRL0 EܩI6d1nxj&g $OD4I )Xh R霟2``?nFã/0FN7H<b{K/Mr09a;<09}><=|~wipC082r csQ֌hJ]پ~0; p)Q<.(A))v P]͍-($#Y5K@m _ic1K $0jd/T+Wzfa689hɷd-@`䃗|{i o/'t\zd[ ,A*eR,C کFf8ݔ==ެV:,:}N:L`bw~ Z0y+Tol@B b%sZvhB3_9"iD2'P %D$%FdܘI{Ua /$]VkY]2^@ߒγ/۞GƈrHDDD!fՁpV l]޶{uzDEk|#8~h? ?dU3~\~K辊ʧCkgŭɻnxPخ 8Sx6zaBl#+ө;)8 Ģ N`BǪtab_/6%zt_;|l鄛. 
b@SƉJŠXR1ePJn3I!RrqAQu @eEiOTZi{On7'~v3i\me!@A/(͈dU\*iPfKD8Pje*2HV:` Q Y 3@"8\l 10 k!H Orp[B@XՌ&w ^(`Oy'}t~0qsjvR-Q1.zk-uYzf11cRF-,%1/eٝ,ew˲|h=v(Jv5ilw̖?[Cʘ e>$7Z2G0ܼݗѿ)cE[|WV],?)d+F6v>=%;-䒙pJ#)\IO՞ؒr i2V@j@- G hgS!+صv vkCB^6)faU9kǠ1RfSH=>UfV`Xj#c˶gўlb^3J]A?呏~fͦ--fs b&ʉP>SRÏ-;G3*cJGU?A7Amx.n;QЋNDgu#KbXՋK-*[zP*8[ABBQmcRD z/u*.M6heqI uX_;a~dY%p m{,хdN׽=n~ wUuT4s^R9heЀ""t}/1Bݕz"R/:Cu_ MQ/h 6˷;#_ 2}chN,\t .dX _bRyW`,0 T<{q401Y卮 4u4X۟Ƀy<<<0gc!ϬѣоSڨ cּEC;^e_^I!v;MZПjlcM}&O '-O-R]>y(8D,S EB( )4Z0}2}lx":c}qd29$0 b:w td*mT'-X^P1?{ǍN0EҀy8' 6ɎdE3r=?ŞѨ'c$ꯊdU 0 н^=a E}腋Qy^3T*zz pF0p&H:`󝁙6WTxy!a瞿 U) lU&r[>.J7,Ȝ/dAeDRKVT&&K^(Rx 2VhGV.QZJ6K` /A!\ǩhG-IGKb"8 C!.(C !+oSt!䔑oDf3E. ug U5c>^K8S(şPxU"#ep PǪ 6k}@ÙhFȀ9ot&F:@Y!qV(q0ek ׆^- c~.6NZH!hp d`GVaSD U.|7|=5 >I9)'@EJkBġ (1qAHKB V@\@V+DK[oK Ҙ`ea&+Cm( ^e XE5(Bo(vӃn׋Hx_D²Ej?_mxwǿxkZٴ߱wo.>^V/xs+?\^ao_2`Bؤ7}Cɏ?9n:[ IW3Wv:nUTQmE=b'ׯ3L0 tOm cArrnJvC C6 a۴*h%[[-/e$2dSn**TarGp͉y]X,JJКR" Z-] n'.@qeI3b NDKQmŎ`7k;mH]GOWӽgJ!$ץDO,KLw7m\ |dv(k:͞[8f.X蓢/3 K?P*: zQ2UU{IfE$Wj(IP R H-WmS^ V\=}K}:էdoqI01|ЌFNQXԤ@:tX˓ U#?cC k->$R%s9xɄu ڳ&Gs["^4i%+52腐s_'7hQYHY0@! '1q+(Kh4u䨵&BF)О…yʸ4yxCܘAQY^7"$ P/L0RiQ 2 !ƈ-##rvCp`SA qbw+Zn%ܴS;ZݽNCo3BwbtFݼCiƭ+r CwJyƥǢQO@i--N[[:Z~o m: CH( V pGI5^K@4ԤOvryqr%r>oN$~ %P3 w1Uq$u勮>M>I:`+K l6fgF64NWYsDd]jY,H3) 弟hZʉ*g3Mv?nSh+ƂN^hRA /x(WڹBҢ(P;kחe(VCAMڥ %4&X11!d ^zcn˷5tQB!&ђ LcSB-5`M--ECx!~g,u% +Jj͔б)\xtS,Tu@sm+B\herO' m<2RQ dݾh9zD!$8G-=2mqssH _f(ګj+ʻ$}HK_}quycFYC܉l}˻4p1jwiNE)mK`mJ)TTi V&?ԣ]/oh͇SߋMSvUB*)TnTTCaj~%ӮO6r P1նEpcr{;je& ?MDQ=RQZ-X꺠VK P]WQm MewuymgP=Pߞ\{pFBq5.9:Nl-M$c.\6n\o <#UVy[Սr "C'*LE 'y^R3)YvЛ-T!Ͻ OOågt{ń3J"#&c)hR+Pc5 366t5%o&wa!Dlѻ0bb:m1}n;[M4Ǧ6Ve z,Z}^{S>'y&PFUY| P@ V~$z!ٹk{c=,2f{Jqǣe-EG˪Os5]86p=/?yǏ8p3;Pԁ*QgBQ"2u1ʠ5F217,%Z0ܛ{BjcH_ܲ me[,eo7\ p5PJ;e1ij}CG!QP8c\S5L ~ZJH^e)^?|F;q򻝹O {s_ QԳ)հIPfW&tm"E--.?O[Ac9,Z@Jaqĺ+/ 9Uh˨" %N;L:<7+ vNf684ԘKN٫BqB>2& Vru#ʉk-`@9!^`G Iq%PɌ3RP!L](ɘ,9V+2DSmI@N<0唱P<+ܝLwR eEJvb P]ۯԜ]5vqgg1M 3ǡGAI =%Y$2ЌPJj'Qؠ#/Cyp9 c31 65Uj~.5NW| h0ц퐖^&6H$BoSԋG便 o׋NH&"8,> '\T׬2; opz~ ۸{ azN~)jn:[8RO*TOyU?seh3|sG2> ˧O_C9g |y}K8n8-pa* 5ORݱy %u#%K%gҞrېKNZj'hiVm?|e3 C$=ni!.r[Gh!-cm䨎Bc ǿ!};OrBa&2ь V;,z |s:lAtМ>j%?DH@ b۴ .\Wϸ|w9~ƓXN4{kg^wW8'mg||ܥ?>꩏OyCsB t0aQt[W_u{-~F/;P{82l͡W/#:Zx?x؀@ UWcztlafcO_lWZQ&Z11!DPt~X}_q;>nfv?9J*_p2s',nF =wW J)܆ *__Ƞ n$G7q_=fB{l(Hw++<96Чݾy @`GR_O]b$ҁAwEIZ:>k SyVZɥn7V%|P6nfТ/A"pVrji)SR)I {d%,`hX[VrB֣ LNԉmŘP1uP1uQC,Aƣ`^@Lx)?{V yh/U}d .2諤.6)_TSDR<9<}Dɣy ݬKwW1AY u8 AZmo?7C8pp#nb- I'e*yR. 
x%x!pJzwQ#c|Ou\T'i@ Y U*dY IJHIU3)p8fxqۓ4c~iggՇacaGkJ5@#G{tc3{DWF)k\*>nYtVac~u&@x+NC|͋,$CrdgɎ;#VBvDŽ&f1pZy@N'i^1L ư"I:(% bS AhU> R9:Źt(^+ucf|{!(Ʒ`5KΣp$ nc̞@9ߊs-k6HTy  ȵӵ=2M-GCQaI *'3F@dbO2QykxyT!I<='1+}+WF7CpZJU[i-9TjXbZ_RT?QK?|$K] b2?r~F2Zr^%~>}gh!9<_8nr{vOe=!*kPcq<؞7 =hBbFx Evru`j{&rI6W3x - [/c<˓%e0._gWļ-zơ@qh)QmQ 2)瓦)3R"'w[>Fħ ȞP9;ZvF ChEJZqOv3wVc ƨ w?ť_/HӰuߢÍ܀eJ-`7k dcלmߤ.2уVF-)gYo-\ (Ńjk-KP*{bX_Be2 d}?!tNg"]4Q1|w+P(z B؆ ލ@0q5dݭu{6똿]O-wA"O3mL;4Lf׮$J:NoVyt\o%{,CF\hpoŏpظ7m-6Z>9J~TJb^C`BPb P ΰ?gX:h8џlrF+G26k|5Hv%l3zRSaXt6s^^x)AbX¢ ؁3>F+{!|^"X)7>Tw ^BzdW;op<01&azr)WnlO#8#>W;;% $BF4v9e-ܜQ$Z19h]ԇpv}|~.زp9TyjMd_]ĸւI 4}|cűV*Moi !o~=$oGk\ Z6\<ꊆnCã>ZSơ J}rP6 Nk##ln |~ƞ ޏb:u oߟߪG6c(4[2J(h5K2p˩=yrv ΅a,X_2i=4ZCqŢ(N~ 4iܟUVW'g)Hh|sBof|Q#^%iy^/0:),m\MwoE1w`VG oej|~s/=vG3q\^;s`v:vԯɆ2q+7')dg@s$JDCD@G7P"y=Y2U1ZZrA'ROȢؽ>+߼?@8rP(M1Ph/"2& HG wo M6oATe|DǜP/(~bm7o2f^(QRIx6j̈́5^(K7gG!րX=f+96 ,X/s#1^y d->+4Eao7KkI-Ȝl>--7g7 Z)O*[)fٽ>+<&uw>IQۻ8 W&3 .A,0\X @auwrל d8ELEN.?Yvr+E?qSF3AC"ʌyESApwRBU=r+tQt2EBV@$$!Ygs6X faDDғiwfƚ{:F;QB]Kϕ]8v7Iښ{(N8IMZs:l fl m(Z.$̴:,'Tur%$*Ӿ֤6weq>DFsϯwmFŗ/@#uM֛lUnRRVO(JH$g6SsD2KI($6P9L[*..RK8*_C{ ݮ$I+M*q^ kY;M#Y]Uf+;6j`Ҫ$7د"L Y$,E)Hʺ7AH](.-P5"qV9^aĪtSCfO"ːS .%#8k+4P|frbOx 1{Y0{= ltg%tH裮W6YbM2e5XoHY}%$>vC_qqvu5b*h¶ ^֔a)SŴ-ace Ĭ)aUBGѸU1AhJRE7S!e ňR-*geO|5#ן#nM~ 9e,h,)&@2dJzt$;6Z:/Lj8Em2Ta).UR psLq [4d*c*KUEǨ"G",%v| )RHBTdʂqTC+kY"vM<.^x7;3V1-"fAxK|!ǀmѱlIgM5qaP$r n-%C1 Tu1kd!eu$lrNQK^4#a#O[XZܼ޽ IeB@4pΓڋ®\ R]➥쓥 #}fΰⵟO6%XÐE\;,αKȺOFǩV~;((aB#$$.1$ (uP֍x|50=N&d%Mh%2:[XE$6w"QI5.0Y\m%,j8!WWf d%jѤٞ!$&WYOp(S = RF.Aƚpj e\Xwgdcuq`,@25KMY qR2dsqi1'lIκVOLa̤}6F襫+Mf_8MAP<4>1@ҭNҤwś*ˢr zU44+ +~$Fъ"Q4XNckE7,~9$s5R  PhUɚ"WCWt㵚-P}W(f 0Os\O\Ech]Y Wtq{^R4\BvĬjAlkmMC%)R7s"a%a"T/Kɣf_m;j0bQHC$5Hcem˒GG/w†V-px'ɷ?쿼϶ٛ/#y-GgzU45?&ulV/MW˕uĢdQpiF'תbRU/`׾ܤU{/өO 퉡=1'ОӀ`h]Pt/YDŞ{bOTiOT_ylgKgY%JZ !M^u.hJ E%M2+@q"]Y%;8z5a|FI󋍓r *`Ds  YվH#_Z'-I8RfQH>}ٛSN/c!<13&^~ܝW(k@IUo ē%q D 5 uEBw&)BѸI_Ţ$+UɯhBBc̓G%8@7bt4h%93Ӝ:V6j#%ƛ\UeniKLMxpq^Qd֞O=+lC0Iźf2c b *k2"Qɣv%_m Q79rAjfRa-^xV5$=346J2SDBL=ׁr@db/6(n1ŘmEC.PQlfތJ,_:A9΃ӹlQ ;)u>lF(X IƇ,ɦ(DL5CxUiv@ н&F4 M餙_|S[xȘRAd(}w9^wYz"C:-/~nv~7gF?؛7?>'hEhd]yu]xѾ'I-}(jr|U5ʱly^Z_RiߙCTZ 5LjP>tҮ1%0BA12Ùh+޹Y=^ b9jJwI\kq\Kzwqk5iw:tkǃ|)lW]i2E^Tҩox/$V4=+\@`-x; .L..Z8buA?"+/ xdb+Wx*Tj]*k255>҄Z4=GHQ1Zvλ`I*O&e@٢L3lB4RÿJvcHvW?wgR=H(]N'*8z،`wL?&j_o;ndhNHc<܅J1 %%6*!}R-V誹ŧXFD Ԫ %K ҠĈ9Mc>@mgw){Hz!M1I./ {m0șB[7UJ+L/4omJE7"àYyZ--}jV&W̢L$)A^] @(%]Ɋ(t]+'{d{ɑWQ|@r-wXd dvr9ٝ!زGgvvqWvwK2Ò-MXUFr0Sj{d=701kJ\u5,^,&eql}w]GG kgӴ٠aeWlYYlr5~]r*XͲ._\..*|=Yf\̎ʼE k!RFH%!K$L~|zr~mI*UdXɧ΍ά^u2rvG+ ZKYVᨆCv9nrdIgJ_TƼ\t GjIfOfԹ1tTbEȁn4.L Zo/_t>wݝ3@_oqgl񹕁oP7kM//U'p>/걓|Z3|E\ªlab\7i}S6xVfޭzmG; i$wɎR~dS}>O4zEN!JonY:u1*T LL\wӫww&C.D% m%+xVfSRȫ mĖXnVqS0UL obA&.`8,7-"5~Ų \l{/LU0mbsV[e$LY4u=Jszvƫ˧,d︟bDh%3ٿ8^ģ鋳Ya>~ fx/7Û6RRVKᷥRVW$r|^g/GK|O˭-Q= ۳v/O>[pb-<:KLJalb# D|eoS6"8^Q̹ E֍`CfE=O}k-(QFJX/{,P22DcEdD  yg*JLjV,R Aת&mSzOR SNj[*1Giifq?_Gwe2 Y$˱DIx+Q8'$NDE2U*h[ JQ[Ȑ1$ݪ#gL!,_ilI]+ڈ1SqYkR)0RI\Q[a9XۂUE&l>{LdzuV`n꼲2g hB [H'eR+J[bJT `mI!S) RB |TKcC:1!zJƩDL؎Tj t[ d9 y>gM*o"YǪ$"& FBp83v951g(:[}p(P>DM&U9vgcer)~0CZbf'Qw" hQtHĦf Jx-Mi .::`x3* U@ҚX6 y6CM6(v)(Xy(8m b j)1 ySn:JY%`ڡ;}6.L2^(M;` JqHY)"ÖHIixLՑJQLY֘S9K6" sElonЌm<BA šY՞EoV5 ̳h0-|LMjxr:hn͚;KJvauSๅfS#Bl5{]m`V1鰑2*c_fj:dCX{b chL/'ը7àMALH( 6%ڰYKXB-eŷz *0fKZp\/u^!l/ #Rdsx=n,/aT<^`gD*"d,"{19M Jlq8ZD \˨LFv&!n*w z)@3} sd,X[-*xJ1&@"OMYBJRǒDV*\4ZkMа%3DV]Z)mBA25n쟚<&Ow&Ow&{Mf X5EwyXuJDBC6B&̨9IMvܘ"b9 BN22.^K|Z͍krr{_zwdJJT&4m,B$"r{9 [l]ȧO؟֯*cv_ZcשyT *XrfGg^ϰj9fy@ޞoH,||±Xr9$<c@1q{*R!+!z2[ݐpiKqg3^f0G_kQ]̘BzYRpGU 0qz`w孛_ߺ^=\M}X:2u-v:c"9k𨬐@b{>l{C%UΒR++ĖL+CX|0D6DT.#zrBqݭ/צS~w\N^|;}:.7:L֦Chua羗%TǷlwe:_&]=ܹt|a==LLMӾH$~_J6ӊO 
ff?(;0o˝`i7uAyxlƃ^ߜ[p7 t>w? Ɂ1 m O`~:] ~nÄ)o~u؎]Ntn1khfB n;r7ݸy?r a *pb GcϾzak>]yKz?y`nP% 璉%vgӛ^tD4E^ئaqPAo/tv׃7 `KhEo0&c'po.ON>"5J&}UDn¯Y+f0gycu3Rr}֋SIIh4[ƕ\O}9g2$[3ln9RJ(:X bGs'+W׍®([(%߽<-Nk33fbs?Ne]f9 Ü>bwT|Kj' le><;ͭ [M}@} dhUn@<[x=\0ÙL óEܫm ~Xe7Y7~t?4MI&u@Gt <|c5q 'n#F z9`:BYb=FED"2W;.K#YUjO4ȷJ *̄Uk~74o_DHT H݇$)Hz׮Gf<<ƓQ^)R=F)\qtR*NJܓ$O,ظ 9LaCq; Jj<%㋗#ěo&cg r㷣.Vi}מ*VEP1[r7Q~<9j/+^P,G9į8Ra-Fy\\jL1&HTpV$'(\F&Rd4.1mWܛWft<տT8ᴛ꓄q*YJ)l0T뵱jIoUZ!B _!3*NٸcͿY@AݑeJg҆:I߸gZsA_Ϡ?p/z>ega4g7*i`m..vnqܹ(kh@!˷gS VDU1qrꦡfI2($EsKiȑ' jlТ$m;+"@aOy꧄ /d?.vےiF&w~\bC p!I~F?fuٻ学w39>Bzo^rHq8zNNL*q8Xc"He(HE'nZw s٤aV8{H2yrDr,F8 1/mv}>Q瓱k!)oyJ(@ b*G(j+-Hj _&'׸5|@nyX:-M4,g#*ǑCSxDh$ @&OŤKRvzhy6[/![w#|Y1'")sNJ'?r(0z D5/C^/oP~&Z׿CA-QM_OlG][xB̆N A ͚U40 UH=CĠsA4KXe5uoBU D UHrPPew'T; U}8`^3V+Z% 8CJd <մ&Gs)|<3,ٻ2oΒ -{NV|~ҧbZ&e4F+4XNa c4Q#O^9BJz>J(5cZj82=<_q~>3cq0? S;2V(9V|% m1ȧGk9N 3t 4:xHE|4No2 O;p N)"! 3O[1!oRTnB*J3T '3BzX0!)*jdb)='u:bOE17ff?]_DngqII'FT>|5-qono0:(zI b9RL d]d7][e"6iٚpl{4Scۗ8!3Qt h8.{ LPφ;o[E][cnӉRx79<ŒHvXৌst@:Ť^i TNp!gLy`=,(C-!R#4o7=odG!g=NDT3cŁT](kV#xs0 q|єFcgŊJ0:e94H#UxQD΅a:0pM+O\)&b4|o,9v@@E(< Rp7X &sj0 d$Ċ/!se.3Gy~XFMEݻUHM{xJKkI$O|'>gd<* Pz x Yg#Lu XraksixfyݫN5m.Sia6! _Eu0"(΍:]p̃!8A lV{ ХR92"ɭ Ta| qjf`T"ƩM:#iF(x>RY_#8q(Y05HoN~ A` cQMUuL0HH!z~-tCINE`{8 sT6Mk0!q pKQRۙ62fJӹGtZe_=Gs^8dyeThhL! 6r#E:j3 \s=2\ f/^6\CB/588G4cä1a! 0{xG5zZ+@sDf=kNk0O^a/Zacʂ+hSsED,*X,C2AhƃЈa $D4g)ostB-gIWYNnjȋ\D,I}cz9_!6e}}vyYyvs[IFDR̺eРdUqh؜(xR$os>ft$v_;Ί;EdpB@{LXx T܂5p"1$ "~DdRzm/=/ׄx_9 SyǶR3^j JaKŧU뀲,=}HuXBDi4r:lX2mm7 Q /.IћKt?Kԣ7QyA%i \ E %zKYJZ(ӡC9Cj{GT)8JU =#q hDݧc[9pW `wˑoV"xgד% s5|NzX-S~a"@hGEQzU%&#$}AHD謳|tR:xi\c@\bi #꿒1tW=~߾聆g1)}vgHdi 1B@5Ĺp.FL uoTN0ŘbTa:Hp@t )2o"Uٔer`T1'U&G(BIq#Oېr~tNkQ*T V{LSA1=(I"*OF((ȁt!ۋ~OMerg"p*#g!-!ۓ#,8!bs&8 &E[B++J*Fob<0S\PM,s$*)DzTDZqִIJxÅ8zu6WXˎ(GZCa5U5d4g&hJl}4\(wFq&?>¼Sr&wV5R 7vU %#s1JVyUtQR.X[* UZSw2{pg[O|^rq1~uXǧF˫K$X~B]))eJrOGlg?+S}a=V&o/^b?C+@6PY (%^dNF!9h .HS' )b ['psT"# (?ՍnfQ7^EXtܟBrUSQTªȦztyx砨n&+f;l91+o p7?z{%L򂃽d{(ZYJC<ᄘlP{W睺 œ^nt]Ƈe5DL(P],k$=_q< qk5*&|R~i4p@Vdb^Ȓ^s+"usP :[Z7=PH !3 ; 0D/ \@,xKh<`2R6X@(T\2'qsr By54Qr CϦ3SB Cz7I&h8;`S\1DBeUPZr?QxQ^yoMoj:#mEXKsӄ.-ofxDO˟+oO\_WѪ>m 7/HXf笿:)C9HXN+gyc9_i7'R"K֜o50SEE,cҐO]Il8!?E)Y|)r"YOϳIA ^Dkl ca>v,JdHe SUiUqHaK4fqy=W_C9Oܼ%V dKDTR"!J{PaQ1e C,;#)΄ӆGӌ@&bO,> ,^WO+)[.j"H2~rG G65qQS:z,'[hK^zgן_=^v|W_z'b UsT ŀĠ= }s!1ԩ"VVihkcLapd"-8\p$-<.|^ pojY:v-u/7w^@<5Epap ۷i=-uA]v+'.nbn=\F2,}swwWcRl*|#Lz.<67Ց"g9l9ho{MNq0zv+T:(:%=-4ٛAs+:@Bw4lFw)y7* ? k? Ǜ}hvqh,g.gᴸ+xk'듖mUSkZ:i\UsXrV_-`Ato/ ]C`xf?_0K`3gaXz^iƖ kjm!~ͳ)^YtwNNgsQjثaa,NyozФޮFsu Ò,R 6RQ a`iA60 #4wwg I NqKozpBP>hRk&+AfVD6>"d(FAUΐ"o(iA.!FH]ony{mn]ƸT!F(XSK n\2$A)A:x8KPi"FFT!_K3J^6\?MĈ\q!.YF`|Xze񛬡ӞI}T1 IT<uw (xG1hE+T2R.$¯CD"}t*PM8Np spL~燣s('ن]-ApwO9OW4m#fb(mIڸp#*dd.F8޼c< Y !p^#e CbR1 4̛@Q{G'D$mF)D^YJwOCoK&sGzM,G^"qΐ@0&^.kxǤQDOH(*cث+ 3ϬS yb1Ff1 "`HpLD9r)@>Vxo=p)"@#nD%D0{1#2 $FoE@p rsA aHxbpct1L R2D`H8ôkiL( Ƽ.Uʁz V9L `!濍?+'D,Crn#ՠBDj6K(aՑ;M]Few^ Q_Zb+ás,oYy>kIYKs"evaQWŽKRrKIL۔oӲ3o-9u >լI~c9MxPF+}`>'1 PuGt(K]o}KG>uk)'`>%Ch+*{8V\P#"}j@!ϊQa{fACEY:}46ɽ.El4<{5tGFx-S~-zXӧ_JI4CTr-h}I+]uZl6ewk}zb),~끓09g}.Q gQQ¹ȥMBJ("X4.a+1 vR*3]Ӽͣӧo^ԠpMq;ӓkqx3hQ(Β JV b&99ehr8kT44i{\ 5{ul+)8g)":mN鍆Xs{+{YtWWݛׯ}@S-EB r!P00M<%+\4H F%ümb4lxɓǁd R@HQi2Fe XYc/%ֹ 2iIrIIʀ4wįQ'iI9,(i"$‰D QW 1 wWo/چ_s'. ~`%`h9H$bRFAb Ri=K``7prI͛RC΅??\*jF`d58)y O{~@&ޱr1&Lb2v!&?<>HWrs1SNExV F뷟yYFDfYgSf~gZ.*V]MR83I S`}'s8k pmNk i-s>|㱮N\XH` ̽"@) ~F;B5.嵶,8QV#_ܞ>#cn4/?=bWųgE% %"m(])C! 
Vq"g8yotFASC bu]Vx8wwL-z1ӭ N#,L[D)Dzap:&Q[1fMԲ>Tv -MGrUyͅ8˽>Js>2t89 H`4ABDdG +Y^ SnхXzc|i%l6ZNGp~`*t&yIGsIou0:@2%7IDai;SBn!MC -xWͫ1?^1lݶHH8EN~ӷśN~Ӎ~wWŻ|{Pk3-MDG^GM*dE EI_\Vuz ǯnНuܰhg,ŝ$=&"Ǵ|)˱+Hg}t:5Jvs!'C&dzh-&^{`[>9${BP0%gMFB 8ncGd㤋NWDe $K; c*htwlΕߔ\ /v\Zobe81 h\Qx v6@ !x2fsϭ !q$ZsNP!9F?qW5d?Jp2B$ ,5XD+-AH!ⲄB@ӢB߄ Ǝ*W mNniWKwpb> X4 <[ɉN!nCҨ,#`( \{TȜJM/ PƢP1q!pNP߿,™Al1gcݻ^ݯ(?TR ؿ͍Ehc]@{{q>*X (]ƢcB/8(ἐCDBUf@ Fz\Wl6_!ÀRdu5hC SA<5@ۜzV]&Jp M;aLR5>OS@TF I'Bd0u4u+ᣇ 9N@[љ}z)u/JkMܴ*D&,fO"u)g(+c&n޳qB/%2Q`N*$%M:S(bnTNd=t͏;www֨C8" &Űisw ,8aXNAb[f^m$zq寴0NEtw9CһWۦI%P!Ssl'PVD%:E%eZ&Լ,)iMu| (Jٝp:wf,>nCV~ .`@O}CP Jj Z 65jMYP걄Hl[N Uis)XUK3Ń KN˩ٍ]m'̬Y}R@i73'DCI V_$3$ai(ITÕpl.!c4 段μz]Xf:5˾nb]m] N)gzzlvp%-MttWs)Sj-G֏r0DR6W'{ O[Ó t:Ey|1> !^+!kCy2v2|RTK:K?ogv*M'Uӯ 'UsJ_Z#/<{W;0\[]S^ZRbOѪ9+K=ZUTsfQ qM(E$n+b3dŨ`~+!f.K}*FizCH?d?\w<%aW>{5t\Œ 5t zi H+У!}Z"x,P fy㋰}c6T-c|!v^컛T2#X03/K,].лBz1_*.3Kor EnÍ}CSb2A,Dp?"_s_ 3 ℜPq˿ޗ"ƯWϸ^_>>=/+eU'4$tHc]yNE(M< *)ĤRwm=F 3Co%,X$1͆ᾥ/^'%Z-Q31 [}Cź~#_hgp8er>vj zu𹮃:\k[E7K4ljW7twf|lVD-zi;5 OP_=ƾ#ʰGZHTjxWt(NJzRT~'3Sf$[ F).BLóeU&׌Sĵ- )o39sPȃR3dim[;\k!7C;.'L;P:S&1h ڇ \Ilh`K%s4>m#$ti/mn񶏀Sy:k) Qg80[|u!-52_$9H(P@J|Ɩf \XOz͸Kr=;˘oI+sN$$pA@Ǭ%sޑ2s $,)Nlo {/z mTdLMܻ_>`P' .sE6΍̆e݅HIyk J,JTBǽqieaȽA Yx1k{jW[)X}(YUXsz77vgeR"n9ߟ]sG-׷"2Hל-Ι&}GɩZsC{߰dC숆qSGLNLI;kb"ijecdi?!#++G%$)Q zA3\goNK/cGQ2>Ԓ*JK-%\8 I4 /QX/,-WXv*Zx_"FЕ_I/՗?%qf͐)1\*ܙ4C޴̎w-;ۇ_%P[ΙmAt*ݧ̑a&cUt~={lTBN#r;fMr&#孇$ԯcR]J44+3!1.^d?ΑQki#q,03R+,<\b\{ Ⱥ; /⻓ŏ/<__2ۥ6GL=&؟&'8U`Ѽ9XIq_?Ů z4_[>-PtG _-7߫-]nؠb5^֡7<&#ܕ")K tàiAY!Y|7Ҫ;.>VDO ]oVx$-tc-VD?䕚#pS&Qoh鲈Ɓv>jmHv(i(#vR3əHYA&9+p!:JjQҀӁʶcΈX7ݽյ7g}ZpmjCdj‚띫LY\ïӫ*~=jIMR^7ZX!T'@U?*z筽UyS!pTFقi%Avzdv`I~]vc ^$PYJ]./+]Hl\vdlz09FC \䙞&/Ź~ےfwYPB.LT JMTK!shd6:H.MǽϬ' u,J6{j!Ч.B2D&LZq9pQKN]:!4/^D<R Mn6̮FҠ $"2_3T7&F3@rtzT1iN,w5rXktUI8+H?5# qo,cDvhx0,7r1:5# xpho5Jtm#$o4ԁ_|,^C/މhȂlD$m4(7ґTL7jkF`! }eQq#Kod* M,HE:A85# x`dlq:e-0n}6gQ$>wNK)@ =jWCdhUrYhS2ȔH2!|VF †\en5CAZc@MQk>H/XpA y#%6/-? *F@!0^|! %-"T_37lwLLGpXdD2VGkK%kRyu*2d`9}aULF?b.<NKu0j}3rNp}pCyz}P0ii1$lau[m@nz,hPp9X<[ $Yk(̝.I1a'QYeއ>jBitMѽ5P_L&Pz%wϑd]+n~GdWw̘7J)EVXG NڣҨ4ѥèT z~#ӡu(G z]'^p-jӣ ʶ jF1rىs7YY:mۇ۷$|M7g_5oL@FHPeيn݊l|يni4,# @7fFL {]!v'lNv!2tAC5@5mp!؋dSB(1ě#9'EjDD$)u@$Jypmp}Dq[v{oh~`FS[_iLG-|N{c7\K9#+wG>dm5Cz~{mW.J3r8cu4=II=ǜ"ooT `MR^Pʶ.菷<?π^lO3<~|S"hZewqt^-iIQP'ourPYeSħBSD Ս j176GW֒Ȍ9Aræ ǂ"BG<Vb;[֫e^64idsBcH$EV|aoƊ/]wRƶ}?$د:_Z2X˯Tnbǚ꒣]`ZD2̐UN:s%f Y q&LμA㬳&EMQ9$w$2Fee .l-.`E>k,袶۬'P15p'b^K], 1%U(nx-ak'`G.m9l%2@J0Х,xh 'UY+]1X@"7) A0IБYQhCJC zXrx.2kآE -LƇn۫'2ҠF}1Z ;Jdv2-Bk!83g^屎ޕq# xd{X`SPrw[ށuuVUf3ydJ[$_`0bxB){wJ.ʔ1IssQDs =X*fmQsۄd]Ί*9gŶu{FQ*g,;oaS|I=>Pp5>lk'\ӿ<|m~rШ#N0;Nb ʨ}[m1k>QA4 "UBg %ZnduJL`!ftƇrtg@#tMB8p¡prH!˝,b{?gwԊۯg&ndo7+NA` Ir38#V iAs9PZ!Zzf(Q'ZG2ZCBA͕U̥@1  1Q*QWUz crӷ07fsǁcf8{ }2i`)6T A9J,`AOSZ 0d+ B .~,LB<X;g)-1jmv;@A eH1ɹA r+z}D6doʡb)K)Kk) _ y<=5wx fw.4LCv9tz&鏂kNOxdUC^r{DW^#Ѫx矪 jY=cr:$ϲ@VzFfpyaBPy!U@,QJ`PpR8v2AB UXI Ïd`58΋27qT^N̽ɡfP21t88-q}qWIY)v ǥ#hL4beXH\X9PTS5$P3x\"  $",Q7<5Gp&1 @YC`3NphG*B lI{0pε ;*%m}:֝%Ati^:dA{zXT *{)f@_Qhu(fT !c8i,D;-aZ&c#ذ()bMo+$(X@$ٰeɿW 1% 0Hbi-iA< ʠAb!  
`TQAn31\l7yP$H88}`ןA'P="3*Zİ`jI{ĥBf]!F\x~tFTD0VAB)ՂPkN5TV"2;+b  GT@rsy%\C&*) )[05#o[^y;ܾ xls1H;ƔDEDΑϴKjL D'sIm1B)9IkgGڭEtLIi}/Hf_S)Risr,Ͳ_?{LB{dCNrz>`q%w9/R>`.u@jۺ93h@*p6*yuM[UDH]Way:S K")_uZ}%mOIڀQW W;b*{sN_1&a|f8|U\|v`mr06kY_Zo!$uFN|V:I$< ohh;z^5kNNY5R6skSaE=2ƚ!X[kSub9Փ2~G9g wQc3 ڧu,DOg]gHc&$v>~ԨR?}众yeȎ/h%@WBS/E܋=}CYfݵd-"۰lhgwpƩ#e˺EO͹9LKYA{s?kϸ,Glmg=|6yA1_zw/{m?<VSs̼lln-gnn_`Wk #c"@V1,U2@IMHAu }Ǎ‘7tT({IDU(P h TGr&H&V &~"2-%/d|M|bk /S^I`q*eq'HȚ*_H)g8RIYwN}zxtϫW/_1*"xE> j^QÏL跻&Szx;g Xx@kuP)K\g3fAO>g0Qxie^|RI%'Og^BZ1 B\Zka`اPb"l'R c}G(ujbJ/p>AߗJZ‚V20tJX-QH >rt%wR8/P2ơҚOq )i4JZjFQH" O1/?y"u obŦaZHjpa =̘ǓGT^~.&y̝ͯ-HgFfc8c7·|EnOՃ†ph q5;4yakHߑOi4'No"~ߞ[}s0>J'ٺo>=}t}!~ {%խ_{1GeoM{v $-׾F>nw{<^no7GA^ND-0Ү#CR`D< &zakEG/s1%#o?އ<]GiY]AtVbPCe-1f( cd+D0 a`pwAރ,zVeY;T}3}vi$8>! 711:]>B/5WYgʔdkeD_~yyǬ%_ab *$/rBb3LN0i;7cv (Va"f/qsP@^fAn)g`)-xD= 2'$ z7zrPvlav} G BENM]QUQ&N{ 6pN(;Eap(P'>|ӳ>[!bz SK?J5C)ļwIRzڙBϷOq)U*yLW 0 x~ݭuZ";h2ZName۱ZI,$ϔ$9D"䜴"2͗֋R%D&pa&dmwA2Jpr0բ0Sj{ QY"Q hTZdytyFC"X(hM,<=2asQИy"XϺ)efCBz.z˪]Fpp~~x|<zBQ4m/W# } ZEQYi~aY$ j&YʦYeY{󍬬b^`@#)$а&TwspxE0Dz jb$Mޡ|ֻCKk~v6 [q)2Y=\^%w Q& B7+F>y4uA")h '>tC:yzr}wiaqUCKws+$ 6z1POQg> ||aF~0t`kFRQ_u,6MN[*ᛧq޸09^#n>T'zY)`Z<(k [1WJNS f독cՐzfNx x~Xxin؝~;Ibw^.:[ٗx<}v6 t;|w}e&<2ƪ!u>tfI7_FQOG:-y鈎?#ile4N40 E䂤" Q2#]/-i/C/a1Fd;neK0Uԝ:}`bUm}v~׸՟qaAقȫz7M93Dh|Әnlao'%)R{Mƽ 8OOmFmYqB'sGy7ظ9h#SFeKN0mݜd,4 c$l9: m[09E@ʊ9ZgZlHhc?V]3צ>]Yo9+=eOmOcNiZ*=z$9U̬]@-X<#8eag(MOS!zJ>!i5㾸zϟ1׻b sa %Ğ!PABgLD4*" 'B2*@V }8bNϖiXY.--Pk؉qO%JP1ҕq5pLA-2DTAu?__/x~}F~o)]5.s,7^TO2YH$sBTPdΩyP |TVbЯ5FDTuO-J$ƘSIqFWEf<*v4yaC[Q k-}&k<_>ƍG> 1rblR`v0Zz s[Nzo-OY\rO~( ^"P ERjHW/3psm9[srr(D%x yMùoƟW 'M_&+ltWAxOA/{C<+ 4ڂ 'YAKmf3N[;7zDmj~J :+eɓ@XL8.]^%AZm3*ȩӖ䃩 ZO U;͕13ܳ4i_G#>aV@J9a *#R`5pRPֺw<h2!{lLSnHbۦP;XTZZl!I gm Q`Zl\ QP?a8FNa8ci. juJz֬EuØeI2:GP*RF#ɗ@(9 HF{f*dD:Lr:V2y2E8=2UlITL L)"AXLG]d%I;5JoMT/K+#^W( אD9 qAE %=6ԙ 3YEXŬA1)O7&he9!bJXpLxF1%`EŬQp_K"c`sՖ`o;Y.n||se٭ϓszpn=?ZǨ(P9' XL4jJXI'p4)(+ѠQ H3#SW}-_߮1s+kRI%\ʚD3r#+7YbdnO=cr!OmA,i}Vv+xVԸ4WfӿI4̀*U=?ش=ռgBԼq}c0]O۞t>r3/j 4%ɤp'&|phԉh<_c&-d)& Q rH#b\V$\/[Y]|X_3d^߬ 72|h0dQe 'xu.u~7]!QA HSP3(?9b˨ ۴op)9Kv1IG戭63+ /"̓/gCf"Qg9 e 5bVWEXfn5jѭB Tx̗7lݾ猲d2_~5lK3Ɯ!7D. BhQHW~Ti'jITu{{%m$ yI4%|jH0aW8`zr">>2E5X7{:n6RM$\6o XEKfsÜ v Lh1IH>IB9 *2G9hgS^T-*q S R'jVt@Du}oB)ᨥy;bu{k wţV;lj-a-%>\ctq3H䕄hz$K^8uOϟѼod3Wru0qL95KS~XߖכnC;.zhp9**`bHtq$ChS.̗կk1 G֮srtڦceZ]\ERH?7T^nyf-;W_γ1s}uqrBs>-8j`"3PL6 sEDd-/8@m GJ*QhMhjő* md!?j-L!-3;js"׈T#yO^49%/UG`f6@1h%4=j#j]XG *Gl0ˌbsJ׫ulү7ýW|7SyOc<}t` x n- ǂ jY &+' D-L>QLoH  zo)H'JMVg[^dEp=EV gu&xC9LW<*v]%K8@ɷ[ʚ1'd~5a`~&,fzSXgYC.yT5Y"mCe]TV戲l亊J-Jщ;u{3;=htթnTx_Vѩf֓Si T1 E f|vp`H5vwӑlw]'S?9G?مJ(Ps"^mxrZhju P{ao Hgǰꖗ%;\ȊQ1QL{?*-߄ Nz/<*8._ R qqq8X6޷!LЧ@U^&pRk KGL4Q+HłKjI0O_ ũgg (M#e^H)%&TEv.(oyEŬfpd<,S,Ly'58)B5TR$EQO<'EЄKrD^16%$w7LvRcBoc8€#UF+ЛRECo )zf",ݮ y 1tu,y%+h2B?J1)$fv#UfB2;V2̠#W#wn,.*"-Cf~-}:O:%|5):hxrU"yEJ{s-(̕eBsvF) e/<Μn1E5sj/VMi ^MK 4-kh Vʐ=RYm=50V嵹uu^zar+__H_1f WwF\ǷWEE\ TZF'r\cP;rsfђ8`o/}>9 |oGLʟDLB'3d>N'8!$VB:0i '0a=*P# WV}>)Y}ae`t)۹)6'(Wrs\h4уP"ό0heoGIzQ/kXZs9^(!RX48\2@.|ŋ F[_֛Y«_aZi#  ΘDAQ!Z$bۭS"S.ld  rEҠĹl#IQp" ~ anW"Yb?Ćl"ۥߥ%z2zoYs?-ﰍοp`$VJ-r ;Gz) Qܮ q wcs"@-rnoۑ)%2cJ[2!Le/ K:PiLQbLv(iL4h$VZ ..CZK;Pnڨ0UTiw! 
X#-TSs/TQ5)TQ@YdžWaz5e5eP @-I\rLr!<BjktNhj@zyܸ֝8Ϫ/SE%sE5~'Cc w+2&Sp[&0fU3R!՛S1ּu;j 6*4kjm5YFuY3<2Q'{\< 5QPhD/{)(<šbCL|wU%p#;PK:n A36;"Mt5Xh=UBn>@$ʟWA={"}?1xtToΓ2>~5㖽i=٪[.~> ϵ_y>O^Gό߾{lux>{uΧ?!>OݙL?>?=zꢓ_^_|r^~ᅣ|ǻ7>~5Wvtx>3~ݤ7R.mh>Jnn-b|8b@Q9'_Ky=XcH`4sW]}ƥwrMweq7.x4\Æ[mzcIn% +ws\$$zjrWO" sꡁ\&pDwllGd%ǜG+1,:cF꒍'Fw,RV1#&߻5l5 *yF;P+ltχtd(\4O)6w{FX(;5H><8^̻6kҜ[޳]49KSMiJ @C%тF%wfwt7jէ &uC\K_9j]6F毺wMD P$'C(>"̢UF:ݢNvR8Z}?T,Y 'B㜺,ni:{gwفZjH >̬Aj6R'!q%nS<$弁:D5%eGO݆&j;L>SOVdO59 {Z yY*aK%LUɢP_@ͨrf):Vt>7Yvԗ7Pғf:oz>QKuGV8DgB^͘S*p560D!uRNwlf :38#]Zi JDf &݁ڂhz$ч;Upcb$yǔ5Ǥ=``1&I<:$cP3ɀ8O> 7hIpjq'[F9"D1â=8ѐǞ5SX6  os-t};jl[ؽ+ZI\qh}^R0rDg=#hC>cM&&j[*jP9 K-MΖ-:L A(Yz,]u8oyn86ԗ7Ps`3KOֱTd)moC}fix RXz;˛dYz,]瞤yKH͍j<8E(Rcm;j;bd=i׆&ފmX.j-2{V`jJl\`6Ӱ%L])y(`neGV_;PGI)F#Nt^fx'"KV:'ƀtD-OBks4y1er:aHhztmu!f1JayV˛+dFP'VKy"L8sl ν%Ր)(HQN(2kRȍL(}RVtS/DD ;P5#h 1,Fګfh[@}T)9;X,Yf@=V7fr,$pU3IVFW77c$--XuH%aٲQ ؚ)!ZyD#rm!%'@ 6:kq^ߊUl`>rMe"#GqqKT ` ˽ mN)J*Z a^_Qr}Q[NZ]QVFG xNX*qX:Q3 ڙ'Rz ܲ{e/of <[f)d{>i몽6;>H('8q&cF,gN AW[-}gOFb́Yh6/$n,_Y/>Ro 2KBpϔ̰,cZLTg6\c9&7|l4KMfqpȡyhE^HC˾O'aQwhbbE…9 #2ɜ*]ذ,4@RAN- k&c#0$F BtFRJ~٥/cCHTE0"@) B,381) V Oq +)G պ&"/\vdyyldlV.K7!6$a=-r*J3 -+ 8*Lr8 PФx T  jkd/֏'3[\15`r IscEj}o](RkQf+A'^"JtNBK@kb;ьf䚢$Wp7sqNHEّENFs v-*JJ0jқ8-MC1tvI/&4I=1cфZ2M̊f꿪tGd 7r ]‘~˨gSQy-AZKꔢWػ+>Sz&(y0y>uЧ[Jh{cn)f=D}RBhXMC$H/.4OZv)wy:C~)wߺd^̂[#*Ǔnl4S>_+9f:י-rO~}ju ȶFGoڟymW_~zv[e/!4_|ts'P~'Xóޮb\\5 weY&KH}ZaLf `S? {?k+gы)h<;*sژ‚A!Cr(2N,z7|6˭zmwPMphv''q9İԞyr@2s[+ܣwy(" j!}EiPiQy,!A}"$^Wc7"iA운G|~>X$xqh#I]kIS7 H 8Vњ:uh%h$zc[ {ߖVx:^^Tw\UnMLUBY암kМr;jԪ2wK6J|3sň@ʠwⴎ׵'5npvv÷\ t vx .>_ 0 :J!/ݺD}/N|-ͷaAP YMә/b|vw-#V@ZjH+2;Ep𾢠r{N -,U~ jRd$_:y`&w'I{&7і#/#Wemd(r0br$\9/ +bdȘ6|@6i>  s^*|{V|a/¯TBy9!##a|g,/%G&v_:@dH"Y=;}3[X8ޮ&, LXs o.O2 k2L̲>}ytD2zp%(\*>6e5³cSThH8*4CPҙP KHT9˗3rg;7A' i dFM܁j{5fR^\3@[ZlV~C~6yPrO ]޴L\\]/! j^y*? V> SoAxǰ%Ġ5u-ijobmSNj$AɗiּYU !dzc˚VabiP7)PrP:( fgv߉b͎C5o+&O=1$[:օJa*GZℏn!TC]ʎ.!T®$ՙ3r6.q@^#u|c µ-k c90 MZ]dP7FxjeL6+|Nb̸^ A#E%hn'X =3 ғ2ϼB &ӆ'?⍟,vpK4p& ֿkp2I7`VvZB u[iHp4rCᠹ`Ta FBksoTvᨓVPI$Cی|l6dw*zCyXFGw@h,`]`.O/{ynKs̡Ԣ6s/E䓉YhP>#ѯ,NgN['+|7v ϭ1Hek/3ڽw_=p8d| $E49X۪Akߙ‹N}+5}~Wѧ戲Y{~{_ 8͖\ڑ}ˎ ݲoBUD|։D iu76| v|LqUfi*3Dai-TߵrBܱUU+V), 攰GRh%ǖB|8JJnN0g@+fH+fFz !L:+TjmpP9;ٗ`bu;0XrϟX`OS4z˺Q\١uAՉ}.֭+`Z⩭[ѕ-|*SDMFuhbeFub߱u2ٳOmBn)4+W(Sf*wN=!`Z1QalAΙT܄rY.[~CFGkN~\-+EXo&_9l8[;6dx,&hPiwy=l<#w5ߝR~zW/{vXY{v:?Ce)BDצw؂"zus@(%( WCrT$wIJhʫ%i.8a7 #ДةPIݩg0iI,Wg,\[K28E9)zwU?(Z+u7<:&ITJbMz/.=6D n3,@ƈ6j~5a;cQ3J3SL}@'@֪%&jAzAHB(:Io9!զtQ{(QaL\LP% y^h )ˋkr^`.=#\azn"F> nV{;ZȰ֞ :]F(K vkE6D\r We-"(U=׫ #tH%<؛1 d9rż#LB@Xe̘"+1Zx- 66{7CbE4mxD^\\>x+N7D^T[Ӕ' }ZM !.=^_)Wϲ36mEv{g5կNIQ_j_PD>'vc.]$^FqIM\.AE.KMQPAwx9 G a½$T!049[\I Nk;.4efakmI/ws;Q2sw؝K0>Y$yf3߯mJ(GX`6qduկ(M<{8| G:$Ǒݚ3$Ψ00ڲHЁR "Hd ΕL |g6'PyO+^>,B 'U޼]|݉RKhQ ڗſ^# NKY/'~}3»no@K8vb%n.Ѽr-!K5!y>ɵԼLf("l,4脪,I2?q4&1K-#-5. 
Gm YFx/5/P3~<܈R&*KqB,(X]kdR) 2M-Q@"$ITtFILIŷ(6n4JJ8}yȥ82c(\ ^jYQg`w>yru/},b*9/Q7lhjݾ}gn}103䀰A/x&JKgV 2{7a&B8Q4_u=/w; Dxw"#3Sd_c|XsA=m*g~wcy?􉽕Uhzl{i6:.uxODI,V)D.iEE7nb%YK#1P6q6B2"5x-Na 1g*u< O8CEe*InfȉR óZjE6@*~.U Uhe, Qlh$0zB6FFo<# *tײi-xbBުOL%ZuUsRc)%;ō0Bt(z$o)\eqƵSNŸ"Cc2OyW m~_P$D&!\T֜|S0_ * Hb| A{ULS 9m`\Wx 0ކ97xApY[P źОsCT &e]ٕ8b˷݃= ]Mu&52|jW_3G]k7`*b>o=vѢodsa[& -MCnS%"2'Xo' "pӢnx@ j54UT)T֝|?t_McCX \hIku4/u٬[ z+\jTrΊw~ڤϋ2Zr+a˯ޏϷcxEG\ -&ǖVeW.j2vI+JTqtA:vVYk]"55w_[Z :Lƽ΋VqKZ7k}>@pnҊ+^ )!DSќطrXF`Х סE~ kCX+ou.0ҽ WX~,T)4vtJpԖN.\5)mufp[b5M]jHTq c/՞ug.du;šh42m)ͣ2t9@;Ak5|ovW]nS0>StfDJ sRO[3BjWP_tHMWPtqy-j w \az0KT+5:mJ_n~kQĉVu0gPm ~j`cw8]\z8Ն }wܸ`R_WrA..]Aw8eCosِx3eް)M6M&3{-?_ǻ<'QW܅sӒ)٣[.4A; ݆Ӊ?B~̩G]ߕ9?S9ujp[`x9{rIWtEso6ܗ/S@9wvQeQBsaЈAdCoDBywnLe{-釧vk!9DC0%K]8׵ZP:﨣NxA|zj&Ct SmRl|4\|zN:8ȡqڛ̦_?ra}ƯYL7_nWE}||x})Z;dRn7ݾk>Oضyy5t=5;Rwj-z9bf _h4΢QϾBTQ4*;aFA}{{Vݽy]d6w劐_ll?KZ^68A_}8խKkHT)T_`6v{RLUIGO |d@w L[u)=v tBQGִ[djOBsaJ{ }'RCywn'EhZulQ_Bsaxb ̂ =ʂyBU'n놸5Ct a+޾xq FM%#;xѝ]%&t6,=`J8dnw|%˗Q?JM*aq?YIkxLD+%9a??؇bzkiPD!{-lI#)BDKݓI[MOo7DvQJ  . Ml9IiX,9+͊\y*ҬX=UiX,@;# uJBHJU+MD4!)YglJ#")Gڷ(MD1rSҤCI|I ٷT %tvS*S ͩRnN}{4mqR#5z9|cx3!{ORoKN#QTo_P+aIr^{_7vNvdwdbgRfKA(PoڟN4lcZJyLLѴ$S2:De׿>3GE&kCrP}_tJFB%d+5%-b;He4evR=(Uy%N@)i[3O֞%hOr K Wͨ{/91)͎$:+4##MSGʼ$,>NKf/AƩhX<ӈl0!uòXcKn41z8C=7AZVZfTiJŲX[2$"QBcS*]݄HĚi"`2\)b"R\Z9z1t_qn0 ``~(cMqdr¹DGeR45&Zd&jxn{VJӭ`W־gd⶝.Wa 5 ;QZ}oC]]w UMq&KH;XlŶ]b+V6,3քWn]d4˯Dz^pmzW5|gQh:&'wkc9smy?zp.2VVrKf~~?/!Z)xwAv}%R7`@ 7 J9ujpsሣ , @,\ ֑,:7= [JSF=o Lp,!1bFLTe1!Z06hpԴh6:MQ|"Y; בWhn'fy`R/^ݝO]+rOȽM ^Nϲ;k8b.Ъ~Y,WW!kb+OQoL{+o c< ;K} LA8S;)Ô0$<%Wq_hV79iĈ8/FnpB8jJZDDܦ"ΥMT.TG"A36p`QT%3ʲԆa`f} [p? >ܐR~)Q5P1J5>(urrvj d.(=Gz5!JW*2ORK\Pz()a(!#J (RRf.~yTxZ|[|c(Q SQQL>(urvj#J0.(=G2RtaKwS}F5..2u]qAyTxZ"OJ/\"QRFGnCvj ]F)#~(e$=t4ڔQʙJ9+n˪(E@ 陣/˨PХ~Q}XFܞ/_*Y'P*z}xiܛ?E/.A%"ٷ\ %u# 3GqJ%BhM\ . ŷAz?IPkZ+8#0R x'l+ 5ywmmHWziL\}X[33cGw@Ulm˒2 \@nD D"LZchW٥kY,@(%b htlQY,Ӳ:ϧ9}Me_O8\*Bt$% `vUڏ:hI#$)C=)F>dJ~\yw)멝/>3 }5y3gKw8flDɗW*C(OOOg^oMnH}}ʗ ь!"ZK6 ޭ/ aEЁ5sn#X:jyMdϝ.4 -aeb  U|lo\/BSn6|=Ť+/_M$e%o򲰓NXz>VO/+Q-NUbł,@uTuMD[mkLl4TUʝǞ},_lR7w~sa,=_uOvV"R֒&S*95zúhR1Q6bp~TY@ֆ2zsӺ-[*!FupAARٛuKNhА#W5ҁ:3AFYPxuz}eמgε|pCPfً_eb tb1H e&'͇+/gN>~j#}YQ[|52{%/uzj %$H:HaS$7,"ΓvjLC%kh3^OAwC_DF/ſYxq;Y@SE=ghy.*z^TV kYh!r'ǐvt^ jq19hKz Aٙ;|"RH^Nwei㒢Q&+bd'!6'u`UF7c%gCo 3)g|Z,]Ru GD%U˜wUUY Tyudwz;-Ռ e\'g$^ֺ>`9`rJs^seT*-KP#T!"YH85<`Ty9㠌[*%[P[ Hϩ%Z%+2EŔ0f@0QKN5jgr<1929Ł '&vNISty #i:nsx LI2 xMܘHtBǍa<_´=a4 IrpROF]41wKfF`@|-42t5ATt/^#p;zu):۔90#tAmokN` EvCMucmZAAEj1+íƸ2!GI:U}<֕zӺZ}!1"Kl#-o8Ts/CZА#Wtusºb:mXwiN?2֭ 9r]) )̨͖v[^1V%yϭG "AD";]`C,4&WI~P`0Cfh~7GKMzIXuv. ȪH nn9T +KE^ʹ(-ή Ԑ-m(cwcl7Sj}V+%pu"3$lޑXʱ+YvF%hZ'@VbAHDyBtڜsޝݺUŊ2Ҝ 嫵s>TVSU1V4P4,FT1Ο\\j#̙' W͙, ) 0Z)ֆzXK';ǎYC:ndž kO q :'oU% ᶙ?\.jr>M-}ק+{:{Y \6[| fq< PB"yBH(I!l шd+]PP )vC._pz| ^tm,kEB/~ww=/9sʹXVAsU\m<2~v4d@n$D.". $m<,F!qj VxMÓW =c kps$W]8Q2q 7 AHxNIX)(JIQP1R]B"#- ae@"K)sN "-Lz[T+%#nJFVSۗIr) %ƞk7$#ƘkxH21暧*SObU7IW.0v\-toMqXN~u_(p6mt׹4_Ogoy*ݏn& {;߁ecȾڮ^#ǽ!z͇oM8 BbGw ~-h5 >QN%_HÄ4u*὾5Ԫu s7ؒHJǑ@\ 5wӣTWD[ւ7*"e:x=G^ŨL #zjf}Em7] ΀ʟvʟL!s+h^ gJgXM\l\V͜S -LH XhM ~8u\6)ʼŤ"Wn{'0 DH)BPy[ \Z3"sEsvT3 &m-xkLg£AbƳD: mI#9\ S Gv@Ɩy+#}%B8HO5 .UśbyFV/ KB@e]E(-KV P0C,(G!R,EQMV$%ZxFT47%J[)e,G?!/ "~/vuQ+!].:,czdD!G):kZǺmX7b:mX'SYi-6к!GI:= &uK DuRۈnC"$#Yt@ֆ )Qf"q"{Qxuz}eמg|pcGQfً_eb t$bv3 6)g\w%%=y}Kk19p7́ ]g{8"43| 8,޵8rE_̨3#qu{՗$P;I{m(1?RsNoSMD ! 
C4 SNP&[G~#Ļ{1A̻3fzz&!)& x7Iwԁt"0IλEJz!Z'c(H)7Sco>o8= 1@,`Z<&uEj]]ϑjB?Rh;zJ@O"5RrAyL!}΢ԽǥHM^R^;S'J!CR"/(=kʉ\ǤJms.(=k+``(%+ڞ( JrǗx(=&uEj/(=c)j{c@&?@YyT2Y\%7z&u?,scItnlG=<~ZH܄y Vfb,"$\{X?۲u>aWoWd= aILRUjέkL2IC}׆bMV} ܧQ b,&.y5kv6 gϖ3ܚ|03%~M1q F{'M ,1/V(Š4G(%͍(۽NeS3L~.CnS`&~}kKFԇG|,8e%y(S[8՛8AqB۶Y~3_9q2FRɏ}sdeѩ.^ca[|*ECEyy.j%Ky$bDŽl,D*AU2x8Lk܄YrYq) o6,Awf[|6y\/rv#bJt $kSlFlt͙ !g;|0؎YL8,b#$w*15b a\EDN's$Eߐ1 !vsaqAea@c)A`>Q@ PRY Ym u($@C,N*4am>/DUo1傼~(CnvHLA gdSR%R4aʼ_Ps1HJŅZ"Oݢ X:=ڤkηNumo _cT.!oM]ԯh|U͊6 osn=f ^d1џ $~;rAm?չٶ?lMیyUכż|s3 E-!8X=w( 7i)^N@`ObO)mp;ܰqK\2 hV)n9Q[VNעW{p3d6f`|>Ss40A'ͱ689jxI(4#6[BlXBs2')x[= NS(SyA9p?Zya9g2|U7JiJPA2?Z &3G)`˥D ӗb–{(%؇ Q.(=kRҗAzX3Jxs)?q+R .wgRP`yRP]8΃8/=.uEj!5J_AdpSRjK%.=g NE)D\z\|?oBRHR0 H-m\np-m%{YɣU2,:MIY"(5I"REAJ(+`9^mgsl6Nۙ8801FS&Brp@@BYR:d{l|>nwO%/O^Ң`Iٲ~zB-_)?<-@nA8ktH!ݙWq{" XZJ$q4j*KR' <8lqp?=%hk<տ(eX^>.^5 '$_O܏wu2_}?ga0ҘqV4y@ݺeҒXc{G4]W0++3 oOG-#V#YGWS e.nʳ`lrx  @_|!mQ,/X'3 9jc /!p‰d*; _Ig`{VONd]$&̬fT ܳl|?}&:b*G[TQg Oa62A4AnApFm}*/ZLloxb^Z}>ļ1b[Af_!~mo͟_>ݹ~Em'Ją*uwt9ұk{T|Wg۔:dt~hN%uBo7nU02]M*d{ԘI К)W!ca1N KlzxΖ/5۪U J- $l<2CqU1 ӍZAx/lbb_}WQzǹ*[J~C৫iwgbi4/WzY, ֦nTDe=BwMa1i/,]qOJFS7e !C4 S ~nD!xX@'!ޭǸ Yg->ӻ5W(La1s׻ 'Nѻ:n}"yOB^9D+̇&xb6ц;X5`,A>Y$luu㟸ex)bel" 9!]kjtEUvqarh~_q hrOqNL#B.DM3ۤ}{ 2ͱ22.Pӹ>c㓩?YSdM5bv?'r@b¥i9G J4 U2A* ݎCq0MGe{]h}JܔY`n}g<3mOyD&lTR%BfAT2BI)u28S3[s%eC*#XA*P{d&tIXmCS;LXcGSl"[dzQ (E:RyF<dYJ0Γ H@ٺv$I0WA*X*DB~jQ+'s2 @nզr`%iuǑHKŨR*e^#OMG@Hs@1s7.J"hM J%3zGB y؞x aMLgH(R 9+n }fyRluѣ` |9MwHM -Z)tGx=921"Y #@CEƱz~;)+N":hzF B~,$?dEsH,Q_f]WO.{7Jd>Z5fc]9.Zg[?lr:OWĪ|Rۼuf"5{bxgUy2ahvN*9S5=]{?*F%5#uچvUGDZuww>~f|[WwCT-q8&SSߺ=&3"#?p*oʌiMesVhSc):\Z_w|{m%(CU18ФPQ ұ2XNZqIN.7ݦi\[@>g._ 0%CxڢaAJa`WKj̟StnC"e@1#0T\n6?yygl{|p$E g k$@, HF9I"LR&0s&,zo7;~7ޑͷП>e>^16ÕmE UD{+UYm !$0tz؟SƯM\kT( N>(0 FH$11'0Z C!p)1WD%syJ㔊pͣf\d"`lN2gJ#_a]w8~e(Taα#KcNt$=- >K__+ .u;`Ur+>x#be;Qr|v2fSv(j4wcE%Ѣ}Xe0@Žm@ZT'f1sRI-Xe(3\E,f>>*wFStAiܢ@${.Rxj_KrԞo<w-?o cf:2/qĉ>x2 .urbS)`:=cRI:`-}".(S\^*.VLHZmLkueȢ49i)*,}vM! =]`JI=uD[aH$1:CȌqcB9)o%d hAN/ Bm[0B"42<s Ш7H{dpaRKi ?DsV`Dqi4 j?5=9(C 99 ebRq:8<'ŅTg֩hfLQ.iv&d $h&+U*z!L~2Z|b$ RjIk7ƭɥAPbIO0a@ |}j)JxG ﮗ;I&䘸[gek6ge7nrWVv8q**: w n J= ҀJݷV/t} 84V6 &N@p1YG|(aWhD&a{|OoBc>vҦu=!Y8=ieӃsh t́jS:4zK#4ɚn3:hQQFkYQ~8/RQe\5.6 Tc(44!Hi$IX'sםNAo=b(3\#/v}۲Io gslϖ[-Wlr՗@/J8u֪6X[YBukbR7ԭo|> [~X p:0MFUv76W+q*7vf' /{kr*~Ǿ^xz ,zR&8g7.| ?W5o˻Xnow~[z͵y#Z{cJ3?>{E6D8wOSx/oG _"%4d*ߞSjB@ ų+murƎljulJKkӭ^խ}0KIիDMzfiZkCb V#zxf)4rR &CVVkD9Y ,b,y1YzUjdP=ؘYz,MċEI4mkm5>V@Yz,%RRjjLzUyujfT^yi|.[ ի g4KHc1KHcibNi4q|.Tk`)`KKc1'ҴDSY&aLMX~)3s,=mNc)KIիFYz,U .rWrf.NN'5@!W)'?SpAGLr͎ߧ9^^MrОo {Tq"/5(+"zĥ n#ҖG0#dG& Oo1U9lANҒ|r j'Dx'(o%ωN<qm4Yr[1,7@*;I=uG5Piynsϕ!Xgv(^3O Qi&BPTr+aPzُ.@qo;m8 PD.շ?]~9jz)~h39"8xPZM|$|G2rɴ6y$y.uaf Lኜ9`X oC' QTwPɱz"'*CƬ( ¤!hmPDUwQ)weoOw!t]@uY~^kC%,?oz-Hշ߯ߟSi>ĺ_.? 
`bQ]~˛0^^n]Ylן0 >#(KIZK@gWO(u}*`g,X>} H@V&oO``N0_%3Oޝ/5CP[:ҡbIvJ2L7=&0cM4 \=ӱ):i@4Rm41bEe N]⣨8B^2 9"^*՛6+@]C>ұa&J5ԛW=ቍnz@MOzxnç;QLo-m&Qbիa.I6a)rSQ)t1_v?=xmơ n$fhbDmחb݀ mNگ4mxJ']e7]g 4PYX b⎷HySw)LEy^oLh)EȷQoߞe裬58 HeP9ApZqT ՟{{Dž G[ Lc+L 8V9Q-)*mDyTi]B B]0S X%-)DP7D+`ѭ}dbt!/84VtUi[[ RMۈ.ʎ}G[ yq$r,9[֛Cr檰=w_>_y_wj}|&f7FM.@~Hq!(>%2pv.9$%p.9 [.A&KvK5t VF'4*K֢9c֫Fo}09Rz'KIO\Ec_x5 +Vkeف5*% R^Z 4r6˗p$9g7NQ <$=%|pbZq#bX@_P,sjޤ[# X8/k\Ls`= A74-JІDc׹D8i`5p FtYGÊGeUN, gr˥e1:Stsng,˥΀xar^zdAzjjTEMR#S@pmU \,ӄIBV AgdHXP\F}%BzSﺏiSyZCيCZ [0WZ<'.Dt 2Sx99cV˽cX !-9\gk y\6%-)uxY@`',^:a(RX@9k0%Z7|,kcYR%i,t9Wya"6L3l&%Bgab(s BIyE 1,-[n1JR⻥[di<HtKyPe=l ]Vk5V42VU om@nl9ǏwXlfDiVXŞoow'N{WtVΜIrюӅw2]$r9|:3/1%jx@,Bl*2/#9ucD FHOnM*WFZu{MLU roNp͉;+]X\]\dw5U=5Ej|J EYMF !cEU<k Ɣ6Z ?klO .gۺBc}߶8TEZxWR[x6Sl~|dZX?̢ăy%Rek͡rc֓ 3NY Cu6G@g“;SaM9+9B*2RcnL2% `GˤTns [7|olV WnoF%KH/yFCh2!1ʀ YXIYfX;.R(X&%l#0a=/CRs=1S=ŠqG91=Ւi605#8@2oTjGt$,lgؒ3FFvDtwSr%B{X ?OroR(PK_\XlVSuWWaK(S70[|[R}khYI}>~g+2,ē`ꗻ';1ʗ?.֪Hjcqoxۦ9>9o_iBBokBLb̄O~>na $%9 FԢMUs+p K(L8`;7Ń{=4d5~ڻPC,wvSwn⽹i: cWwnC]ϑ3ht-s7Ĉ9bt Z<%er<|IBTeZc%~\9θTVs,ft(4 ?{֍ K/Ҹ6WaLMmfJcX!z%lGx g-X%V>݀\n%zۆi"{t0G1W[Wc;H]&b:wu'3Gیr _ব!ϵq15{j \3?^~q%K&OIȃƿ>:%\A<'Ǿz] ]vNk4{P<9`BD& 8ǻd%2f( t7D= hX8$mEr\)x( Udx"뤇wE֏/~8!Β娷Œq0I 3,z8lc&^/Ho'toWܙS`{uof2w'6]7  ;IJ1[`g ́IlgdOvim*I CUu+ȚXMp uo8N6@(5M bDh`[b*M>dita&ՒGܶw"e٩d|Ss?A[n̲ew:_ˌu&|m.2c>zccfo9&`o4r{no/v1-To6~]Ly{Lף^{nKy-*\ڢ4W&:*zOTh֭i3`[Ӊ֭|*H*X='M16 ߭))3rېQbfݚφOnmh7:u0ɟ|ZfnK;Q\8fr=R;ydFf^|9=y8/^>U<I%Z}vfeQDS!2n*=QS$e-OyCs%!P7l)sKt@"0RP8CfYMs- *`,3j`[!#%hqL e7gsW9,ý'ܒ33SӦk@]dmva~\6c9B4.$x7S?CZl*zlkQ۲$/]h08]=װ?)vU#?EBsY#^ݠN>\ut]% AJ7t zUM`:*^5k#[^3we*2X@}+&kWKe^4-NnlhrLb*Sgc/ $+bpwe6HѬ˂%ʂEYw+|@#Ƈ>XxyT\YTA^aL]xFJ7|g #[5#Z)·P1w4ҿݥ_ZA#IRIUH-ni ;&7oQъ?nVfkFkDfxVe(#[KS+Zi ! ( 2#檕LdycB 5To" g,&*_OF8E)Ff7 LWPLo9@F&Rv:wB#&pX`(!H|JQa 8Q5[m=_BVTc'ؙ K)i,(QIQHNmֱ/}0ccĞmqez ,E½יּd-Kn>v+:q-,tɛ~ ڣoo?_]Xx*RZIkE0&B!QO(3T,Utr}@0ʇX)Y"6voiIޔ%?ɄˤotpPCvvj4}; proUlV lTR2="ڷT\8oVv3&>-|p{[v7Lzg7?y\,Y?/ӫ@=isL5vxZHC(u$Mw|-0Vf|Ny5o!ܟjץ& jECsèQ$lwbԳbrX=Rhd#QhjɃqW2X=wu ʋg!p UP)Qr8Սd"b# I"dF("@He:Y(s; QFbk)cǫF`U܃KT[lծ\(hnL%ǎhCFAbL,uSk!/AI #d\cpPR,7QY4jeѯ&?8?wS'aaSZ닇_뻋k󋛿)}ޯ~A&$ 3@!2I!UB[αK%(@SU(1*(De"@>H1ǒJ f{3@cF2C90#M!$CC6z  g |n#Sfuy.t|ob{sW転LL-/ػe(Q[%.?~Udhz@>( /}3w~:WH.oNO"0EdZ?9 Fe~XI]y dARx韃H$8=\_Wj 1!,x>'+E@N&+ m0(};⌆}_wKiPW݈Xb쳼 ,*X\ y,p;CGqKZ!~I](%pE x"3l9P15iQu4p/.+d=v`Ã@ rZV)COc<0/5;K@x@ 1 aiq c6X^=Ӂ=l1 <v^K"&[)ZLsqmc4䊑ڽw !B[hϗSLw|}) T퉳!dL=͓`ޒPM$w&s_V J)(,~p`1#4"35G۠4 a*Cڱ}TmR@öci)eQQh>6f1y҃R0Z*9ljd^5J\sQKUs8-O尵4͛-qh)4-TwRhKiRbWTjQKQKMRVFQh)iZfXZvN@:6w^TۋZi=ۤJuԺwH&AἎޙ"aPb7"[(bgaH*CT^)ۆdD!I,PNzP+> s7[m%˻BQy(k5 Ω*YimaP9vyZ)~;qe E254KU\;KЎxf$iP"AY-Ყ5G)6۵TTv|Rd#k}B2rr$/1 Og N@.AJɵڬ]UO3k w)ǘ /Q% RNn#<`9eB$5NH{>s6FXD:_3AAbi 7ڄC|DQdeEGEge}m1zعt-~kJh_Wo ③^0Ƭ^"|HI9GC߽fAw[")}{  D8DPVjpPEBOa[3&(2`5g7ۦ㄀B"l8ۧH>I:L0fNǹw( CςI,iq`Fȓ9JMY AԌ_@T1^;n}6ֶS%a!In&K㬐Y c*KdP0] z>ꨫ]ځI 'c TmP$^Z QQtN͇QJpƮ0Y#{T6 uSiyHh }#\!z4KxfϦ7T $HۨViG"7ԣeI|w)n'Y 9pGRdG$ p#nG9#lAf T:5W0PIa$Z6ɅPu[ aN>nˆ*\jƫnOٻ8nLeGH:G%~Ju,efd'{ȞԚ'9kؖ61?\ntB.i+wGH̉~Iqo^U}oM& 3Pݝ[4-*Rg>[+fsAE[)4JjwLn~/߹,*K (Cz%$%3 2I&>^ TGȰBo4= mzƅ#&#ŭ =bɎws1֫ gbuV+vhBY焀k@qjY@E|+{8%2w s8b ʴ rM"F9rB0pBFؑ1yփ/fY!*¡PkY!@a twΌMS噐h%/ r+8q?@Ah%Virsce)pF4NG;R*JuEjνiޭLŊJ[ӫ)8yI8 @)g^m+"Q.oL̟5D51=3@ڱ>ͱ wޔBjw <:k*R4ήXb%f]ZHtl/gFA4X1霙f@R#{+A^AܠZ <^[äc+H(cVT]k/pSҰx\Z2R*xz2dCJ)0)Z|',HiQx&P}N58_>m)E R,<|R&K8ۥ-\I)JQH>/רy?m) ;V-zR*(LJE1BAU-OcХ\KO^JI)8 )Z!G&W4dЋċ .) 
`NM*u(Sc rsn2YW1bE"A|ʴdU#&䪔xq딚z6 ]nY(qkV*< 8Knҝ\VZ GOZkĢtzNN]=>Rho$BJ8CIjwQ R[dcKq]kvYIBߛXWJ}QYA,gŊ;gW׋k;oy=m8LWdF_#搜yᄳxO}E8Wڮr^ڮtَ:tBxzzm`8փ?׈XTH'3M/d/"rߥ{*߲I%U?m=SD~K/ ]%%˔L)֦mYmAԩ-;f*7 &-d(?//z;ND0W[ۜeQshk/E-P]: 5kAҦYs(L)WHV*Ŋ’F B8Yyɸ鎚@h3~pp5 wfc(lqzn[F*bb?_+5OYWYKo7!r<}epdaJr RX;1"%erX9fXeeZLdVR{La!M 2^Y6nPUn2$SÅ@ ֡+4Y@Y]DZK&e3I 0BHN\ TCgU]:T<}4>|XR!_?G@ T,⏚}_Eq )0azy%jfb;뺲ɕ3ĽxQFvǩ|vܽhl.'c5wz9K`Q$p]8:˙p(ïض,ڍ.zl5ѓM=3Ar;0(Z|Э 3{ ,Z_#\s%n{lĪnjvϯLiFrK*VyKE>޹6ZZqA88fssqW$'5ܷq8mҋ%t덮{5X" /bK$xO\3BdkԮ]Ÿ.|A3dOQvZV^/cp?Kvo{e޽.nի/iջKfio#V_DZ{|@zu:3<0:mP2iF=peX%‰ Q30vPr+#EiVw|0s-d/+Xئ=\і'3?Of$ǹ# ܒ+ZPDzQ0-@PXS .62*2?pc H%5Z<-ןP#6>f7Zc-d\i(!=j\@7j(<:4?u:kT޼&XZ)b !#l)[#FiƠy)OMzQ|ߦw+uIQr{enRxA]iSQߚNL5$6BPlyj1ϔ)la 7m[=X"y [XBbpJj=Upp*zT\mY3C^%l?I(lAD ho6CKkdax5pUT|R2y~sGt}v@ߧrvw;˼>y&?sE:gFd !gܒIc֢sW (mBu˔%y83#=/2;R 0;iRYB^^KwVZkJަ)JR:Y¦E)sӨۘ.5f ~(fhfF̌u'24p*loUX:_54w7CˋqГ_3xYHtVhԀ]u`9wE=u:\1]qlGɀȓI>}9:Zvk]Uɷ`Bٲُx=<ɢA0JAgPnknPnaQZH8p}QcUZ?{ƭ K/[C;8yIcTHʹ_`H;ȈJh|h4 ~z?-"0WwynަuJEQ&V$⻓A_ 'S@MR4n/i^Ҹq{U;oS pl㵷!،8KCsŖ4Z9F5ux<**Iu;zy_D1AiggѯfڝȇcńcTXӟZX?yX^򄪈o;zlO". [;[z/i,>o\烾Lz!Fx.u S:G!~<Ô)>5唋U  ۏ7w"!`xᰕ5 ̀T2VUlbe߸ ZH43{*5?W(t^i+I,= }7h|R~2.>σqqp^sm\[ѯ^qq]nj r .\>0.1B5EFo6xޔAӉLO>jZ\-`K~ONgQY<>S2`v Vއ^|X}M*@Gg8v&~ԿK ,owZ{$Ʋ$1UKZ ey}4W2((2C-&⠂gC5R1x-$ {f02zP&mg^c 2/- UӒ7ʅB )r~ZP!**SZ61A2*W\cwH -i^Ii6N9Gز@h\w-\*!ꝇ7_UM|%/}C^oﮯY't>U\=gwA\(⧸(vz]~g'ӯQͱ~8'7і.pv?(ԕqKE.}qSTH81B]^³yO*.~Tm>#-ngdT,<8nhTsӅJjOōĭ!Lھm5d>pc2-]TE3Ū;Jzg'`K m Z%\#gxZ2Uz;[׻t2noѭv+Tb94hTQq B/('S1w)zj M-B0Ba0 \⎖FŖ>}U<}9l5#S$)Ҟy8Ɏϋ ?_ѣ7IAEGqVLQ숥fKV~L?֛?/VHr܂:ظ08\ˣ4[PY;K O=CGdȜ`Räɵ~;biOI]zE@P&n1r'ۊ.NY[_ք˶ / &)GP1Vh ס\2L 7&+J9$M+=^ QPYP3eE-cW1rzsdؔ0e'œ3 3%wu!~~3cś-}/9<<v"ŗ03a9_+`Ⱦ8#PW΋BI@FzXΖy3Օ9rbRσ/R0t5WJlcF`8lߔA ,nY|T)Y;>8Fgx&UG!~ScG z@w Ψ0jsjG0U\m̪~erNPL=2\h(NV(TѨ$M0I9QrN`<a.۪P'Dm^݉DMbwn;Eŵ kW*v*q Y]~q8ݬЋ.FP*vq%fpC|1~È=~^0qS<`^ r__.~:u^nBNK\(qW>u/LRdpBDxk^v)ev>rY9e-rf!`K$"&He1l9SM IQ|e9$O P@△.ս\lՋעA L%MN'1]d3+$.G:VN[  !@Ð'ǝCpXB!$mHGmØ KV2R͈QGјD*1"49%yUjپ-'4I_xmZ %x߼΋Z4R;O9@u2c]B 6IGh)gRr! Ն9xq1#Ӛ""4QBR tڄx'QBdsSx'$O8)/hxW1;  #zP@R^^zCbx&ۆQ}lanճgdӅF{AD\B7߆qZ1~dzy;cf4lfx=EpJ62`3PάG=O(gxmTu94ݳkX擤foesiqܙRi/>_xUv84v#Wv4NV,XYt5oݞiZgYjJ^Ӻ/2r˩YLJJyT 3v0)WtU=ISNpPƿJܯwvV=Ѯk4*-f\HQVy_?׫ ߼l|n3Ӏ^Oʷ06J<ӑ#e=&<$-Eo \)ɿ<<)N\/fa=ڙ;nkCELaO~[nڭ.9SCY-8[j&$䅋Le܍U3Oӛyi&Ix3},N9O7$*F5˻j/1㊵U_p}Q*ēA :L?q8d)&.?hL9l/̸ d#0Ϗ $$'73hRV]twD>@e.ƻ4m@v) m薱h4Mt>X#;^w4K:ʳX1}rcqKAG$X3ʰ?O bx{v'Gͦ'ǥ{|xAgv<4E ?$|4 n,Q4FBXEޫjh-3E9:+Pٽe27WH?{F!_*|,p;`2Gxme^KJjI6[-Ɏ$vbz􈽰>ONqb x%H9˶ ðrpl 5ArRϡ_v{lIUrAWN 8ؕZ#WeiXWs#GTU+>M?O|ImO8]Bw̅Un!&wJ2Ke,Mi\!i`-i .AC |-$v&F'?fz}ͮci=@V2ren̚m|]>+eJhЏ}`KL;Dh9E05SYg,ꄁPF ƈP SN`AllݿSy='H|ˆߖz=V,S1;_$!R,&W9j5G;q `8.m/vfm<+ۯWoEۘ@p]Xhr% pOۧ1Ppm;O;^;8JKNQֈEXPSnNKE\#O,BMfL2Juje˖q 8l~, ;ne^d^׿<߶PCuZFE_l%d #H4`I:1H6KinzD- U2sbZ8e:TA!)qP% zAT ʪ+&yLH__@qS2\WeUʠ!TW{9fat^DݹBٓ UyA\*VGH b|< ͌LR/Ktӓ&[kM`j5#As\,} H kp! )kv5*/0׊XMHnh$h#T RL)؎{b9Å**]} 'qz@ Ɵ>\^uU*pV-E:T iԎ=TiS|>B:pTh61=|Z[4NGHR U:^;841N:G4R3|1BzSׇ8"-pM5Eeʃ3Ie[7aM-( P2;I'B,̦iN+JGw#g. 6u=ʻh?wvoKbSd(x&m5RȂNk9`$+P $ŽYHy3 G +0j R`7JDNx/38r/ t@ݭ咵+ QMKrTaj2SظK]K禊]gsVHy1i.J!GR?lv۟|JVuaY,Ϋ֓wgΒ&Ζ|Ʌ܄//7iÞ'|~}5p. y, d)Y_eCϾ~ϱ?2d=|2ywͫ;bd!dx Š\%2UlFiC㖐-o^M|Y9q!X䅼"^禹YF+yBFYxYȋOY |'ίO>f=7*74IZh.DAs\B˸ֻ=0JneAKVBW p= #S_}D炙MnmZZS@:UY.O}xfPltWfT;Nޥfafw%.AoXÂ7yAse @"G7/B vt_K|Bꃨ&ݸ,1D^s{wr4<P$OsE0A6߯V !X@N p̠{P?g'D9A T-"{'U/hBWh*y<+6Y1]`6 Fk'MMprreW.?w*`/sp[-(2^&j(}֓ mg8 2Y#䍨aKh0Fas^:F2es6bb9$^.!FqPHBjj,D<+2^YaT@ DJi%qCN=wcc 6qrWu؎/:! 
V㘨kPגjt)M"!RZj.kYXh7)ov_.Ac[v;B xAf|݊B`T&BYW[%xmCm>w7>.q BnD#{Zug w JU`}AJLtF&ɶζc۪}p6n eA:q%cп]Aw.5/:9>=*8H)fROߞtQo)Co}r8Hh* ޾۸v.'B3-8'DB"r(TJ -J[(VA^I]eTڂvj}R6w*ynQx)W6ZJH,BDf8 )+ŀ c@&)w*ߵւ v5"H׌,&ȸ+#@;"c;p'jkօTs Jvho>UNvf>iW_@#^_0a5Y42TDT@f/Ÿ٧\PACj85d! w__~$K"oc%048vq k8[c1F/ؕl> O* W)cQ/-l.Y|3OxK`!OV*8պF7gV58=!] %?;9]̇N*ž܌~ޫ!-S!|ȀH Y!dαy,.V3RYr S\c(j "f!Zb!}06?7E)Y_Fu流 ck.'fC]3ETyҊ#8mӁj M$QT++YJRͶB E.Mʼ2 ̑p2:,zNi]Gmkn\i' B9܀Q((k5&|j 8B 6K$?.|P &(-\W;SШdD͙\]9H[C`nUyWK;7VG6&oRj)(9幡 wtu|r. 1>=5LiEÒ 6SL+R{:>܋ޭ/'1- HOؕU~DFhϲ O\PMApWJr(jr H2ap!`{J15Rcf.4! ` fW]`va7a- mF4Xfsv l!2C'umRٛK)5̘{GPS&b0ZZ}9e>.>j{W)BOeC?aF0v*agz2oẎKü4nR٪h5Kqbsq4IQvb믥v&h=wp.'p <=@ خYޫG⣁`4e^‰YCp7+54\~STvRTKް5EUvҫ>>}X\ԇ Bj-׌g/ S3,C ,R8  dR,!8z$eRNvϩv7cXXyvZ԰[s:U,wcۛbnrX~]_O&fW?Z7KO.}5Z/1櫔:}8Q GvA/+oﵧQ#uV/z:C`؋>lj&ջ!|Yc@Y{( Y+lˤTiZZf=kyK(R15kj ޮWeLJus}۞S% DBʾL&K4GgI;BK.} O&Ka%{vn'@ _k5xu8owU&uU͸0& 1b~w؈-"8Vf!=TĠ-)Ϛ)ZgP5;#OAAr4VaF3kzݥ#g炇/ܷ?ך..6E0֒Apr΃NF$ Qs15juRĸl,DF1`oow+N|wp)2 Y8YZ)y'Ʉ%VKĄeYq$B1P5cw㺽dT<'!U26M_dѱdFZS1L,e%%csL ɍQ"me%cˁ"jT^w{'Ktn+ƚt VNΛݎ@A`bK P)bG _][Sɒ+q#J%*뼞q6q$hn Bj[}Y8=M% canO7lk.v=Nfјz*M<nox%ݠz&hr/ݓ8ʹS.C[Kղ掯waǨfmu~#Twޮ]~)~S^o~vY7}vwjV. jk߁jΊ~|}ٖ4hLєþu؝ =>Lj:J'R9Elrc 7C\DPzʣk9:^2НdZq1j߮}!47x0 m [2 YTEi9ۻl!ڴc  ]H͹b-%/ŁR*tmŨ)GbyS,UeKUG4Yj>@joO%u`%b(&#ymH[Fc$,H[V԰)uKQ[|\gK y_{Q\|vL, իb)87oW(Kӿ ɫ9wW9-fͲ FYFy=-&!u4#&!q2/pz;F<d=F#riIO36`n}ŏku&SҜ]ίmL_'z:!dGg!ǻ!+05?+p~ &i<__/涇'1mHV{߮;ͺWq'Zn5M@E$"YFb/<ŭ_Xy^M3b-|Hװw|\#Y˩־Q˚/ȋW"_)l_[ں0M;l ّ|epHyeQϩvRUHC!aɦ,]ДvZYcb!*`bFNy+PP3$g~J4SGNK A/~S|u 직f/E Is=Tr"WPUjYTVI]k ?d3W/UQ"(>-U 9%WX3:"(۠*7`22Yl|MjChZkS>F>j@M(Nn?@+VhOT¨67{G3ԯLاl@َF{߳Z<\GgXwCA[l6 ?G1VtCvΐ=VccnKUEXK?j"k;_KwK}!5Jꞏ>M,}hsA3(xf ( 1DǴkJ%aXԆZZ--Ŗ^)x[;4R4g?+?n|?gZ~{~}|w6lq $%JIX/o;պEĒE%^%A\U.N7/O䄟zd@v O SkGl dǃ*W&YȎK[ȲzQ!FR:ş<P4N:Z>D*өga9y +xN\t+a՟۩ŒE? B݇ "3N$>C`bp E䑋ÍDB~zV6khhC˫'=n e|PUK(V0i#6&PT2?YQƧ=튶=/h{?3kd>X;+搫=! EPܾ7U(~y$ &ZKt 1Xi s09nVFmj](]wI})_G,]-Dž K- i{iO \+o԰ߢ:t1'_RB7KH0?+Պ.E_ա&[kUU |z$O=.z$Ѳd.`t~ϐfxstKU%cI19HFxl*cӇ*2 R4nyr`O?jW'K`mPRE*I;PIz0\(<SX!d\sXr.#ͩq(#)=fccf??OSЉjg~s\QvǞ('7v,Eڍ-I3uXK*A)UҤUl*ђCGbVaws0:6:(Y]Z*& +tB:?ZtY\,g^9[_v.HN4JUImp(R> y1m|je{{aw?=yC1S 0x7􏓥gg^u)xA g} S{Eg`oOvsm@.n ɘJTP˞oECh|g<9VQ\o|ğhp.X' fҌ=i̻DkAFžre[izAlb<9_2ɫ.6-X[oF_0/We;UQ#V." Ky XEbZ(UIPpήM=I9WJw!k‡lW(3}9'Pkx<Ӻ L$7RXo\opI|P9uqƒaON6F 6s};e._m5lz];pK`gOģ^RSU"^-ѐg;x|s hH={V~u+zP_q9'^yRYҙ5nPv5` Phkb{3>g&5vס*#`a*Z!`( ?zԣ_ܷ <ijX[bϙ]25sV@P7hGeFy;\"jI\՛+߹,ǂT:њƋҫ( Rh% X춽6#O9)3Ed3v6:bZqq<Xt(V. 
4ُ9JB:s)e/vry'3Xtk1YU <Ãm)R8N/R]C7khogv[*4^rDoΰ+$ Y)i: $UUPKį#>ݐQ~Eǎ^AWhf3k 7ST %?Y5n(h#z(JNQZFvk>yi+YrEH_ڶ9.(sΖV-h S;f}j=GsJ2叫!&R9UgdP>+S pV?Y)IL6vpFQ<8h]c]0!lȨ>CX8J#V QFy5}TΔU hD+)ExCjnDTCqYiuȇE˭PyPrB4vHGֵ[7>tLVC(B}3̶ - 36ȓKs-7"ܑZ#r wڨ=[i8JxC#(fzد6{k/ez$#nL`~wwk T) [\1%ސf%TcÎ!Gm5[ ߜ5B Mt(T jդs՜3zqSg@1.QRRkpaК?{Wȑ 1Uh0,anmYyhKGo$HRAآcKT2ȈHo@M3 ͗DpQ%WDTt&^[NE材脷,ϩGoޞD-9qv;(t~Bn\lvU.x3iC1?-sJ#C8ɕSׅTX$V$3[ʪ | U'V,o` 8R8[IVB N@Q1;Nq*Xnl[ N4%8;rz@Ι$cl(  rBS ` \g=q(E< s8IC+C*8p JȣFcZ'Ze`(#"Ě1N`k @HBF9?jDKə8jBwG&EQqb2ǜHM(\j e<|e/$HC3/a dždm"sxVaxwflr@^@4cU !ǰS{|j(.XRrBbuv ә%m.s:,¬$zqJ}DЗ?](+D1c3@3#πOGs fWmRUYt{tʕf$VK2Y7֟`^;9Z.O4L'_PJhMw$" Qob}ga=kï{?E=gzC75rPI-CP{}-ԥbiAuh ;ΉŘ3<͏gNYjv2qDI}\|??`kdIxp\W~'`tr3ŖޝKA IƌEïVzX aPao9 ?S۩VEA6.{3p -`&{] ;X35"-txepZ++%ў$޷+7i Rvrtϡc{/>bf MfQ:-~$Uw;RH!ηY)VuA;Jï4ft +ĞfTQAKDa&J5QD5N,큝O$ʀy$ۜ"$t:F /7TZNJ$)eÕw '+ I{`SA^ujrgY،N* ooxxOq"|ʿXΆ o  ╗Q`|11{/VGi)`(1ê e;%b(BRU;ivZUX!(c3yPJr[uH\5)u"?k2t(,IB6qciɑ2γK-W$iC@LΚ|dĒOaXQGVl8.fªckG{(FJ{ۻz``;Xvv C8i#I"$5+ȿ{=011^2NP@l"46(k$xFm# i9Ĝ%iԲ9T3 f -Apxv;̸צy $Rf sV Aej/Kŕ ~{yQT_&OlV֚z/w9%9.99?{͉ߏOo&p \O޽Sa<׳úCM=|w}O5ΉBscwe&ӻݯ1\@L{[ IPrJLzL//wKB!@DՋ|RL >JE-Ț=(R[~t1gcV}6MO+`ۯzHz;!8BE310,OL6I 67Z&ՈplBn7ẃR|1H3\e|򞃳η )S0{Uy@[Z#k߃aVwv~]x^4^h-}=󔬠,FTsjN@Kwi.MٜjPDlNiR`.eE n,&To)8dS(5onHOzT7O\25i la^{az5OK\,ZҐMt tvՓ}?p//|sB0"E6q gkձCW -ffdii)%ݶ0%Kq=OI&Vr#1[ eΥJ@[3@R?ӻB(9豳gO?BC- wZ}}3@0iN|VA1{ +Btz m8Lz_~rcVXhբP\DʛEF~>؋biTi8zѼ2 'c*Wb1W!6*yb19R32Ķ'qBIsTaT}>l|~s!@! ènղxNTRF0.(°c\ spGM/h70֘׽rrQՀKS9w$-d92z>N/޹דJ>D56J ! ³ vl YnX;[1&?-41 nM5嫄w}OU|!OU=+F`T5Ps}շ>DQݸo})|@,Sz# 7'VP%ki|Y`}5%Hu~?Z0څ.7Ju^|i64䅫h#>u 羛b:MǨbwi5D"Ѻ!/\ETEW#M:13vkAcl8Fmwhݚ߭!/\EWt&qAB\k~"6 k+b Л<zMIщ>_2Xsw5wX&j! LAu_7N  } b/˴7Zj. e.H`VZF@">r! hᤈYD#OV5=U50 (7:P9)ՠ;sPʵ bjӨ&: >v)Q<ڱciuUmꄖќbڽҦSpCBJ9RP!17Ab+=E\h ef8[iw-r$t^)LՓh4eF'n\SR;VTH*PiZV(.dx!CS(sO)oX1\>WڨTEkbxVpN3uJ0,A:`b HFv(hV+'  -EV`#fiC)Q &KeA@d7&H֏3vIR+h $ndצZ2BKAmp +I `W~yʎ>,x$i,qrdEłOLL58? eáI9p"%&ʙ l d<.P_0-P84Y h 6ϟP H-ttFiUdT%H6 TV]lP| lQ>\᪪R!/+⚯ U)x`+Tigjkl٨wg&ݙSF-hlT%'kv-@$]x~ym_#f&(S-ÊTS&ﷵӘjhzF/ѝ54rr J[U` EZM1Hi:F"yx>ѝi5f4H y*HV%0xd4#))t*md@;nͷ[UN &JuW({JU\DXnJ],x27%n7&r&53 ԶBc8+^mp~z,2fmQ8cM<_8 Bu|.Õfj//MhF{!dz[TW A'7Ez? 
11117ms (09:55:28.035) Feb 20 09:55:28 crc kubenswrapper[4962]: Trace[1064268399]: [11.117793162s] [11.117793162s] END Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.035928 4962 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 20 09:55:28 crc kubenswrapper[4962]: E0220 09:55:28.036329 4962 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.036398 4962 csr.go:261] certificate signing request csr-s2kkh is approved, waiting to be issued Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.042015 4962 trace.go:236] Trace[871747541]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (20-Feb-2026 09:55:13.577) (total time: 14464ms): Feb 20 09:55:28 crc kubenswrapper[4962]: Trace[871747541]: ---"Objects listed" error: 14464ms (09:55:28.041) Feb 20 09:55:28 crc kubenswrapper[4962]: Trace[871747541]: [14.464075109s] [14.464075109s] END Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.042078 4962 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.042712 4962 apiserver.go:52] "Watching apiserver" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.042718 4962 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.045575 4962 trace.go:236] Trace[810486363]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (20-Feb-2026 09:55:17.833) (total time: 10211ms): Feb 20 09:55:28 crc kubenswrapper[4962]: Trace[810486363]: ---"Objects listed" error: 10209ms (09:55:28.043) Feb 20 09:55:28 crc kubenswrapper[4962]: Trace[810486363]: [10.211979398s] [10.211979398s] END Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.045616 4962 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.047533 4962 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.047997 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb"] Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.048659 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.048767 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.049054 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:55:28 crc kubenswrapper[4962]: E0220 09:55:28.049177 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.049192 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 20 09:55:28 crc kubenswrapper[4962]: E0220 09:55:28.049314 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.049661 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.049717 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 20 09:55:28 crc kubenswrapper[4962]: E0220 09:55:28.049744 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.049820 4962 csr.go:257] certificate signing request csr-s2kkh is issued Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.052726 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.052900 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.053353 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.053480 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.053514 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.053927 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.054252 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.054405 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.054917 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.054997 4962 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.062913 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 17:59:03.33614956 +0000 UTC Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.084498 4962 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:37862->192.168.126.11:17697: read: connection reset by peer" start-of-body= Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.084557 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:37862->192.168.126.11:17697: read: connection reset by peer" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.084555 4962 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:37868->192.168.126.11:17697: read: connection reset by peer" start-of-body= Feb 20 09:55:28 crc 
kubenswrapper[4962]: I0220 09:55:28.084647 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:37868->192.168.126.11:17697: read: connection reset by peer" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.092585 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.105044 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.129101 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.143654 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.143714 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.143749 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.143783 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.143825 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.143854 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.143888 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.143943 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.143994 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.144024 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.144051 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.144082 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.144113 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.144145 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.144207 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.144263 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.144293 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.144325 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.144356 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.144385 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.144445 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.144474 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.144505 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.144535 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.144568 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.144621 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.144654 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.144683 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.144712 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.144748 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.144779 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.144808 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.144839 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.144872 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.144907 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.144067 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.144937 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.144109 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.144278 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.144800 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.144899 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.144969 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.145030 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.145073 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.145105 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.145136 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.145164 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.145193 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.145223 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.145239 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.145260 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.145253 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.145343 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.145365 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.145383 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.145417 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.145444 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.145452 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.145455 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.145658 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.145665 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.145822 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.146054 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.146176 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.146295 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.146510 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.146638 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.145463 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.146701 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.146731 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.146756 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.146780 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.146805 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.146828 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.146843 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.146850 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.146871 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.146930 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.146963 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.146987 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.146985 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147013 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147089 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147111 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147112 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147128 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147150 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147158 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147170 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147187 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147205 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147224 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147240 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147258 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147273 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147289 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147306 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147322 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147337 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147353 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147368 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147382 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147398 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147413 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147428 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147443 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147460 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147478 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: 
\"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147496 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147511 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147525 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147541 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147558 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147604 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147621 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147638 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147652 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147669 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147688 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147705 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147721 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147736 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147753 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147773 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147788 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147803 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147820 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147834 4962 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147851 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147866 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147881 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147895 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147910 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147925 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147939 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147956 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147972 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147987 4962 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.148004 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.148021 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.148036 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.148052 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.148070 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.148085 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.148133 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.148149 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.148165 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.148182 4962 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.148252 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.148271 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.148288 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.148305 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.148322 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.148338 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.148356 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.148388 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.148405 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 20 09:55:28 crc kubenswrapper[4962]: 
I0220 09:55:28.148421 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.148438 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.148454 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.148470 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.148486 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.148503 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.148521 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.148537 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.148554 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.148570 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 20 09:55:28 crc 
kubenswrapper[4962]: I0220 09:55:28.149144 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.149164 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.149179 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.149196 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.149217 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.149233 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.149252 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.149269 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.149286 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.149302 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: 
\"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.149318 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.149334 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.149350 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.149371 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.149386 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.149425 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.149444 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.149462 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.149479 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.149499 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") 
pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.149520 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.149542 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.149562 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.149581 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.149613 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.149631 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.149697 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.149715 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.149733 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.149752 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod 
\"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.149770 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.149787 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.149803 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.149819 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.149835 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.149851 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.149869 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.149885 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.149901 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.149918 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.149937 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.149956 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.149972 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.149988 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.150004 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.150019 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.150036 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.150053 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.150068 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.150086 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.150127 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.150150 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.150171 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.150190 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.150210 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.150231 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.150250 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.150268 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.150284 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.150305 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.150322 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.150342 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.150360 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.150378 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.150430 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.150442 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.150453 4962 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.150463 4962 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.150473 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.150483 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.150493 4962 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.150502 4962 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.150513 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.150524 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.150536 4962 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.150549 4962 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.150562 4962 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.150575 4962 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.150605 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.150616 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.150626 4962 reconciler_common.go:293] "Volume detached for volume 
\"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.150637 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.150646 4962 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.150656 4962 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.150668 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.150677 4962 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.154245 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.156942 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.157745 4962 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.158886 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.163420 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147424 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147371 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147639 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.147656 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.148045 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.148122 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.148271 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.148332 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.148405 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.148416 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.148505 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.148540 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.148708 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.148762 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.149042 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.149061 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.150136 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.150275 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.150307 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.150638 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.150800 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.150859 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.151108 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.151164 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). 
InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.151247 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.151446 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.151818 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.152087 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.152425 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.152435 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.152680 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.152784 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.152972 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.153083 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.153294 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.153451 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.153889 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.154146 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.154440 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.154665 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.154671 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.154718 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.154766 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.154885 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.154909 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.155000 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). 
InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.155041 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.155063 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.155061 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.155687 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.155834 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.155999 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: E0220 09:55:28.156434 4962 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.156713 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.156723 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: E0220 09:55:28.156808 4962 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.157105 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.157447 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.157483 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.157492 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.157552 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.157550 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.158169 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.158487 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.158585 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.158633 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.158649 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.159148 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.159153 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.159241 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.159453 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.159466 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.159510 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.159862 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.160101 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.160569 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.160696 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.162322 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.162516 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.162515 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.162544 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.162560 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.162813 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.163021 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.165851 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.166754 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.167323 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.167338 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.167795 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.168051 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.168279 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.168806 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.171946 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.171994 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.172082 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.172253 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.172282 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.172675 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.173897 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.174873 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.175041 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.175577 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.175682 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.175759 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.175739 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.176122 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.176199 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.176574 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.176755 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.176993 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.177001 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.176977 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.177071 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.177349 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.177656 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). 
InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.177709 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.177742 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.178252 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: E0220 09:55:28.178360 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-20 09:55:28.665669704 +0000 UTC m=+20.248141560 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 20 09:55:28 crc kubenswrapper[4962]: E0220 09:55:28.178539 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-20 09:55:28.678393878 +0000 UTC m=+20.260865724 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.178763 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.178817 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.178826 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.178935 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.179031 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: E0220 09:55:28.179114 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:55:28.679097101 +0000 UTC m=+20.261568947 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.179577 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.179884 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.180074 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.180094 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.180197 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.180377 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.180410 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.180562 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.180458 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.180934 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.181358 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: E0220 09:55:28.183663 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 20 09:55:28 crc kubenswrapper[4962]: E0220 09:55:28.183848 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 20 09:55:28 crc kubenswrapper[4962]: E0220 09:55:28.183877 4962 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 09:55:28 crc kubenswrapper[4962]: E0220 09:55:28.183980 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-20 09:55:28.683958535 +0000 UTC m=+20.266430381 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.184253 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.184750 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.185174 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.185392 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.185897 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.185904 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.187765 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.189691 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.187833 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.189728 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.190891 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.191307 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: E0220 09:55:28.192441 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 20 09:55:28 crc kubenswrapper[4962]: E0220 09:55:28.192478 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 20 09:55:28 crc kubenswrapper[4962]: E0220 09:55:28.192495 4962 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 09:55:28 crc kubenswrapper[4962]: E0220 09:55:28.192563 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-20 09:55:28.692541388 +0000 UTC m=+20.275013454 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.193245 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.193372 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.194066 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.194417 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.194555 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.194758 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.194914 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.195365 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.196045 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.196226 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.197063 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.197645 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.197774 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.199745 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.199857 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.200262 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.200757 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.201374 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.201438 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.201569 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.201871 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.202666 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.202927 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.203060 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.203069 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.203303 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.203950 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.205293 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.205279 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.205579 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.205757 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.205783 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.208586 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.219050 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.224837 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.238455 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.238945 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.251188 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.251240 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.251295 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.251325 4962 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.251335 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.251346 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.251355 4962 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.251367 4962 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.251400 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.251418 4962 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.251427 4962 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.251436 4962 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.251446 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.251455 4962 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.251482 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.251492 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.251501 4962 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.251510 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.251520 4962 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.251529 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.251553 4962 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.251563 4962 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.251572 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.251582 4962 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.251774 4962 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.251914 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.251928 4962 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.251937 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252116 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252139 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252151 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252162 4962 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252189 4962 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252183 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252199 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252300 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252314 4962 reconciler_common.go:293] "Volume 
detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252340 4962 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252352 4962 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252364 4962 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252378 4962 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252390 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252401 4962 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252416 4962 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252427 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252438 4962 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252451 4962 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252462 4962 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252473 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252485 4962 
reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252496 4962 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252509 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252520 4962 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252531 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252542 4962 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252553 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252565 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252577 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252609 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252621 4962 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252631 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252643 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252655 4962 
reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252673 4962 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252685 4962 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252697 4962 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252709 4962 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252720 4962 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252732 4962 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252743 4962 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252754 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252764 4962 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252775 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252786 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252797 4962 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252808 4962 
reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252817 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252826 4962 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252835 4962 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252844 4962 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252855 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252865 4962 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252873 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252881 4962 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252889 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252896 4962 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252904 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252913 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252921 4962 reconciler_common.go:293] "Volume detached for 
volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252928 4962 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252936 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252944 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252952 4962 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252960 4962 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252968 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252979 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252987 4962 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.252995 4962 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.253004 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.253013 4962 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.253023 4962 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.253031 4962 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.253039 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.253049 4962 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.253058 4962 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.253066 4962 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.253074 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.253082 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.253090 4962 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.253098 4962 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.253107 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.253115 4962 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.253123 4962 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.253131 4962 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.253139 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node 
\"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.253148 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.253158 4962 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.253170 4962 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.253180 4962 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.253193 4962 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.253204 4962 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.253214 4962 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.253224 4962 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.253235 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.253246 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.253258 4962 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.253267 4962 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.253277 4962 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") 
on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.253290 4962 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.253301 4962 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.253313 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.253325 4962 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.253336 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.253348 4962 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.253357 4962 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.253366 4962 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.253374 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.253384 4962 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.254324 4962 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.254342 4962 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.254361 4962 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.254372 4962 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.254397 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.254408 4962 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.254417 4962 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.254425 4962 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.254433 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.254441 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.254449 4962 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.254456 4962 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.254465 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.254473 4962 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.254481 4962 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.254489 4962 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" 
DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.254497 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.254509 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.254517 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.254525 4962 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.254534 4962 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.254542 4962 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.254551 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.254559 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.254568 4962 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.254577 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.254585 4962 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.254615 4962 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.254625 4962 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.254632 4962 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.254641 4962 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.254649 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.254657 4962 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.254665 4962 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.254672 4962 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.254680 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.254690 4962 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.254702 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.254711 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.267338 4962 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="e3256d5e9aadc1c300c0388ad5b9682b874dd957f731bde6642c649b044b16ed" exitCode=255 Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.267397 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"e3256d5e9aadc1c300c0388ad5b9682b874dd957f731bde6642c649b044b16ed"} Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.280069 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.291536 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod 
was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.302433 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.312487 4962 scope.go:117] "RemoveContainer" containerID="e3256d5e9aadc1c300c0388ad5b9682b874dd957f731bde6642c649b044b16ed" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.312859 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.315469 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.331098 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.347497 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.369890 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.382101 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 20 09:55:28 crc kubenswrapper[4962]: W0220 09:55:28.384135 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-e93f6c4f31e43247e8c3c1427ad99fc52f7d208ff6834da1bbb7d899f683b6a7 WatchSource:0}: Error finding container e93f6c4f31e43247e8c3c1427ad99fc52f7d208ff6834da1bbb7d899f683b6a7: Status 404 returned error can't find the container with id e93f6c4f31e43247e8c3c1427ad99fc52f7d208ff6834da1bbb7d899f683b6a7 Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.391633 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 20 09:55:28 crc kubenswrapper[4962]: W0220 09:55:28.409794 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-5fe2678ed1ae6f7a6aaeac8ebf1de18ac37c401a8d715114787b6e3efaf3edc7 WatchSource:0}: Error finding container 5fe2678ed1ae6f7a6aaeac8ebf1de18ac37c401a8d715114787b6e3efaf3edc7: Status 404 returned error can't find the container with id 5fe2678ed1ae6f7a6aaeac8ebf1de18ac37c401a8d715114787b6e3efaf3edc7 Feb 20 09:55:28 crc kubenswrapper[4962]: W0220 09:55:28.411692 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-959b3593c1218d1c3b5846f7057bf9a4e00402d70673ddcae7eb952d6cc2b7a7 WatchSource:0}: Error finding container 959b3593c1218d1c3b5846f7057bf9a4e00402d70673ddcae7eb952d6cc2b7a7: Status 404 returned error can't find the container with id 959b3593c1218d1c3b5846f7057bf9a4e00402d70673ddcae7eb952d6cc2b7a7 Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.523922 4962 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.758920 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.758985 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.759007 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.759030 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.759048 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:55:28 crc kubenswrapper[4962]: E0220 09:55:28.759158 4962 
projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 20 09:55:28 crc kubenswrapper[4962]: E0220 09:55:28.759173 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 20 09:55:28 crc kubenswrapper[4962]: E0220 09:55:28.759184 4962 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 09:55:28 crc kubenswrapper[4962]: E0220 09:55:28.759223 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-20 09:55:29.759210381 +0000 UTC m=+21.341682227 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 09:55:28 crc kubenswrapper[4962]: E0220 09:55:28.759267 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:55:29.759262453 +0000 UTC m=+21.341734299 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:55:28 crc kubenswrapper[4962]: E0220 09:55:28.759295 4962 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 20 09:55:28 crc kubenswrapper[4962]: E0220 09:55:28.759322 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-20 09:55:29.759317514 +0000 UTC m=+21.341789360 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 20 09:55:28 crc kubenswrapper[4962]: E0220 09:55:28.759358 4962 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 20 09:55:28 crc kubenswrapper[4962]: E0220 09:55:28.759376 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-20 09:55:29.759370736 +0000 UTC m=+21.341842582 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 20 09:55:28 crc kubenswrapper[4962]: E0220 09:55:28.759410 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 20 09:55:28 crc kubenswrapper[4962]: E0220 09:55:28.759419 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 20 09:55:28 crc kubenswrapper[4962]: E0220 09:55:28.759426 4962 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 09:55:28 crc kubenswrapper[4962]: E0220 09:55:28.759443 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-20 09:55:29.759437948 +0000 UTC m=+21.341909784 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 09:55:28 crc kubenswrapper[4962]: I0220 09:55:28.883918 4962 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Feb 20 09:55:28 crc kubenswrapper[4962]: W0220 09:55:28.884107 4962 reflector.go:484] object-"openshift-network-node-identity"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-node-identity"/"openshift-service-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Feb 20 09:55:28 crc kubenswrapper[4962]: W0220 09:55:28.884133 4962 reflector.go:484] object-"openshift-network-operator"/"iptables-alerter-script": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-operator"/"iptables-alerter-script": Unexpected watch close - watch lasted less than a second and no items received Feb 20 09:55:28 crc kubenswrapper[4962]: W0220 09:55:28.884154 4962 reflector.go:484] object-"openshift-network-node-identity"/"env-overrides": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-node-identity"/"env-overrides": Unexpected watch close - watch lasted less than a second and no items received Feb 20 09:55:28 crc kubenswrapper[4962]: W0220 09:55:28.884169 4962 reflector.go:484] object-"openshift-network-operator"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-operator"/"kube-root-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Feb 20 09:55:28 crc kubenswrapper[4962]: W0220 09:55:28.884184 4962 reflector.go:484] k8s.io/client-go/informers/factory.go:160: watch of *v1.CSIDriver ended with: very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received Feb 20 09:55:28 crc kubenswrapper[4962]: W0220 09:55:28.884199 4962 reflector.go:484] k8s.io/client-go/informers/factory.go:160: watch of *v1.RuntimeClass ended with: very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received Feb 20 09:55:28 crc kubenswrapper[4962]: E0220 09:55:28.884248 4962 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/events\": read tcp 38.102.83.103:38906->38.102.83.103:6443: use of closed network connection" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.1895ebd09c196677 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-20 09:55:09.644793463 
+0000 UTC m=+1.227265349,LastTimestamp:2026-02-20 09:55:09.644793463 +0000 UTC m=+1.227265349,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 20 09:55:28 crc kubenswrapper[4962]: W0220 09:55:28.884356 4962 reflector.go:484] object-"openshift-network-operator"/"metrics-tls": watch of *v1.Secret ended with: very short watch: object-"openshift-network-operator"/"metrics-tls": Unexpected watch close - watch lasted less than a second and no items received Feb 20 09:55:28 crc kubenswrapper[4962]: W0220 09:55:28.884377 4962 reflector.go:484] k8s.io/client-go/informers/factory.go:160: watch of *v1.Service ended with: very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received Feb 20 09:55:28 crc kubenswrapper[4962]: W0220 09:55:28.884394 4962 reflector.go:484] k8s.io/client-go/informers/factory.go:160: watch of *v1.Node ended with: very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received Feb 20 09:55:28 crc kubenswrapper[4962]: W0220 09:55:28.884409 4962 reflector.go:484] object-"openshift-network-operator"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-operator"/"openshift-service-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Feb 20 09:55:28 crc kubenswrapper[4962]: W0220 09:55:28.884424 4962 reflector.go:484] object-"openshift-network-node-identity"/"ovnkube-identity-cm": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-node-identity"/"ovnkube-identity-cm": Unexpected watch close - watch lasted less than a second and no items received Feb 20 09:55:28 crc kubenswrapper[4962]: W0220 09:55:28.884439 4962 reflector.go:484] object-"openshift-network-node-identity"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-node-identity"/"kube-root-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Feb 20 09:55:28 crc kubenswrapper[4962]: W0220 09:55:28.884959 4962 reflector.go:484] object-"openshift-network-node-identity"/"network-node-identity-cert": watch of *v1.Secret ended with: very short watch: object-"openshift-network-node-identity"/"network-node-identity-cert": Unexpected watch close - watch lasted less than a second and no items received Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.051798 4962 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-20 09:50:28 +0000 UTC, rotation deadline is 2026-12-04 18:42:22.334820915 +0000 UTC Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.052098 4962 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6896h46m53.282727943s for next certificate rotation Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.063035 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 09:44:32.954128211 +0000 UTC Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.143625 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 
09:55:29.145053 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.147557 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.149014 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.151259 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.152484 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.153912 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.155764 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.156301 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:29Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.157323 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.159415 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.160279 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.161019 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.161539 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.162087 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.163888 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.164811 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.170817 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.171437 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.172923 4962 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.173884 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.174534 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.175955 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.176552 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.179996 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.180964 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.183282 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.184865 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.186852 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.188041 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.193462 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.194470 4962 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.194708 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.199553 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.200297 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.200543 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:29Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.200929 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.203576 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.204838 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.206132 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.207002 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.208405 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.209066 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.210287 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.211088 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.212090 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" 
path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.212586 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.213780 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.214324 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.215477 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.215987 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.217011 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.217642 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.218137 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.218695 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.219530 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.222810 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:29Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.244466 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deab583c-05c7-4b7e-a3f6-c01081b17127\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3256d5e9aadc1c300c0388ad5b9682b874dd957f731bde6642c649b044b16ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3256d5e9aadc1c300c0388ad5b9682b874dd957f731bde6642c649b044b16ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20
T09:55:28Z\\\",\\\"message\\\":\\\"amed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771581313\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771581313\\\\\\\\\\\\\\\" (2026-02-20 08:55:12 +0000 UTC to 2027-02-20 08:55:12 +0000 UTC (now=2026-02-20 09:55:28.067421835 +0000 UTC))\\\\\\\"\\\\nI0220 09:55:28.067484 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0220 09:55:28.067516 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0220 09:55:28.067549 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067613 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067655 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-161790428/tls.crt::/tmp/serving-cert-161790428/tls.key\\\\\\\"\\\\nI0220 09:55:28.067781 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0220 09:55:28.067827 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067825 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067865 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0220 09:55:28.067840 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0220 09:55:28.067995 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0220 09:55:28.068006 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0220 09:55:28.068582 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:29Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.272370 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.273334 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:29Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.274238 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0"} Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.274946 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.276256 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"650ed0852746d520fbaa4ec277717c5af3916ba72fb82349af01bdd53ed1916b"} Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.276278 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"f5aaa8582764286e5a930fbc4754f253b044e997b856840685dde84db1aef3b8"} Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.276293 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"959b3593c1218d1c3b5846f7057bf9a4e00402d70673ddcae7eb952d6cc2b7a7"} Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.277456 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"5fe2678ed1ae6f7a6aaeac8ebf1de18ac37c401a8d715114787b6e3efaf3edc7"} Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.278775 4962 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"e0820106bbb2993145f688614128fa51c5778e9953ee5fc00edba04ff06f3f92"} Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.278818 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"e93f6c4f31e43247e8c3c1427ad99fc52f7d208ff6834da1bbb7d899f683b6a7"} Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.299361 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:29Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.320693 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:29Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.340499 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:29Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.356785 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deab583c-05c7-4b7e-a3f6-c01081b17127\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3256d5e9aadc1c300c0388ad5b9682b874dd957f731bde6642c649b044b16ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"amed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed 
loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771581313\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771581313\\\\\\\\\\\\\\\" (2026-02-20 08:55:12 +0000 UTC to 2027-02-20 08:55:12 +0000 UTC (now=2026-02-20 09:55:28.067421835 +0000 UTC))\\\\\\\"\\\\nI0220 09:55:28.067484 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0220 09:55:28.067516 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0220 09:55:28.067549 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067613 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067655 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-161790428/tls.crt::/tmp/serving-cert-161790428/tls.key\\\\\\\"\\\\nI0220 09:55:28.067781 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0220 09:55:28.067827 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067825 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067865 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0220 09:55:28.067840 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0220 09:55:28.067995 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0220 09:55:28.068006 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0220 09:55:28.068582 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:29Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.374020 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:29Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.389156 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0820106bbb2993145f688614128fa51c5778e9953ee5fc00edba04ff06f3f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:29Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.401817 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:29Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.413840 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:29Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.430243 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://650ed0852746d520fbaa4ec277717c5af3916ba72fb82349af01bdd53ed1916b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5aaa8582764286e5a930fbc4754f253b044e997b856840685dde84db1aef3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMount
s\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:29Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.553501 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-s8xxr"] Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.553789 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-s8xxr" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.555216 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.555828 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.556821 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.579986 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deab583c-05c7-4b7e-a3f6-c01081b17127\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3256d5e9aadc1c300c0388ad5b9682b874dd957f731bde6642c649b044b16ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"amed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed 
loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771581313\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771581313\\\\\\\\\\\\\\\" (2026-02-20 08:55:12 +0000 UTC to 2027-02-20 08:55:12 +0000 UTC (now=2026-02-20 09:55:28.067421835 +0000 UTC))\\\\\\\"\\\\nI0220 09:55:28.067484 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0220 09:55:28.067516 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0220 09:55:28.067549 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067613 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067655 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-161790428/tls.crt::/tmp/serving-cert-161790428/tls.key\\\\\\\"\\\\nI0220 09:55:28.067781 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0220 09:55:28.067827 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067825 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067865 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0220 09:55:28.067840 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0220 09:55:28.067995 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0220 09:55:28.068006 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0220 09:55:28.068582 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:29Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.597438 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:29Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.620934 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0820106bbb2993145f688614128fa51c5778e9953ee5fc00edba04ff06f3f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:29Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.647890 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:29Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.665737 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:29Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.671304 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a431054f-57c5-41b7-93b2-2d2fbf9949ce-hosts-file\") pod \"node-resolver-s8xxr\" (UID: \"a431054f-57c5-41b7-93b2-2d2fbf9949ce\") " pod="openshift-dns/node-resolver-s8xxr" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.671340 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9fz6\" (UniqueName: \"kubernetes.io/projected/a431054f-57c5-41b7-93b2-2d2fbf9949ce-kube-api-access-p9fz6\") pod \"node-resolver-s8xxr\" (UID: \"a431054f-57c5-41b7-93b2-2d2fbf9949ce\") " pod="openshift-dns/node-resolver-s8xxr" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.703171 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://650ed0852746d520fbaa4ec277717c5af3916ba72fb82349af01bdd53ed1916b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5aaa8582764286e5a930fbc4754f253b044e997b856840685dde84db1aef3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:29Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.726476 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:29Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.742430 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s8xxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a431054f-57c5-41b7-93b2-2d2fbf9949ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9fz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s8xxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:29Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.771871 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.771965 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.772073 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.772098 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a431054f-57c5-41b7-93b2-2d2fbf9949ce-hosts-file\") pod \"node-resolver-s8xxr\" (UID: \"a431054f-57c5-41b7-93b2-2d2fbf9949ce\") " pod="openshift-dns/node-resolver-s8xxr" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.772119 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9fz6\" (UniqueName: \"kubernetes.io/projected/a431054f-57c5-41b7-93b2-2d2fbf9949ce-kube-api-access-p9fz6\") pod \"node-resolver-s8xxr\" (UID: \"a431054f-57c5-41b7-93b2-2d2fbf9949ce\") " pod="openshift-dns/node-resolver-s8xxr" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.772140 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.772162 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:55:29 crc kubenswrapper[4962]: E0220 09:55:29.772268 4962 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 20 09:55:29 crc kubenswrapper[4962]: E0220 09:55:29.772322 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-20 09:55:31.772306761 +0000 UTC m=+23.354778607 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 20 09:55:29 crc kubenswrapper[4962]: E0220 09:55:29.772384 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:55:31.772376603 +0000 UTC m=+23.354848449 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:55:29 crc kubenswrapper[4962]: E0220 09:55:29.772453 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 20 09:55:29 crc kubenswrapper[4962]: E0220 09:55:29.772471 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 20 09:55:29 crc kubenswrapper[4962]: E0220 09:55:29.772483 4962 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 09:55:29 crc kubenswrapper[4962]: E0220 09:55:29.772508 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-20 09:55:31.772501117 +0000 UTC m=+23.354972963 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 09:55:29 crc kubenswrapper[4962]: E0220 09:55:29.772563 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 20 09:55:29 crc kubenswrapper[4962]: E0220 09:55:29.772573 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 20 09:55:29 crc kubenswrapper[4962]: E0220 09:55:29.772581 4962 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 09:55:29 crc kubenswrapper[4962]: E0220 09:55:29.772630 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-20 09:55:31.77262186 +0000 UTC m=+23.355093706 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.772679 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a431054f-57c5-41b7-93b2-2d2fbf9949ce-hosts-file\") pod \"node-resolver-s8xxr\" (UID: \"a431054f-57c5-41b7-93b2-2d2fbf9949ce\") " pod="openshift-dns/node-resolver-s8xxr" Feb 20 09:55:29 crc kubenswrapper[4962]: E0220 09:55:29.772869 4962 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 20 09:55:29 crc kubenswrapper[4962]: E0220 09:55:29.773006 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-20 09:55:31.772980762 +0000 UTC m=+23.355452608 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.789642 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9fz6\" (UniqueName: \"kubernetes.io/projected/a431054f-57c5-41b7-93b2-2d2fbf9949ce-kube-api-access-p9fz6\") pod \"node-resolver-s8xxr\" (UID: \"a431054f-57c5-41b7-93b2-2d2fbf9949ce\") " pod="openshift-dns/node-resolver-s8xxr" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.800583 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.803058 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.864514 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-s8xxr" Feb 20 09:55:29 crc kubenswrapper[4962]: W0220 09:55:29.945058 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda431054f_57c5_41b7_93b2_2d2fbf9949ce.slice/crio-5def5c7723be7246f59094cb5773b98bc3c5d5cefc8592a4e372fc13611b133a WatchSource:0}: Error finding container 5def5c7723be7246f59094cb5773b98bc3c5d5cefc8592a4e372fc13611b133a: Status 404 returned error can't find the container with id 5def5c7723be7246f59094cb5773b98bc3c5d5cefc8592a4e372fc13611b133a Feb 20 09:55:29 crc kubenswrapper[4962]: I0220 09:55:29.972030 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.049982 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.064312 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 20:02:54.255647214 +0000 UTC Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.099921 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.138713 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.138827 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:55:30 crc kubenswrapper[4962]: E0220 09:55:30.138932 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.138852 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:55:30 crc kubenswrapper[4962]: E0220 09:55:30.139098 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:55:30 crc kubenswrapper[4962]: E0220 09:55:30.139275 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.213447 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.253494 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.283330 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-s8xxr" event={"ID":"a431054f-57c5-41b7-93b2-2d2fbf9949ce","Type":"ContainerStarted","Data":"15f0d587b054c3050286b2607687a4066cdd413e1623326cf355f9d93b10a2cf"} Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.283395 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-s8xxr" event={"ID":"a431054f-57c5-41b7-93b2-2d2fbf9949ce","Type":"ContainerStarted","Data":"5def5c7723be7246f59094cb5773b98bc3c5d5cefc8592a4e372fc13611b133a"} Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.295179 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.299611 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:30Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.315388 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:30Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.366916 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deab583c-05c7-4b7e-a3f6-c01081b17127\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3256d5e9aadc1c300c0388ad5b9682b874dd957f731bde6642c649b044b16ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"amed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed 
loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771581313\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771581313\\\\\\\\\\\\\\\" (2026-02-20 08:55:12 +0000 UTC to 2027-02-20 08:55:12 +0000 UTC (now=2026-02-20 09:55:28.067421835 +0000 UTC))\\\\\\\"\\\\nI0220 09:55:28.067484 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0220 09:55:28.067516 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0220 09:55:28.067549 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067613 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067655 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-161790428/tls.crt::/tmp/serving-cert-161790428/tls.key\\\\\\\"\\\\nI0220 09:55:28.067781 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0220 09:55:28.067827 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067825 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067865 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0220 09:55:28.067840 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0220 09:55:28.067995 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0220 09:55:28.068006 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0220 09:55:28.068582 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:30Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.423066 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:30Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.432522 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.448507 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-wqwgj"] Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.449004 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.451820 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.452730 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.452935 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.458955 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.459531 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.459718 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-7hj8w"] Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.459820 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0820106bbb2993145f688614128fa51c5778e9953ee5fc00edba04ff06f3f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:30Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.460283 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.462158 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-99b2s"] Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.462757 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.472383 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.472447 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.472929 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.473038 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-m9d46"] Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.474241 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.484795 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.484865 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.485513 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.485714 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.485928 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.486064 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.486190 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.486313 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.486471 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.488862 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.493186 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.510953 4962 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://650ed0852746d520fbaa4ec277717c5af3916ba72fb82349af01bdd53ed1916b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5aaa8582764286e5a930fbc4754f253b044e997b856840685dde84db1aef3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:30Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.527438 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:30Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.536860 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s8xxr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a431054f-57c5-41b7-93b2-2d2fbf9949ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15f0d587b054c3050286b2607687a4066cdd413e1623326cf355f9d93b10a2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9fz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s8xxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:30Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.550651 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wqwgj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1957ac70-30f9-48c2-a82b-72aa3b7a883a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwxzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wqwgj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:30Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.568640 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2abd2b70-bb78-49a0-b930-cd066384e803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-99b2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:30Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.578177 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-etc-openvswitch\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.578304 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-host-cni-bin\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.578348 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2abd2b70-bb78-49a0-b930-cd066384e803-ovnkube-config\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.578392 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1957ac70-30f9-48c2-a82b-72aa3b7a883a-host-var-lib-cni-bin\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.578430 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ef72d73c-d177-4436-b681-83866e1f6d12-os-release\") pod \"multus-additional-cni-plugins-7hj8w\" (UID: \"ef72d73c-d177-4436-b681-83866e1f6d12\") " pod="openshift-multus/multus-additional-cni-plugins-7hj8w" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.578462 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66n7s\" (UniqueName: \"kubernetes.io/projected/ef72d73c-d177-4436-b681-83866e1f6d12-kube-api-access-66n7s\") pod \"multus-additional-cni-plugins-7hj8w\" (UID: \"ef72d73c-d177-4436-b681-83866e1f6d12\") " pod="openshift-multus/multus-additional-cni-plugins-7hj8w" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.578487 4962 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-run-openvswitch\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.578512 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2abd2b70-bb78-49a0-b930-cd066384e803-ovn-node-metrics-cert\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.578541 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2abd2b70-bb78-49a0-b930-cd066384e803-ovnkube-script-lib\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.578580 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1957ac70-30f9-48c2-a82b-72aa3b7a883a-multus-socket-dir-parent\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.578624 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ef72d73c-d177-4436-b681-83866e1f6d12-cnibin\") pod \"multus-additional-cni-plugins-7hj8w\" (UID: \"ef72d73c-d177-4436-b681-83866e1f6d12\") " pod="openshift-multus/multus-additional-cni-plugins-7hj8w" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.578672 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-run-ovn\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.578729 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1957ac70-30f9-48c2-a82b-72aa3b7a883a-os-release\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.578758 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1957ac70-30f9-48c2-a82b-72aa3b7a883a-multus-daemon-config\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.578803 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1957ac70-30f9-48c2-a82b-72aa3b7a883a-host-run-multus-certs\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 
09:55:30.578846 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ef72d73c-d177-4436-b681-83866e1f6d12-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-7hj8w\" (UID: \"ef72d73c-d177-4436-b681-83866e1f6d12\") " pod="openshift-multus/multus-additional-cni-plugins-7hj8w" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.578934 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.578976 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-run-systemd\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.579007 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2abd2b70-bb78-49a0-b930-cd066384e803-env-overrides\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.579033 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-host-slash\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.579054 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-var-lib-openvswitch\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.579074 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1957ac70-30f9-48c2-a82b-72aa3b7a883a-host-var-lib-cni-multus\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.579094 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/751d5e0b-919c-4777-8475-ed7214f7647f-proxy-tls\") pod \"machine-config-daemon-m9d46\" (UID: \"751d5e0b-919c-4777-8475-ed7214f7647f\") " pod="openshift-machine-config-operator/machine-config-daemon-m9d46" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.579119 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-systemd-units\") pod 
\"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.579135 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85mbt\" (UniqueName: \"kubernetes.io/projected/2abd2b70-bb78-49a0-b930-cd066384e803-kube-api-access-85mbt\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.579154 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1957ac70-30f9-48c2-a82b-72aa3b7a883a-system-cni-dir\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.579193 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-host-cni-netd\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.579209 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1957ac70-30f9-48c2-a82b-72aa3b7a883a-multus-cni-dir\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.579225 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ef72d73c-d177-4436-b681-83866e1f6d12-tuning-conf-dir\") pod \"multus-additional-cni-plugins-7hj8w\" (UID: \"ef72d73c-d177-4436-b681-83866e1f6d12\") " pod="openshift-multus/multus-additional-cni-plugins-7hj8w" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.579414 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1957ac70-30f9-48c2-a82b-72aa3b7a883a-hostroot\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.579457 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzq9p\" (UniqueName: \"kubernetes.io/projected/751d5e0b-919c-4777-8475-ed7214f7647f-kube-api-access-rzq9p\") pod \"machine-config-daemon-m9d46\" (UID: \"751d5e0b-919c-4777-8475-ed7214f7647f\") " pod="openshift-machine-config-operator/machine-config-daemon-m9d46" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.579484 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ef72d73c-d177-4436-b681-83866e1f6d12-system-cni-dir\") pod \"multus-additional-cni-plugins-7hj8w\" (UID: \"ef72d73c-d177-4436-b681-83866e1f6d12\") " pod="openshift-multus/multus-additional-cni-plugins-7hj8w" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.579519 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1957ac70-30f9-48c2-a82b-72aa3b7a883a-host-var-lib-kubelet\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.579543 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1957ac70-30f9-48c2-a82b-72aa3b7a883a-cnibin\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.579560 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-log-socket\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.579654 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1957ac70-30f9-48c2-a82b-72aa3b7a883a-cni-binary-copy\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.580001 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-host-kubelet\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.580089 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-node-log\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.580218 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-host-run-ovn-kubernetes\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.580259 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwxzh\" (UniqueName: \"kubernetes.io/projected/1957ac70-30f9-48c2-a82b-72aa3b7a883a-kube-api-access-fwxzh\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.580280 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/751d5e0b-919c-4777-8475-ed7214f7647f-mcd-auth-proxy-config\") pod \"machine-config-daemon-m9d46\" (UID: \"751d5e0b-919c-4777-8475-ed7214f7647f\") " pod="openshift-machine-config-operator/machine-config-daemon-m9d46" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.580309 4962 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ef72d73c-d177-4436-b681-83866e1f6d12-cni-binary-copy\") pod \"multus-additional-cni-plugins-7hj8w\" (UID: \"ef72d73c-d177-4436-b681-83866e1f6d12\") " pod="openshift-multus/multus-additional-cni-plugins-7hj8w" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.580351 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1957ac70-30f9-48c2-a82b-72aa3b7a883a-multus-conf-dir\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.580369 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1957ac70-30f9-48c2-a82b-72aa3b7a883a-etc-kubernetes\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.580391 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1957ac70-30f9-48c2-a82b-72aa3b7a883a-host-run-k8s-cni-cncf-io\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.580415 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1957ac70-30f9-48c2-a82b-72aa3b7a883a-host-run-netns\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.580458 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-host-run-netns\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.580485 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/751d5e0b-919c-4777-8475-ed7214f7647f-rootfs\") pod \"machine-config-daemon-m9d46\" (UID: \"751d5e0b-919c-4777-8475-ed7214f7647f\") " pod="openshift-machine-config-operator/machine-config-daemon-m9d46" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.585737 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deab583c-05c7-4b7e-a3f6-c01081b17127\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://e3256d5e9aadc1c300c0388ad5b9682b874dd957f731bde6642c649b044b16ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"amed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771581313\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771581313\\\\\\\\\\\\\\\" (2026-02-20 08:55:12 +0000 UTC to 2027-02-20 08:55:12 +0000 UTC (now=2026-02-20 09:55:28.067421835 +0000 UTC))\\\\\\\"\\\\nI0220 09:55:28.067484 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0220 09:55:28.067516 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0220 09:55:28.067549 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067613 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067655 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-161790428/tls.crt::/tmp/serving-cert-161790428/tls.key\\\\\\\"\\\\nI0220 09:55:28.067781 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0220 09:55:28.067827 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067825 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067865 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0220 09:55:28.067840 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0220 09:55:28.067995 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0220 09:55:28.068006 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0220 09:55:28.068582 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:30Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.598819 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0820106bbb2993145f688614128fa51c5778e9953ee5fc00edba04ff06f3f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:30Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.611482 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:30Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.623030 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:30Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.638213 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://650ed0852746d520fbaa4ec277717c5af3916ba72fb82349af01bdd53ed1916b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5aaa8582764286e5a930fbc4754f253b044e997b856840685dde84db1aef3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMount
s\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:30Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.652159 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:30Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.663838 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s8xxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a431054f-57c5-41b7-93b2-2d2fbf9949ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15f0d587b054c3050286b2607687a4066cdd413e1623326cf355f9d93b10a2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9fz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s8xxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-20T09:55:30Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.681119 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-node-log\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.681191 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-host-run-ovn-kubernetes\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.681221 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwxzh\" (UniqueName: \"kubernetes.io/projected/1957ac70-30f9-48c2-a82b-72aa3b7a883a-kube-api-access-fwxzh\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.681247 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/751d5e0b-919c-4777-8475-ed7214f7647f-mcd-auth-proxy-config\") pod \"machine-config-daemon-m9d46\" (UID: \"751d5e0b-919c-4777-8475-ed7214f7647f\") " pod="openshift-machine-config-operator/machine-config-daemon-m9d46" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.681275 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ef72d73c-d177-4436-b681-83866e1f6d12-cni-binary-copy\") pod \"multus-additional-cni-plugins-7hj8w\" (UID: \"ef72d73c-d177-4436-b681-83866e1f6d12\") " pod="openshift-multus/multus-additional-cni-plugins-7hj8w" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.681280 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-node-log\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.681305 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1957ac70-30f9-48c2-a82b-72aa3b7a883a-etc-kubernetes\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.681332 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-host-run-ovn-kubernetes\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.681389 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1957ac70-30f9-48c2-a82b-72aa3b7a883a-multus-conf-dir\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc 
kubenswrapper[4962]: I0220 09:55:30.681351 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1957ac70-30f9-48c2-a82b-72aa3b7a883a-etc-kubernetes\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.681358 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1957ac70-30f9-48c2-a82b-72aa3b7a883a-multus-conf-dir\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.681437 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-host-run-netns\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.681460 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1957ac70-30f9-48c2-a82b-72aa3b7a883a-host-run-k8s-cni-cncf-io\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.681483 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1957ac70-30f9-48c2-a82b-72aa3b7a883a-host-run-netns\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.681518 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/751d5e0b-919c-4777-8475-ed7214f7647f-rootfs\") pod \"machine-config-daemon-m9d46\" (UID: \"751d5e0b-919c-4777-8475-ed7214f7647f\") " pod="openshift-machine-config-operator/machine-config-daemon-m9d46" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.681528 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-host-run-netns\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.681561 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-etc-openvswitch\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.681538 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-etc-openvswitch\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.681618 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/1957ac70-30f9-48c2-a82b-72aa3b7a883a-host-run-k8s-cni-cncf-io\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.681630 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-host-cni-bin\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.681650 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1957ac70-30f9-48c2-a82b-72aa3b7a883a-host-run-netns\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.681660 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2abd2b70-bb78-49a0-b930-cd066384e803-ovnkube-config\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.681682 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/751d5e0b-919c-4777-8475-ed7214f7647f-rootfs\") pod \"machine-config-daemon-m9d46\" (UID: \"751d5e0b-919c-4777-8475-ed7214f7647f\") " pod="openshift-machine-config-operator/machine-config-daemon-m9d46" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.681702 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2abd2b70-bb78-49a0-b930-cd066384e803-ovnkube-script-lib\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.681722 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-host-cni-bin\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.681728 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1957ac70-30f9-48c2-a82b-72aa3b7a883a-host-var-lib-cni-bin\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.681752 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ef72d73c-d177-4436-b681-83866e1f6d12-os-release\") pod \"multus-additional-cni-plugins-7hj8w\" (UID: \"ef72d73c-d177-4436-b681-83866e1f6d12\") " pod="openshift-multus/multus-additional-cni-plugins-7hj8w" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.681778 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66n7s\" (UniqueName: \"kubernetes.io/projected/ef72d73c-d177-4436-b681-83866e1f6d12-kube-api-access-66n7s\") pod 
\"multus-additional-cni-plugins-7hj8w\" (UID: \"ef72d73c-d177-4436-b681-83866e1f6d12\") " pod="openshift-multus/multus-additional-cni-plugins-7hj8w" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.681800 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-run-openvswitch\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.681822 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2abd2b70-bb78-49a0-b930-cd066384e803-ovn-node-metrics-cert\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.681844 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1957ac70-30f9-48c2-a82b-72aa3b7a883a-os-release\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.681867 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1957ac70-30f9-48c2-a82b-72aa3b7a883a-multus-socket-dir-parent\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.681891 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ef72d73c-d177-4436-b681-83866e1f6d12-cnibin\") pod \"multus-additional-cni-plugins-7hj8w\" (UID: \"ef72d73c-d177-4436-b681-83866e1f6d12\") " pod="openshift-multus/multus-additional-cni-plugins-7hj8w" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.681917 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-run-ovn\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.681937 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.681961 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1957ac70-30f9-48c2-a82b-72aa3b7a883a-multus-daemon-config\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.681982 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1957ac70-30f9-48c2-a82b-72aa3b7a883a-host-run-multus-certs\") pod \"multus-wqwgj\" (UID: 
\"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.682003 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ef72d73c-d177-4436-b681-83866e1f6d12-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-7hj8w\" (UID: \"ef72d73c-d177-4436-b681-83866e1f6d12\") " pod="openshift-multus/multus-additional-cni-plugins-7hj8w" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.682038 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-run-systemd\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.682060 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2abd2b70-bb78-49a0-b930-cd066384e803-env-overrides\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.682087 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-host-slash\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.682106 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-var-lib-openvswitch\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.682127 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1957ac70-30f9-48c2-a82b-72aa3b7a883a-host-var-lib-cni-multus\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.682148 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/751d5e0b-919c-4777-8475-ed7214f7647f-proxy-tls\") pod \"machine-config-daemon-m9d46\" (UID: \"751d5e0b-919c-4777-8475-ed7214f7647f\") " pod="openshift-machine-config-operator/machine-config-daemon-m9d46" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.682170 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-systemd-units\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.682190 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85mbt\" (UniqueName: \"kubernetes.io/projected/2abd2b70-bb78-49a0-b930-cd066384e803-kube-api-access-85mbt\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.682209 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1957ac70-30f9-48c2-a82b-72aa3b7a883a-system-cni-dir\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.682241 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ef72d73c-d177-4436-b681-83866e1f6d12-tuning-conf-dir\") pod \"multus-additional-cni-plugins-7hj8w\" (UID: \"ef72d73c-d177-4436-b681-83866e1f6d12\") " pod="openshift-multus/multus-additional-cni-plugins-7hj8w" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.682262 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-host-cni-netd\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.682283 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1957ac70-30f9-48c2-a82b-72aa3b7a883a-multus-cni-dir\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.682305 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1957ac70-30f9-48c2-a82b-72aa3b7a883a-hostroot\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.682327 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzq9p\" (UniqueName: \"kubernetes.io/projected/751d5e0b-919c-4777-8475-ed7214f7647f-kube-api-access-rzq9p\") pod \"machine-config-daemon-m9d46\" (UID: \"751d5e0b-919c-4777-8475-ed7214f7647f\") " pod="openshift-machine-config-operator/machine-config-daemon-m9d46" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.682351 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ef72d73c-d177-4436-b681-83866e1f6d12-system-cni-dir\") pod \"multus-additional-cni-plugins-7hj8w\" (UID: \"ef72d73c-d177-4436-b681-83866e1f6d12\") " pod="openshift-multus/multus-additional-cni-plugins-7hj8w" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.682373 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1957ac70-30f9-48c2-a82b-72aa3b7a883a-cnibin\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.682392 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1957ac70-30f9-48c2-a82b-72aa3b7a883a-host-var-lib-kubelet\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.682416 4962 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-log-socket\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.682435 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-host-kubelet\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.682454 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1957ac70-30f9-48c2-a82b-72aa3b7a883a-cni-binary-copy\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.682488 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ef72d73c-d177-4436-b681-83866e1f6d12-cni-binary-copy\") pod \"multus-additional-cni-plugins-7hj8w\" (UID: \"ef72d73c-d177-4436-b681-83866e1f6d12\") " pod="openshift-multus/multus-additional-cni-plugins-7hj8w" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.682537 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-host-slash\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.682567 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1957ac70-30f9-48c2-a82b-72aa3b7a883a-host-var-lib-cni-bin\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.682684 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/751d5e0b-919c-4777-8475-ed7214f7647f-mcd-auth-proxy-config\") pod \"machine-config-daemon-m9d46\" (UID: \"751d5e0b-919c-4777-8475-ed7214f7647f\") " pod="openshift-machine-config-operator/machine-config-daemon-m9d46" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.682736 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2abd2b70-bb78-49a0-b930-cd066384e803-ovnkube-config\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.682738 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2abd2b70-bb78-49a0-b930-cd066384e803-ovnkube-script-lib\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.682791 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-host-cni-netd\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.682795 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ef72d73c-d177-4436-b681-83866e1f6d12-os-release\") pod \"multus-additional-cni-plugins-7hj8w\" (UID: \"ef72d73c-d177-4436-b681-83866e1f6d12\") " pod="openshift-multus/multus-additional-cni-plugins-7hj8w" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.682744 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-run-ovn\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.682832 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-var-lib-openvswitch\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.682855 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1957ac70-30f9-48c2-a82b-72aa3b7a883a-multus-socket-dir-parent\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.682868 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.682875 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1957ac70-30f9-48c2-a82b-72aa3b7a883a-os-release\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.682883 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ef72d73c-d177-4436-b681-83866e1f6d12-system-cni-dir\") pod \"multus-additional-cni-plugins-7hj8w\" (UID: \"ef72d73c-d177-4436-b681-83866e1f6d12\") " pod="openshift-multus/multus-additional-cni-plugins-7hj8w" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.682908 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1957ac70-30f9-48c2-a82b-72aa3b7a883a-host-var-lib-cni-multus\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.682964 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1957ac70-30f9-48c2-a82b-72aa3b7a883a-multus-cni-dir\") pod \"multus-wqwgj\" (UID: 
\"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.682978 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-run-openvswitch\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.683029 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1957ac70-30f9-48c2-a82b-72aa3b7a883a-hostroot\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.683068 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-run-systemd\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.683087 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-systemd-units\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.683062 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:30Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.683108 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ef72d73c-d177-4436-b681-83866e1f6d12-cnibin\") pod \"multus-additional-cni-plugins-7hj8w\" (UID: \"ef72d73c-d177-4436-b681-83866e1f6d12\") " pod="openshift-multus/multus-additional-cni-plugins-7hj8w" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.683115 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1957ac70-30f9-48c2-a82b-72aa3b7a883a-host-var-lib-kubelet\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.683138 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-log-socket\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.683146 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1957ac70-30f9-48c2-a82b-72aa3b7a883a-cnibin\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.683172 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1957ac70-30f9-48c2-a82b-72aa3b7a883a-host-run-multus-certs\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.683212 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1957ac70-30f9-48c2-a82b-72aa3b7a883a-system-cni-dir\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.683253 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-host-kubelet\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 
09:55:30.683668 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2abd2b70-bb78-49a0-b930-cd066384e803-env-overrides\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.683750 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1957ac70-30f9-48c2-a82b-72aa3b7a883a-cni-binary-copy\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.683808 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ef72d73c-d177-4436-b681-83866e1f6d12-tuning-conf-dir\") pod \"multus-additional-cni-plugins-7hj8w\" (UID: \"ef72d73c-d177-4436-b681-83866e1f6d12\") " pod="openshift-multus/multus-additional-cni-plugins-7hj8w" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.684026 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1957ac70-30f9-48c2-a82b-72aa3b7a883a-multus-daemon-config\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.684258 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ef72d73c-d177-4436-b681-83866e1f6d12-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-7hj8w\" (UID: \"ef72d73c-d177-4436-b681-83866e1f6d12\") " pod="openshift-multus/multus-additional-cni-plugins-7hj8w" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.687260 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2abd2b70-bb78-49a0-b930-cd066384e803-ovn-node-metrics-cert\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.687901 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/751d5e0b-919c-4777-8475-ed7214f7647f-proxy-tls\") pod \"machine-config-daemon-m9d46\" (UID: \"751d5e0b-919c-4777-8475-ed7214f7647f\") " pod="openshift-machine-config-operator/machine-config-daemon-m9d46" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.699225 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66n7s\" (UniqueName: \"kubernetes.io/projected/ef72d73c-d177-4436-b681-83866e1f6d12-kube-api-access-66n7s\") pod \"multus-additional-cni-plugins-7hj8w\" (UID: \"ef72d73c-d177-4436-b681-83866e1f6d12\") " pod="openshift-multus/multus-additional-cni-plugins-7hj8w" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.701742 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef72d73c-d177-4436-b681-83866e1f6d12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"
name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\
\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7hj8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:30Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.702292 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzq9p\" (UniqueName: \"kubernetes.io/projected/751d5e0b-919c-4777-8475-ed7214f7647f-kube-api-access-rzq9p\") pod \"machine-config-daemon-m9d46\" (UID: \"751d5e0b-919c-4777-8475-ed7214f7647f\") " pod="openshift-machine-config-operator/machine-config-daemon-m9d46" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.708068 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwxzh\" (UniqueName: \"kubernetes.io/projected/1957ac70-30f9-48c2-a82b-72aa3b7a883a-kube-api-access-fwxzh\") pod \"multus-wqwgj\" (UID: \"1957ac70-30f9-48c2-a82b-72aa3b7a883a\") " pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.709981 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85mbt\" (UniqueName: \"kubernetes.io/projected/2abd2b70-bb78-49a0-b930-cd066384e803-kube-api-access-85mbt\") pod \"ovnkube-node-99b2s\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.714881 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"751d5e0b-919c-4777-8475-ed7214f7647f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m9d46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:30Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.762117 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-wqwgj" Feb 20 09:55:30 crc kubenswrapper[4962]: W0220 09:55:30.776564 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1957ac70_30f9_48c2_a82b_72aa3b7a883a.slice/crio-00dee5bbbf4937108f466d79c3bce6f860c5abc7f98c930473f3b307ff612067 WatchSource:0}: Error finding container 00dee5bbbf4937108f466d79c3bce6f860c5abc7f98c930473f3b307ff612067: Status 404 returned error can't find the container with id 00dee5bbbf4937108f466d79c3bce6f860c5abc7f98c930473f3b307ff612067 Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.783459 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.796482 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:30 crc kubenswrapper[4962]: W0220 09:55:30.803920 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef72d73c_d177_4436_b681_83866e1f6d12.slice/crio-909b13a1fc9afef6758fb5cfcc8fa0fbd00d1703a6c148635bd41f628e163bc9 WatchSource:0}: Error finding container 909b13a1fc9afef6758fb5cfcc8fa0fbd00d1703a6c148635bd41f628e163bc9: Status 404 returned error can't find the container with id 909b13a1fc9afef6758fb5cfcc8fa0fbd00d1703a6c148635bd41f628e163bc9 Feb 20 09:55:30 crc kubenswrapper[4962]: I0220 09:55:30.806450 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" Feb 20 09:55:30 crc kubenswrapper[4962]: W0220 09:55:30.829133 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2abd2b70_bb78_49a0_b930_cd066384e803.slice/crio-30d1769bf1e4a85341ca0d75e37166ad7a768dbf64ad246e32c8fde99616e4b7 WatchSource:0}: Error finding container 30d1769bf1e4a85341ca0d75e37166ad7a768dbf64ad246e32c8fde99616e4b7: Status 404 returned error can't find the container with id 30d1769bf1e4a85341ca0d75e37166ad7a768dbf64ad246e32c8fde99616e4b7 Feb 20 09:55:31 crc kubenswrapper[4962]: I0220 09:55:31.064786 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 12:07:52.982524225 +0000 UTC Feb 20 09:55:31 crc kubenswrapper[4962]: I0220 09:55:31.288920 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"c7065599d0f9481d5becd2057dc2e557e5fb6f4ef22533c1e431708ee711c5aa"} Feb 20 09:55:31 crc kubenswrapper[4962]: I0220 09:55:31.290027 4962 generic.go:334] "Generic (PLEG): container finished" podID="2abd2b70-bb78-49a0-b930-cd066384e803" containerID="1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe" exitCode=0 Feb 20 09:55:31 crc kubenswrapper[4962]: I0220 09:55:31.290080 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" event={"ID":"2abd2b70-bb78-49a0-b930-cd066384e803","Type":"ContainerDied","Data":"1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe"} Feb 20 09:55:31 crc kubenswrapper[4962]: I0220 09:55:31.290097 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" event={"ID":"2abd2b70-bb78-49a0-b930-cd066384e803","Type":"ContainerStarted","Data":"30d1769bf1e4a85341ca0d75e37166ad7a768dbf64ad246e32c8fde99616e4b7"} Feb 20 09:55:31 crc kubenswrapper[4962]: I0220 09:55:31.293197 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" event={"ID":"751d5e0b-919c-4777-8475-ed7214f7647f","Type":"ContainerStarted","Data":"3b6a54fc4a37387541acda22f8ca81ddfe54e7a9ea682a31abf5e1ba2787f2b0"} Feb 20 09:55:31 crc kubenswrapper[4962]: I0220 09:55:31.293245 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" event={"ID":"751d5e0b-919c-4777-8475-ed7214f7647f","Type":"ContainerStarted","Data":"dffe20e45069ebd85e9ad49e365ac5180c1e43d02340190615a38900f527e432"} Feb 20 09:55:31 crc kubenswrapper[4962]: I0220 
09:55:31.293259 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" event={"ID":"751d5e0b-919c-4777-8475-ed7214f7647f","Type":"ContainerStarted","Data":"46cb7bc029282997298cb9150b87b2ce8241d6d6c942b7c31acc89474cb54917"} Feb 20 09:55:31 crc kubenswrapper[4962]: I0220 09:55:31.295317 4962 generic.go:334] "Generic (PLEG): container finished" podID="ef72d73c-d177-4436-b681-83866e1f6d12" containerID="7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966" exitCode=0 Feb 20 09:55:31 crc kubenswrapper[4962]: I0220 09:55:31.295396 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" event={"ID":"ef72d73c-d177-4436-b681-83866e1f6d12","Type":"ContainerDied","Data":"7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966"} Feb 20 09:55:31 crc kubenswrapper[4962]: I0220 09:55:31.295437 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" event={"ID":"ef72d73c-d177-4436-b681-83866e1f6d12","Type":"ContainerStarted","Data":"909b13a1fc9afef6758fb5cfcc8fa0fbd00d1703a6c148635bd41f628e163bc9"} Feb 20 09:55:31 crc kubenswrapper[4962]: I0220 09:55:31.296939 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wqwgj" event={"ID":"1957ac70-30f9-48c2-a82b-72aa3b7a883a","Type":"ContainerStarted","Data":"e4f1f32eda1cc801a5b1f6d84207120c48c1bcca494a88bdfa95fb05bf82f661"} Feb 20 09:55:31 crc kubenswrapper[4962]: I0220 09:55:31.296981 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wqwgj" event={"ID":"1957ac70-30f9-48c2-a82b-72aa3b7a883a","Type":"ContainerStarted","Data":"00dee5bbbf4937108f466d79c3bce6f860c5abc7f98c930473f3b307ff612067"} Feb 20 09:55:31 crc kubenswrapper[4962]: I0220 09:55:31.307336 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:31Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:31 crc kubenswrapper[4962]: I0220 09:55:31.321057 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7065599d0f9481d5becd2057dc2e557e5fb6f4ef22533c1e431708ee711c5aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:31Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:31 crc kubenswrapper[4962]: I0220 09:55:31.337360 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://650ed0852746d520fbaa4ec277717c5af3916ba72fb82349af01bdd53ed1916b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5aaa8582764286e5a930fbc4754f253b044e997b856840685dde84db1aef3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:31Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:31 crc kubenswrapper[4962]: I0220 09:55:31.355891 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wqwgj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1957ac70-30f9-48c2-a82b-72aa3b7a883a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwxzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wqwgj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:31Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:31 crc kubenswrapper[4962]: I0220 09:55:31.386936 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2abd2b70-bb78-49a0-b930-cd066384e803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-99b2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:31Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:31 crc kubenswrapper[4962]: I0220 09:55:31.406877 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"deab583c-05c7-4b7e-a3f6-c01081b17127\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3256d5e9aadc1c300c0388ad5b9682b874dd957f731bde6642c649b044b16ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"amed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771581313\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771581313\\\\\\\\\\\\\\\" (2026-02-20 08:55:12 +0000 UTC to 2027-02-20 08:55:12 +0000 UTC (now=2026-02-20 09:55:28.067421835 +0000 UTC))\\\\\\\"\\\\nI0220 09:55:28.067484 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0220 09:55:28.067516 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0220 09:55:28.067549 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067613 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067655 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-161790428/tls.crt::/tmp/serving-cert-161790428/tls.key\\\\\\\"\\\\nI0220 09:55:28.067781 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0220 09:55:28.067827 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067825 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067865 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0220 09:55:28.067840 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0220 09:55:28.067995 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0220 09:55:28.068006 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0220 09:55:28.068582 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:31Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:31 crc kubenswrapper[4962]: I0220 09:55:31.420354 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0820106bbb2993145f688614128fa51c5778e9953ee5fc00edba04ff06f3f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:31Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:31 crc kubenswrapper[4962]: I0220 09:55:31.438165 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:31Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:31 crc kubenswrapper[4962]: I0220 09:55:31.450662 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s8xxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a431054f-57c5-41b7-93b2-2d2fbf9949ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15f0d587b054c3050286b2607687a4066cdd413e1623326cf355f9d93b10a2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9fz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s8xxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-02-20T09:55:31Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:31 crc kubenswrapper[4962]: I0220 09:55:31.471168 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef72d73c-d177-4436-b681-83866e1f6d12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\
\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",
\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7hj8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:31Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:31 crc kubenswrapper[4962]: I0220 09:55:31.483498 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"751d5e0b-919c-4777-8475-ed7214f7647f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m9d46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:31Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:31 crc kubenswrapper[4962]: I0220 09:55:31.499395 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:31Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:31 crc kubenswrapper[4962]: I0220 09:55:31.524406 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2abd2b70-bb78-49a0-b930-cd066384e803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-99b2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:31Z 
is after 2025-08-24T17:21:41Z" Feb 20 09:55:31 crc kubenswrapper[4962]: I0220 09:55:31.539979 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deab583c-05c7-4b7e-a3f6-c01081b17127\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3256d5e9aadc1c300c0388ad5b9682b874dd957f731bde6642c649b044b16ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"amed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771581313\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771581313\\\\\\\\\\\\\\\" (2026-02-20 08:55:12 +0000 UTC to 2027-02-20 08:55:12 +0000 UTC (now=2026-02-20 09:55:28.067421835 +0000 UTC))\\\\\\\"\\\\nI0220 09:55:28.067484 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0220 09:55:28.067516 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0220 09:55:28.067549 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067613 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067655 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-161790428/tls.crt::/tmp/serving-cert-161790428/tls.key\\\\\\\"\\\\nI0220 09:55:28.067781 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0220 09:55:28.067827 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067825 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067865 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0220 09:55:28.067840 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0220 09:55:28.067995 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0220 09:55:28.068006 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0220 09:55:28.068582 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:31Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:31 crc kubenswrapper[4962]: I0220 09:55:31.558906 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0820106bbb2993145f688614128fa51c5778e9953ee5fc00edba04ff06f3f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:31Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:31 crc kubenswrapper[4962]: I0220 09:55:31.573241 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:31Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:31 crc kubenswrapper[4962]: I0220 09:55:31.585373 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7065599d0f9481d5becd2057dc2e557e5fb6f4ef22533c1e431708ee711c5aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:31Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:31 crc kubenswrapper[4962]: I0220 09:55:31.599039 4962 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://650ed0852746d520fbaa4ec277717c5af3916ba72fb82349af01bdd53ed1916b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5aaa8582764286e5a930fbc4754f253b044e997b856840685dde84db1aef3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:31Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:31 crc kubenswrapper[4962]: I0220 09:55:31.615301 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wqwgj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1957ac70-30f9-48c2-a82b-72aa3b7a883a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4f1f32eda1cc801a5b1f6d84207120c48c1bcca494a88bdfa95fb05bf82f661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwxzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wqwgj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:31Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:31 crc kubenswrapper[4962]: I0220 09:55:31.630759 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:31Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:31 crc kubenswrapper[4962]: I0220 09:55:31.643943 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s8xxr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a431054f-57c5-41b7-93b2-2d2fbf9949ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15f0d587b054c3050286b2607687a4066cdd413e1623326cf355f9d93b10a2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9fz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s8xxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:31Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:31 crc kubenswrapper[4962]: I0220 09:55:31.658627 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:31Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:31 crc kubenswrapper[4962]: I0220 09:55:31.677096 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef72d73c-d177-4436-b681-83866e1f6d12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reaso
n\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7hj8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:31Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:31 crc 
kubenswrapper[4962]: I0220 09:55:31.688918 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"751d5e0b-919c-4777-8475-ed7214f7647f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6a54fc4a37387541acda22f8ca81ddfe54e7a9ea682a31abf5e1ba2787f2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dffe20e45069ebd85e9ad49e365ac5180c1e43d02340190615a38900f527e432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m9d46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2026-02-20T09:55:31Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:31 crc kubenswrapper[4962]: I0220 09:55:31.789505 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:55:31 crc kubenswrapper[4962]: I0220 09:55:31.789656 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:55:31 crc kubenswrapper[4962]: E0220 09:55:31.789669 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:55:35.789648239 +0000 UTC m=+27.372120095 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:55:31 crc kubenswrapper[4962]: I0220 09:55:31.789698 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:55:31 crc kubenswrapper[4962]: I0220 09:55:31.789741 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:55:31 crc kubenswrapper[4962]: I0220 09:55:31.789773 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:55:31 crc kubenswrapper[4962]: E0220 09:55:31.789781 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 20 09:55:31 crc kubenswrapper[4962]: E0220 09:55:31.789823 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not 
registered Feb 20 09:55:31 crc kubenswrapper[4962]: E0220 09:55:31.789837 4962 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 09:55:31 crc kubenswrapper[4962]: E0220 09:55:31.789864 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 20 09:55:31 crc kubenswrapper[4962]: E0220 09:55:31.789877 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 20 09:55:31 crc kubenswrapper[4962]: E0220 09:55:31.789887 4962 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 09:55:31 crc kubenswrapper[4962]: E0220 09:55:31.789891 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-20 09:55:35.789880896 +0000 UTC m=+27.372352752 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 09:55:31 crc kubenswrapper[4962]: E0220 09:55:31.789914 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-20 09:55:35.789905367 +0000 UTC m=+27.372377213 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 09:55:31 crc kubenswrapper[4962]: E0220 09:55:31.789939 4962 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 20 09:55:31 crc kubenswrapper[4962]: E0220 09:55:31.790024 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-20 09:55:35.79000595 +0000 UTC m=+27.372477796 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 20 09:55:31 crc kubenswrapper[4962]: E0220 09:55:31.790081 4962 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 20 09:55:31 crc kubenswrapper[4962]: E0220 09:55:31.790113 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-20 09:55:35.790106173 +0000 UTC m=+27.372578089 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.065296 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 13:46:41.710907399 +0000 UTC Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.138653 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.138671 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.138837 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:55:32 crc kubenswrapper[4962]: E0220 09:55:32.138886 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:55:32 crc kubenswrapper[4962]: E0220 09:55:32.138983 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:55:32 crc kubenswrapper[4962]: E0220 09:55:32.139051 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.305853 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" event={"ID":"2abd2b70-bb78-49a0-b930-cd066384e803","Type":"ContainerStarted","Data":"9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f"} Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.305907 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" event={"ID":"2abd2b70-bb78-49a0-b930-cd066384e803","Type":"ContainerStarted","Data":"03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f"} Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.305920 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" event={"ID":"2abd2b70-bb78-49a0-b930-cd066384e803","Type":"ContainerStarted","Data":"36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a"} Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.305931 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" event={"ID":"2abd2b70-bb78-49a0-b930-cd066384e803","Type":"ContainerStarted","Data":"2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117"} Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.308034 4962 generic.go:334] "Generic (PLEG): container finished" podID="ef72d73c-d177-4436-b681-83866e1f6d12" containerID="d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5" exitCode=0 Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.308141 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" event={"ID":"ef72d73c-d177-4436-b681-83866e1f6d12","Type":"ContainerDied","Data":"d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5"} Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.329887 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:32Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.355751 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s8xxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a431054f-57c5-41b7-93b2-2d2fbf9949ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15f0d587b054c3050286b2607687a4066cdd413e1623326cf355f9d93b10a2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9fz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s8xxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-20T09:55:32Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.372670 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef72d73c-d177-4436-b681-83866e1f6d12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secr
ets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7hj8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:32Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.390539 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"751d5e0b-919c-4777-8475-ed7214f7647f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6a54fc4a37387541acda22f8ca81ddfe54e7a9ea682a31abf5e1ba2787f2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dffe20e45069ebd85e9ad49e365ac5180c1e43d02340190615a38900f527e432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m9d46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:32Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.412366 4962 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:32Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.430300 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:32Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.432053 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.448392 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7065599d0f9481d5becd2057dc2e557e5fb6f4ef22533c1e431708ee711c5aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:32Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.448452 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.448702 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.466358 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://650ed0852746d520fbaa4ec277717c5af3916ba72fb82349af01bdd53ed1916b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5aaa8582764286e5a930fbc4754f253b044e997b856840685dde84db1aef3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:32Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.523203 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wqwgj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1957ac70-30f9-48c2-a82b-72aa3b7a883a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4f1f32eda1cc801a5b1f6d84207120c48c1bcca494a88bdfa95fb05bf82f661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwxzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wqwgj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:32Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.563846 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2abd2b70-bb78-49a0-b930-cd066384e803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\"
,\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-99b2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:32Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.585569 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deab583c-05c7-4b7e-a3f6-c01081b17127\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3256d5e9aadc1c300c0388ad5b9682b874dd957f731bde6642c649b044b16ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"amed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed 
loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771581313\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771581313\\\\\\\\\\\\\\\" (2026-02-20 08:55:12 +0000 UTC to 2027-02-20 08:55:12 +0000 UTC (now=2026-02-20 09:55:28.067421835 +0000 UTC))\\\\\\\"\\\\nI0220 09:55:28.067484 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0220 09:55:28.067516 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0220 09:55:28.067549 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067613 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067655 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-161790428/tls.crt::/tmp/serving-cert-161790428/tls.key\\\\\\\"\\\\nI0220 09:55:28.067781 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0220 09:55:28.067827 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067825 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067865 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0220 09:55:28.067840 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0220 09:55:28.067995 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0220 09:55:28.068006 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0220 09:55:28.068582 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:32Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.603510 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0820106bbb2993145f688614128fa51c5778e9953ee5fc00edba04ff06f3f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:32Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.625494 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2abd2b70-bb78-49a0-b930-cd066384e803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-99b2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:32Z 
is after 2025-08-24T17:21:41Z" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.646375 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63edef29-013b-4e23-bd29-1e62958c425a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652942c0482deea28d2e07207dcdf38e3f5638a0eea5bb58b73671e39410b99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89d70ad241c203849bc1b2aa38409e9d5a16e13831e18a4f06bb2e5012b84910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a24ab72e4a25f2b29927eeb2552b6b091c3384a85ecedea7a4de9df19253ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"
/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9768dcc00aee095460ac27438e2ee85e824a9b21eee8c2fa137cd7de5de792ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55affbe6d7d089b636f77381908573fe22fb7a903776e26db373f50523280cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\"
:\\\"2026-02-20T09:55:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:32Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.663287 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deab583c-05c7-4b7e-a3f6-c01081b17127\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3256d5e9aadc1c300c0388ad5b9682b874dd957f731bde6642c649b044b16ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"amed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed 
loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771581313\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771581313\\\\\\\\\\\\\\\" (2026-02-20 08:55:12 +0000 UTC to 2027-02-20 08:55:12 +0000 UTC (now=2026-02-20 09:55:28.067421835 +0000 UTC))\\\\\\\"\\\\nI0220 09:55:28.067484 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0220 09:55:28.067516 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0220 09:55:28.067549 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067613 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067655 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-161790428/tls.crt::/tmp/serving-cert-161790428/tls.key\\\\\\\"\\\\nI0220 09:55:28.067781 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0220 09:55:28.067827 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067825 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067865 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0220 09:55:28.067840 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0220 09:55:28.067995 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0220 09:55:28.068006 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0220 09:55:28.068582 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:32Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.677462 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0820106bbb2993145f688614128fa51c5778e9953ee5fc00edba04ff06f3f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:32Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.693268 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:32Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.705936 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7065599d0f9481d5becd2057dc2e557e5fb6f4ef22533c1e431708ee711c5aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:32Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.706084 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-hxb97"] Feb 
20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.706814 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-hxb97" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.708669 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.709109 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.709202 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.709383 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.720559 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://650ed0852746d520fbaa4ec277717c5af3916ba72fb82349af01bdd53ed1916b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5aaa8582764286e5a930fbc4754f253b044e997b856840685dde84db1aef3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var
/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:32Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.734309 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wqwgj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1957ac70-30f9-48c2-a82b-72aa3b7a883a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4f1f32eda1cc801a5b1f6d84207120c48c1bcca494a88bdfa95fb05bf82f661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\
\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwxzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wqwgj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:32Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.746011 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:32Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.755861 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s8xxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a431054f-57c5-41b7-93b2-2d2fbf9949ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15f0d587b054c3050286b2607687a4066cdd413e1623326cf355f9d93b10a2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9fz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s8xxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-20T09:55:32Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.768585 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:32Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.784723 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef72d73c-d177-4436-b681-83866e1f6d12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":
{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7hj8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:32Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.795557 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"751d5e0b-919c-4777-8475-ed7214f7647f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6a54fc4a37387541acda22f8ca81ddfe54e7a9ea682a31abf5e1ba2787f2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dffe20e45069ebd85e9ad49e365ac5180c1e43d02340190615a38900f527e432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m9d46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:32Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.811633 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7065599d0f9481d5becd2057dc2e557e5fb6f4ef22533c1e431708ee711c5aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:32Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.828215 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://650ed0852746d520fbaa4ec277717c5af3916ba72fb82349af01bdd53ed1916b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5aaa8582764286e5a930fbc4754f253b044e997b856840685dde84db1aef3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:32Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.842391 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wqwgj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1957ac70-30f9-48c2-a82b-72aa3b7a883a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4f1f32eda1cc801a5b1f6d84207120c48c1bcca494a88bdfa95fb05bf82f661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwxzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wqwgj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:32Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.866326 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2abd2b70-bb78-49a0-b930-cd066384e803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\"
,\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-99b2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:32Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.887896 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"63edef29-013b-4e23-bd29-1e62958c425a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652942c0482deea28d2e07207dcdf38e3f5638a0eea5bb58b73671e39410b99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89d70ad241c203849bc1b2aa38409e9d5a16e13831e18a4f06bb2e5012b84910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a24ab72e4a25f2b29927eeb2552b6b091c3384a85ecedea7a4de9df19253ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9768dcc00aee095460ac27438e2ee85e824a9b2
1eee8c2fa137cd7de5de792ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55affbe6d7d089b636f77381908573fe22fb7a903776e26db373f50523280cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:32Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.902499 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4f0e53ce-e004-473e-be85-ef4c83e399c7-host\") pod \"node-ca-hxb97\" (UID: \"4f0e53ce-e004-473e-be85-ef4c83e399c7\") " pod="openshift-image-registry/node-ca-hxb97" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.902559 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4f0e53ce-e004-473e-be85-ef4c83e399c7-serviceca\") pod \"node-ca-hxb97\" (UID: \"4f0e53ce-e004-473e-be85-ef4c83e399c7\") " pod="openshift-image-registry/node-ca-hxb97" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.902610 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c27br\" (UniqueName: \"kubernetes.io/projected/4f0e53ce-e004-473e-be85-ef4c83e399c7-kube-api-access-c27br\") pod \"node-ca-hxb97\" (UID: \"4f0e53ce-e004-473e-be85-ef4c83e399c7\") " pod="openshift-image-registry/node-ca-hxb97" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.903092 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deab583c-05c7-4b7e-a3f6-c01081b17127\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://e3256d5e9aadc1c300c0388ad5b9682b874dd957f731bde6642c649b044b16ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"amed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771581313\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771581313\\\\\\\\\\\\\\\" (2026-02-20 08:55:12 +0000 UTC to 2027-02-20 08:55:12 +0000 UTC (now=2026-02-20 09:55:28.067421835 +0000 UTC))\\\\\\\"\\\\nI0220 09:55:28.067484 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0220 09:55:28.067516 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0220 09:55:28.067549 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067613 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067655 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-161790428/tls.crt::/tmp/serving-cert-161790428/tls.key\\\\\\\"\\\\nI0220 09:55:28.067781 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0220 09:55:28.067827 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067825 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067865 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0220 09:55:28.067840 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0220 09:55:28.067995 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0220 09:55:28.068006 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0220 09:55:28.068582 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:32Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.923571 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0820106bbb2993145f688614128fa51c5778e9953ee5fc00edba04ff06f3f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:32Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.937656 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:32Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.955845 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:32Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.967040 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s8xxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a431054f-57c5-41b7-93b2-2d2fbf9949ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15f0d587b054c3050286b2607687a4066cdd413e1623326cf355f9d93b10a2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9fz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s8xxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-20T09:55:32Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.977372 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hxb97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0e53ce-e004-473e-be85-ef4c83e399c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c27br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hxb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:32Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:32 crc kubenswrapper[4962]: I0220 09:55:32.994069 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"751d5e0b-919c-4777-8475-ed7214f7647f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6a54fc4a37387541acda22f8ca81ddfe54e7a9ea682a31abf5e1ba2787f2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dffe20e45069ebd85e9ad49e365ac5180c1e43d02340190615a38900f527e432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m9d46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:32Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:33 crc kubenswrapper[4962]: I0220 09:55:33.004118 4962 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-c27br\" (UniqueName: \"kubernetes.io/projected/4f0e53ce-e004-473e-be85-ef4c83e399c7-kube-api-access-c27br\") pod \"node-ca-hxb97\" (UID: \"4f0e53ce-e004-473e-be85-ef4c83e399c7\") " pod="openshift-image-registry/node-ca-hxb97" Feb 20 09:55:33 crc kubenswrapper[4962]: I0220 09:55:33.004199 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4f0e53ce-e004-473e-be85-ef4c83e399c7-host\") pod \"node-ca-hxb97\" (UID: \"4f0e53ce-e004-473e-be85-ef4c83e399c7\") " pod="openshift-image-registry/node-ca-hxb97" Feb 20 09:55:33 crc kubenswrapper[4962]: I0220 09:55:33.004241 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4f0e53ce-e004-473e-be85-ef4c83e399c7-serviceca\") pod \"node-ca-hxb97\" (UID: \"4f0e53ce-e004-473e-be85-ef4c83e399c7\") " pod="openshift-image-registry/node-ca-hxb97" Feb 20 09:55:33 crc kubenswrapper[4962]: I0220 09:55:33.004379 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4f0e53ce-e004-473e-be85-ef4c83e399c7-host\") pod \"node-ca-hxb97\" (UID: \"4f0e53ce-e004-473e-be85-ef4c83e399c7\") " pod="openshift-image-registry/node-ca-hxb97" Feb 20 09:55:33 crc kubenswrapper[4962]: I0220 09:55:33.005341 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4f0e53ce-e004-473e-be85-ef4c83e399c7-serviceca\") pod \"node-ca-hxb97\" (UID: \"4f0e53ce-e004-473e-be85-ef4c83e399c7\") " pod="openshift-image-registry/node-ca-hxb97" Feb 20 09:55:33 crc kubenswrapper[4962]: I0220 09:55:33.009136 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:33Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:33 crc kubenswrapper[4962]: I0220 09:55:33.024068 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef72d73c-d177-4436-b681-83866e1f6d12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-
20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7hj8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:33Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:33 crc kubenswrapper[4962]: I0220 09:55:33.024214 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c27br\" (UniqueName: \"kubernetes.io/projected/4f0e53ce-e004-473e-be85-ef4c83e399c7-kube-api-access-c27br\") pod \"node-ca-hxb97\" (UID: \"4f0e53ce-e004-473e-be85-ef4c83e399c7\") " pod="openshift-image-registry/node-ca-hxb97" Feb 20 09:55:33 crc kubenswrapper[4962]: I0220 09:55:33.066285 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 19:32:12.387745276 +0000 UTC Feb 20 09:55:33 crc kubenswrapper[4962]: I0220 09:55:33.315563 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" event={"ID":"2abd2b70-bb78-49a0-b930-cd066384e803","Type":"ContainerStarted","Data":"50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378"} Feb 20 09:55:33 crc kubenswrapper[4962]: I0220 09:55:33.315673 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" event={"ID":"2abd2b70-bb78-49a0-b930-cd066384e803","Type":"ContainerStarted","Data":"582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584"} Feb 20 09:55:33 crc kubenswrapper[4962]: I0220 09:55:33.318224 4962 generic.go:334] "Generic (PLEG): container finished" podID="ef72d73c-d177-4436-b681-83866e1f6d12" containerID="06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f" exitCode=0 Feb 20 09:55:33 crc kubenswrapper[4962]: I0220 09:55:33.318300 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" event={"ID":"ef72d73c-d177-4436-b681-83866e1f6d12","Type":"ContainerDied","Data":"06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f"} Feb 20 09:55:33 crc kubenswrapper[4962]: I0220 09:55:33.320024 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-hxb97" Feb 20 09:55:33 crc kubenswrapper[4962]: I0220 09:55:33.335768 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:33Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:33 crc kubenswrapper[4962]: I0220 09:55:33.351842 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s8xxr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a431054f-57c5-41b7-93b2-2d2fbf9949ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15f0d587b054c3050286b2607687a4066cdd413e1623326cf355f9d93b10a2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9fz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s8xxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:33Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:33 crc kubenswrapper[4962]: I0220 09:55:33.361464 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hxb97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0e53ce-e004-473e-be85-ef4c83e399c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c27br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hxb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:33Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:33 crc kubenswrapper[4962]: I0220 09:55:33.377729 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:33Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:33 crc kubenswrapper[4962]: I0220 09:55:33.399371 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef72d73c-d177-4436-b681-83866e1f6d12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/
cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7hj8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:33Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:33 crc kubenswrapper[4962]: I0220 09:55:33.414195 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"751d5e0b-919c-4777-8475-ed7214f7647f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6a54fc4a37387541acda22f8ca81ddfe54e7a9ea682a31abf5e1ba2787f2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dffe20e45069ebd85e9ad49e365ac5180c1e43d02340190615a38900f527e432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
6-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m9d46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:33Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:33 crc kubenswrapper[4962]: I0220 09:55:33.429352 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wqwgj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1957ac70-30f9-48c2-a82b-72aa3b7a883a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4f1f32eda1cc801a5b1f6d84207120c48c1bcca494a88bdfa95fb05bf82f661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus
\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwxzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wqwgj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:33Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:33 crc kubenswrapper[4962]: I0220 09:55:33.450960 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2abd2b70-bb78-49a0-b930-cd066384e803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-99b2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:33Z 
is after 2025-08-24T17:21:41Z" Feb 20 09:55:33 crc kubenswrapper[4962]: I0220 09:55:33.474355 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63edef29-013b-4e23-bd29-1e62958c425a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652942c0482deea28d2e07207dcdf38e3f5638a0eea5bb58b73671e39410b99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89d70ad241c203849bc1b2aa38409e9d5a16e13831e18a4f06bb2e5012b84910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a24ab72e4a25f2b29927eeb2552b6b091c3384a85ecedea7a4de9df19253ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"
/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9768dcc00aee095460ac27438e2ee85e824a9b21eee8c2fa137cd7de5de792ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55affbe6d7d089b636f77381908573fe22fb7a903776e26db373f50523280cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\"
:\\\"2026-02-20T09:55:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:33Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:33 crc kubenswrapper[4962]: I0220 09:55:33.491998 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deab583c-05c7-4b7e-a3f6-c01081b17127\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3256d5e9aadc1c300c0388ad5b9682b874dd957f731bde6642c649b044b16ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"amed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed 
loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771581313\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771581313\\\\\\\\\\\\\\\" (2026-02-20 08:55:12 +0000 UTC to 2027-02-20 08:55:12 +0000 UTC (now=2026-02-20 09:55:28.067421835 +0000 UTC))\\\\\\\"\\\\nI0220 09:55:28.067484 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0220 09:55:28.067516 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0220 09:55:28.067549 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067613 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067655 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-161790428/tls.crt::/tmp/serving-cert-161790428/tls.key\\\\\\\"\\\\nI0220 09:55:28.067781 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0220 09:55:28.067827 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067825 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067865 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0220 09:55:28.067840 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0220 09:55:28.067995 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0220 09:55:28.068006 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0220 09:55:28.068582 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:33Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:33 crc kubenswrapper[4962]: I0220 09:55:33.504524 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0820106bbb2993145f688614128fa51c5778e9953ee5fc00edba04ff06f3f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:33Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:33 crc kubenswrapper[4962]: I0220 09:55:33.516887 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:33Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:33 crc kubenswrapper[4962]: I0220 09:55:33.529735 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7065599d0f9481d5becd2057dc2e557e5fb6f4ef22533c1e431708ee711c5aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:33Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:33 crc kubenswrapper[4962]: I0220 09:55:33.543962 4962 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://650ed0852746d520fbaa4ec277717c5af3916ba72fb82349af01bdd53ed1916b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5aaa8582764286e5a930fbc4754f253b044e997b856840685dde84db1aef3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:33Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:33 crc kubenswrapper[4962]: I0220 09:55:33.595889 4962 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.067183 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-26 
21:14:40.57364124 +0000 UTC Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.137922 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:55:34 crc kubenswrapper[4962]: E0220 09:55:34.138080 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.138144 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.138290 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:55:34 crc kubenswrapper[4962]: E0220 09:55:34.138378 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:55:34 crc kubenswrapper[4962]: E0220 09:55:34.138619 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.323067 4962 generic.go:334] "Generic (PLEG): container finished" podID="ef72d73c-d177-4436-b681-83866e1f6d12" containerID="5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211" exitCode=0 Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.323167 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" event={"ID":"ef72d73c-d177-4436-b681-83866e1f6d12","Type":"ContainerDied","Data":"5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211"} Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.324149 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-hxb97" event={"ID":"4f0e53ce-e004-473e-be85-ef4c83e399c7","Type":"ContainerStarted","Data":"e690a224056186e38fa1d7f83d5380f36a05bfff839ca53a87701a721c0c7c0c"} Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.324193 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-hxb97" event={"ID":"4f0e53ce-e004-473e-be85-ef4c83e399c7","Type":"ContainerStarted","Data":"9c3e86a25b4a96a9c1728f4ceba5d23082e5c15f8fc65a3b8b77a765b4d7d893"} Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.344739 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deab583c-05c7-4b7e-a3f6-c01081b17127\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3256d5e9aadc1c300c0388ad5b9682b874dd957f731bde6642c649b044b16ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"amed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed 
loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771581313\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771581313\\\\\\\\\\\\\\\" (2026-02-20 08:55:12 +0000 UTC to 2027-02-20 08:55:12 +0000 UTC (now=2026-02-20 09:55:28.067421835 +0000 UTC))\\\\\\\"\\\\nI0220 09:55:28.067484 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0220 09:55:28.067516 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0220 09:55:28.067549 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067613 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067655 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-161790428/tls.crt::/tmp/serving-cert-161790428/tls.key\\\\\\\"\\\\nI0220 09:55:28.067781 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0220 09:55:28.067827 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067825 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067865 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0220 09:55:28.067840 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0220 09:55:28.067995 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0220 09:55:28.068006 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0220 09:55:28.068582 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:34Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.365007 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0820106bbb2993145f688614128fa51c5778e9953ee5fc00edba04ff06f3f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:34Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.378665 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:34Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.389209 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7065599d0f9481d5becd2057dc2e557e5fb6f4ef22533c1e431708ee711c5aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:34Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.401379 4962 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://650ed0852746d520fbaa4ec277717c5af3916ba72fb82349af01bdd53ed1916b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5aaa8582764286e5a930fbc4754f253b044e997b856840685dde84db1aef3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:34Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.414138 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wqwgj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1957ac70-30f9-48c2-a82b-72aa3b7a883a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4f1f32eda1cc801a5b1f6d84207120c48c1bcca494a88bdfa95fb05bf82f661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwxzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wqwgj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:34Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.432500 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2abd2b70-bb78-49a0-b930-cd066384e803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\"
,\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-99b2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:34Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.436688 4962 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.441953 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.441995 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.442007 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.442345 4962 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.448938 4962 kubelet_node_status.go:115] "Node was previously registered" node="crc" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.449267 4962 kubelet_node_status.go:79] "Successfully registered node" node="crc" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.451803 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.451849 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.451864 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.451886 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:34 crc kubenswrapper[4962]: 
I0220 09:55:34.451899 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:34Z","lastTransitionTime":"2026-02-20T09:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.452213 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63edef29-013b-4e23-bd29-1e62958c425a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652942c0482deea28d2e07207dcdf38e3f5638a0eea5bb58b73671e39410b99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89d70ad241c203849bc1b2aa38409e9d5a16e13831e18a4f06bb2e5012b84910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a24ab72e4a25f2b29927eeb2552b6b091c3384a85ecedea7a4de9df19253ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b2
6702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9768dcc00aee095460ac27438e2ee85e824a9b21eee8c2fa137cd7de5de792ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55affbe6d7d089b636f77381908573fe22fb7a903776e26db373f50523280cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:34Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.465222 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s8xxr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a431054f-57c5-41b7-93b2-2d2fbf9949ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15f0d587b054c3050286b2607687a4066cdd413e1623326cf355f9d93b10a2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9fz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s8xxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:34Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:34 crc kubenswrapper[4962]: E0220 09:55:34.470867 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeByt
es\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3742e1d0-bedb-4f62-b44c-22d7df4a090f\\\",\\\"systemUUID\\\":\\\"0
de0f937-0896-4e78-90b7-d2d7a1bed2a9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:34Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.475378 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.475422 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.475434 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.475452 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.475462 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:34Z","lastTransitionTime":"2026-02-20T09:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.477125 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hxb97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0e53ce-e004-473e-be85-ef4c83e399c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c27br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hxb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:34Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.489202 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:34Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:34 crc kubenswrapper[4962]: E0220 09:55:34.494027 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3742e1d0-bedb-4f62-b44c-22d7df4a090f\\\",\\\"systemUUID\\\":\\\"0de0f937-0896-4e78-90b7-d2d7a1bed2a9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:34Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.499473 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.499530 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.499546 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.499569 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.499582 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:34Z","lastTransitionTime":"2026-02-20T09:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.505294 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:34Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:34 crc kubenswrapper[4962]: E0220 09:55:34.516391 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3742e1d0-bedb-4f62-b44c-22d7df4a090f\\\",\\\"systemUUID\\\":\\\"0de0f937-0896-4e78-90b7-d2d7a1bed2a9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:34Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.521262 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.521291 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.521301 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.521317 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.521327 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:34Z","lastTransitionTime":"2026-02-20T09:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.523533 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef72d73c-d177-4436-b681-83866e1f6d12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disa
bled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7hj8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:34Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:34 crc kubenswrapper[4962]: E0220 09:55:34.542050 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3742e1d0-bedb-4f62-b44c-22d7df4a090f\\\",\\\"systemUUID\\\":\\\"0de0f937-0896-4e78-90b7-d2d7a1bed2a9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:34Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.543112 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"751d5e0b-919c-4777-8475-ed7214f7647f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6a54fc4a37387541acda22f8ca81ddfe54e7a9ea682a31abf5e1ba2787f2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dffe20e45069ebd85e9ad49e365ac5180c1e43d02340190615a38900f527e432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m9d46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:34Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.545957 4962 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.545984 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.545992 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.546007 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.546019 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:34Z","lastTransitionTime":"2026-02-20T09:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.554946 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.558215 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:34Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:34 crc kubenswrapper[4962]: E0220 09:55:34.558352 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3742e1d0-bedb-4f62-b44c-22d7df4a090f\\\",\\\"systemUUID\\\":\\\"0de0f937-0896-4e78-90b7-d2d7a1bed2a9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:34Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:34 crc kubenswrapper[4962]: E0220 09:55:34.558495 4962 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.560519 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.560779 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.560807 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.560822 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.560843 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.560860 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:34Z","lastTransitionTime":"2026-02-20T09:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.566293 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.573399 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef72d73c-d177-4436-b681-83866e1f6d12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disa
bled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7hj8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:34Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.584687 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"751d5e0b-919c-4777-8475-ed7214f7647f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6a54fc4a37387541acda22f8ca81ddfe54e7a9ea682a31abf5e1ba2787f2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dffe20e45069ebd85e9ad49e365ac5180c1e43d02340190615a38900f527e432\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m9d46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:34Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.604573 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63edef29-013b-4e23-bd29-1e62958c425a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652942c0482deea28d2e07207dcdf38e3f5638a0eea5bb58b73671e39410b99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89d70ad241c203849bc1b2aa38409e9d5a16e13831e18a4f06bb2e5012b84910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a24ab72e4a25f2b29927eeb2552b6b091c3384a85ecedea7a4de9df19253ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9768dcc00aee095460ac27438e2ee85e824a9b21eee8c2fa137cd7de5de792ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55affbe6d7d089b636f77381908573fe22fb7a903776e26db373f50523280cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9009
2272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:34Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.661146 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"deab583c-05c7-4b7e-a3f6-c01081b17127\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3256d5e9aadc1c300c0388ad5b9682b874dd957f731bde6642c649b044b16ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"amed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771581313\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771581313\\\\\\\\\\\\\\\" (2026-02-20 08:55:12 +0000 UTC to 2027-02-20 08:55:12 +0000 UTC (now=2026-02-20 09:55:28.067421835 +0000 UTC))\\\\\\\"\\\\nI0220 09:55:28.067484 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0220 09:55:28.067516 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0220 09:55:28.067549 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067613 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067655 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-161790428/tls.crt::/tmp/serving-cert-161790428/tls.key\\\\\\\"\\\\nI0220 09:55:28.067781 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0220 09:55:28.067827 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067825 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067865 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0220 09:55:28.067840 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0220 09:55:28.067995 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0220 09:55:28.068006 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0220 09:55:28.068582 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:34Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.663145 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.663188 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.663202 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.663222 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.663235 4962 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:34Z","lastTransitionTime":"2026-02-20T09:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.677049 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0820106bbb2993145f688614128fa51c5778e9953ee5fc00edba04ff06f3f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:34Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.694193 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:34Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.711181 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7065599d0f9481d5becd2057dc2e557e5fb6f4ef22533c1e431708ee711c5aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:34Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.732557 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://650ed0852746d520fbaa4ec277717c5af3916ba72fb82349af01bdd53ed1916b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5aaa8582764286e5a930fbc4754f253b044e997b856840685dde84db1aef3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:34Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.747875 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wqwgj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1957ac70-30f9-48c2-a82b-72aa3b7a883a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4f1f32eda1cc801a5b1f6d84207120c48c1bcca494a88bdfa95fb05bf82f661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwxzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wqwgj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:34Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.765642 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.765670 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.765678 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.765691 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.765699 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:34Z","lastTransitionTime":"2026-02-20T09:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.771363 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2abd2b70-bb78-49a0-b930-cd066384e803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-99b2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:34Z 
is after 2025-08-24T17:21:41Z" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.783958 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:34Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.797787 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s8xxr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a431054f-57c5-41b7-93b2-2d2fbf9949ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15f0d587b054c3050286b2607687a4066cdd413e1623326cf355f9d93b10a2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9fz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s8xxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:34Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.811039 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hxb97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0e53ce-e004-473e-be85-ef4c83e399c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e690a224056186e38fa1d7f83d5380f36a05bfff839ca53a87701a721c0c7c0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c27br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hxb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:34Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.825691 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de460dbc-9080-4eb3-b509-c43e81162de4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5051fa1280c3183976930435fb2b839acf93914df2c1e93cb28ae17755d534ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f4690722fa343ac9bbe83f1591ce49040d2b6cbe966fc31757944ba3befbbce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc5a98df99b3ac300355de8efa6cafb2182a2460152990a89b4125a7aef7b850\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9264e0c637c9449882f25caf523f00112c372fad6125a2b4b1b541a515a897dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:34Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.840004 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:34Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.856424 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef72d73c-d177-4436-b681-83866e1f6d12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disa
bled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7hj8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:34Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.868427 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.868478 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.868490 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.868508 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.868521 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:34Z","lastTransitionTime":"2026-02-20T09:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.872048 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"751d5e0b-919c-4777-8475-ed7214f7647f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6a54fc4a37387541acda22f8ca81ddfe54e7a9ea682a31abf5e1ba2787f2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dffe20e45069ebd85e9ad49e365ac5180c1e43d02340190615a38900f527e432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m9d46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:34Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.886443 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wqwgj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1957ac70-30f9-48c2-a82b-72aa3b7a883a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4f1f32eda1cc801a5b1f6d84207120c48c1bcca494a88bdfa95fb05bf82f661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwxzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wqwgj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:34Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.908129 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2abd2b70-bb78-49a0-b930-cd066384e803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-99b2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:34Z 
is after 2025-08-24T17:21:41Z" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.937321 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63edef29-013b-4e23-bd29-1e62958c425a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652942c0482deea28d2e07207dcdf38e3f5638a0eea5bb58b73671e39410b99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89d70ad241c203849bc1b2aa38409e9d5a16e13831e18a4f06bb2e5012b84910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a24ab72e4a25f2b29927eeb2552b6b091c3384a85ecedea7a4de9df19253ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"
/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9768dcc00aee095460ac27438e2ee85e824a9b21eee8c2fa137cd7de5de792ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55affbe6d7d089b636f77381908573fe22fb7a903776e26db373f50523280cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\"
:\\\"2026-02-20T09:55:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:34Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.960319 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deab583c-05c7-4b7e-a3f6-c01081b17127\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3256d5e9aadc1c300c0388ad5b9682b874dd957f731bde6642c649b044b16ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"amed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed 
loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771581313\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771581313\\\\\\\\\\\\\\\" (2026-02-20 08:55:12 +0000 UTC to 2027-02-20 08:55:12 +0000 UTC (now=2026-02-20 09:55:28.067421835 +0000 UTC))\\\\\\\"\\\\nI0220 09:55:28.067484 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0220 09:55:28.067516 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0220 09:55:28.067549 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067613 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067655 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-161790428/tls.crt::/tmp/serving-cert-161790428/tls.key\\\\\\\"\\\\nI0220 09:55:28.067781 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0220 09:55:28.067827 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067825 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067865 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0220 09:55:28.067840 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0220 09:55:28.067995 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0220 09:55:28.068006 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0220 09:55:28.068582 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:34Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.971628 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.971666 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.971676 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.971691 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.971704 4962 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:34Z","lastTransitionTime":"2026-02-20T09:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.975736 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0820106bbb2993145f688614128fa51c5778e9953ee5fc00edba04ff06f3f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:34Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.987177 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:34Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:34 crc kubenswrapper[4962]: I0220 09:55:34.996509 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7065599d0f9481d5becd2057dc2e557e5fb6f4ef22533c1e431708ee711c5aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:34Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.006530 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://650ed0852746d520fbaa4ec277717c5af3916ba72fb82349af01bdd53ed1916b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5aaa8582764286e5a930fbc4754f253b044e997b856840685dde84db1aef3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:35Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.020248 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:35Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.038207 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s8xxr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a431054f-57c5-41b7-93b2-2d2fbf9949ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15f0d587b054c3050286b2607687a4066cdd413e1623326cf355f9d93b10a2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9fz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s8xxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:35Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.054271 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hxb97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0e53ce-e004-473e-be85-ef4c83e399c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e690a224056186e38fa1d7f83d5380f36a05bfff839ca53a87701a721c0c7c0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c27br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hxb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:35Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.067522 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-26 22:58:19.125569499 +0000 UTC Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.075763 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.075803 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.075817 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.075839 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.075855 4962 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:35Z","lastTransitionTime":"2026-02-20T09:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.179360 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.179419 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.179436 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.179458 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.179476 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:35Z","lastTransitionTime":"2026-02-20T09:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.284810 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.284895 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.284921 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.284958 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.284986 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:35Z","lastTransitionTime":"2026-02-20T09:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.335833 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" event={"ID":"2abd2b70-bb78-49a0-b930-cd066384e803","Type":"ContainerStarted","Data":"195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8"} Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.339786 4962 generic.go:334] "Generic (PLEG): container finished" podID="ef72d73c-d177-4436-b681-83866e1f6d12" containerID="baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84" exitCode=0 Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.339880 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" event={"ID":"ef72d73c-d177-4436-b681-83866e1f6d12","Type":"ContainerDied","Data":"baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84"} Feb 20 09:55:35 crc kubenswrapper[4962]: E0220 09:55:35.349416 4962 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.366994 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://650ed0852746d520fbaa4ec277717c5af3916ba72fb82349af01bdd53ed1916b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5aaa8582764286e5a930fbc4754f253b044e997b856840685dde84db1aef3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\"
:{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:35Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.388290 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.388343 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.388356 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.388375 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.388387 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:35Z","lastTransitionTime":"2026-02-20T09:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.392178 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wqwgj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1957ac70-30f9-48c2-a82b-72aa3b7a883a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4f1f32eda1cc801a5b1f6d84207120c48c1bcca494a88bdfa95fb05bf82f661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwxzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wqwgj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:35Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.425527 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2abd2b70-bb78-49a0-b930-cd066384e803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-99b2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:35Z 
is after 2025-08-24T17:21:41Z" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.466124 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63edef29-013b-4e23-bd29-1e62958c425a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652942c0482deea28d2e07207dcdf38e3f5638a0eea5bb58b73671e39410b99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89d70ad241c203849bc1b2aa38409e9d5a16e13831e18a4f06bb2e5012b84910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a24ab72e4a25f2b29927eeb2552b6b091c3384a85ecedea7a4de9df19253ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"
/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9768dcc00aee095460ac27438e2ee85e824a9b21eee8c2fa137cd7de5de792ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55affbe6d7d089b636f77381908573fe22fb7a903776e26db373f50523280cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\"
:\\\"2026-02-20T09:55:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:35Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.483504 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deab583c-05c7-4b7e-a3f6-c01081b17127\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3256d5e9aadc1c300c0388ad5b9682b874dd957f731bde6642c649b044b16ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"amed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed 
loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771581313\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771581313\\\\\\\\\\\\\\\" (2026-02-20 08:55:12 +0000 UTC to 2027-02-20 08:55:12 +0000 UTC (now=2026-02-20 09:55:28.067421835 +0000 UTC))\\\\\\\"\\\\nI0220 09:55:28.067484 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0220 09:55:28.067516 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0220 09:55:28.067549 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067613 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067655 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-161790428/tls.crt::/tmp/serving-cert-161790428/tls.key\\\\\\\"\\\\nI0220 09:55:28.067781 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0220 09:55:28.067827 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067825 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067865 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0220 09:55:28.067840 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0220 09:55:28.067995 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0220 09:55:28.068006 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0220 09:55:28.068582 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:35Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.491348 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.491388 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.491400 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.491418 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.491428 4962 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:35Z","lastTransitionTime":"2026-02-20T09:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.505241 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0820106bbb2993145f688614128fa51c5778e9953ee5fc00edba04ff06f3f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:35Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.527240 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:35Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.543650 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7065599d0f9481d5becd2057dc2e557e5fb6f4ef22533c1e431708ee711c5aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:35Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.562209 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:35Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.577627 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s8xxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a431054f-57c5-41b7-93b2-2d2fbf9949ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15f0d587b054c3050286b2607687a4066cdd413e1623326cf355f9d93b10a2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9fz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s8xxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-20T09:55:35Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.594503 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hxb97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0e53ce-e004-473e-be85-ef4c83e399c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e690a224056186e38fa1d7f83d5380f36a05bfff839ca53a87701a721c0c7c0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c27br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hxb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:35Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.595229 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.595270 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.595282 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.595303 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.595318 4962 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:35Z","lastTransitionTime":"2026-02-20T09:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.613860 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de460dbc-9080-4eb3-b509-c43e81162de4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5051fa1280c3183976930435fb2b839acf93914df2c1e93cb28ae17755d534ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f4690722fa343ac9bbe83f1591ce49040d2b6cbe966fc31757944ba3befbbce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc5a98df99b3ac300355de8efa6cafb2182a2460152990a89b4125a7aef7b850\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faa
f92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9264e0c637c9449882f25caf523f00112c372fad6125a2b4b1b541a515a897dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:35Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.629895 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:35Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.652268 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef72d73c-d177-4436-b681-83866e1f6d12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7hj8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:35Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.668284 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"751d5e0b-919c-4777-8475-ed7214f7647f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6a54fc4a37387541acda22f8ca81ddfe54e7a9ea682a31abf5e1ba2787f2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dffe20e45069ebd85e9ad49e365ac5180c1e43d02340190615a38900f527e432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m9d46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:35Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.697473 4962 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.697508 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.697521 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.697536 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.697549 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:35Z","lastTransitionTime":"2026-02-20T09:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.799638 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.799683 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.799696 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.799711 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.799723 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:35Z","lastTransitionTime":"2026-02-20T09:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.873767 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:55:35 crc kubenswrapper[4962]: E0220 09:55:35.873923 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:55:43.873897628 +0000 UTC m=+35.456369474 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.873973 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.874020 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.874055 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.874088 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:55:35 crc kubenswrapper[4962]: E0220 09:55:35.874148 4962 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 20 09:55:35 crc kubenswrapper[4962]: E0220 09:55:35.874179 4962 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 20 09:55:35 crc kubenswrapper[4962]: E0220 09:55:35.874185 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 20 09:55:35 crc kubenswrapper[4962]: E0220 09:55:35.874223 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 20 09:55:35 crc kubenswrapper[4962]: E0220 09:55:35.874235 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 20 09:55:35 crc kubenswrapper[4962]: E0220 09:55:35.874235 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: 
object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 20 09:55:35 crc kubenswrapper[4962]: E0220 09:55:35.874247 4962 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 09:55:35 crc kubenswrapper[4962]: E0220 09:55:35.874249 4962 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 09:55:35 crc kubenswrapper[4962]: E0220 09:55:35.874239 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-20 09:55:43.874211818 +0000 UTC m=+35.456683674 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 20 09:55:35 crc kubenswrapper[4962]: E0220 09:55:35.874324 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-20 09:55:43.874316701 +0000 UTC m=+35.456788547 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 20 09:55:35 crc kubenswrapper[4962]: E0220 09:55:35.874334 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-20 09:55:43.874329522 +0000 UTC m=+35.456801368 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 09:55:35 crc kubenswrapper[4962]: E0220 09:55:35.874353 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-20 09:55:43.874340182 +0000 UTC m=+35.456812028 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.902269 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.902305 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.902314 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.902328 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:35 crc kubenswrapper[4962]: I0220 09:55:35.902340 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:35Z","lastTransitionTime":"2026-02-20T09:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.005101 4962 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.005661 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.005690 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.005699 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.005718 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.005731 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:36Z","lastTransitionTime":"2026-02-20T09:55:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.068670 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 22:56:50.834110646 +0000 UTC Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.109499 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.109566 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.109579 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.109625 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.109637 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:36Z","lastTransitionTime":"2026-02-20T09:55:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.138938 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.139005 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.139018 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:55:36 crc kubenswrapper[4962]: E0220 09:55:36.139158 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:55:36 crc kubenswrapper[4962]: E0220 09:55:36.139336 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:55:36 crc kubenswrapper[4962]: E0220 09:55:36.139528 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.194380 4962 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.212876 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.213203 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.213335 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.213521 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.213695 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:36Z","lastTransitionTime":"2026-02-20T09:55:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.316518 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.316849 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.316981 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.317119 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.317732 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:36Z","lastTransitionTime":"2026-02-20T09:55:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.350643 4962 generic.go:334] "Generic (PLEG): container finished" podID="ef72d73c-d177-4436-b681-83866e1f6d12" containerID="db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2" exitCode=0 Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.350784 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" event={"ID":"ef72d73c-d177-4436-b681-83866e1f6d12","Type":"ContainerDied","Data":"db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2"} Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.370773 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:36Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.386564 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s8xxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a431054f-57c5-41b7-93b2-2d2fbf9949ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15f0d587b054c3050286b2607687a4066cdd413e1623326cf355f9d93b10a2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9fz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s8xxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-20T09:55:36Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.404782 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hxb97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0e53ce-e004-473e-be85-ef4c83e399c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e690a224056186e38fa1d7f83d5380f36a05bfff839ca53a87701a721c0c7c0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c27br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hxb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:36Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.419379 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef72d73c-d177-4436-b681-83866e1f6d12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"D
isabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7hj8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:36Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.421868 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.421985 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:36 crc 
kubenswrapper[4962]: I0220 09:55:36.421999 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.422023 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.422036 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:36Z","lastTransitionTime":"2026-02-20T09:55:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.432342 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"751d5e0b-919c-4777-8475-ed7214f7647f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6a54fc4a37387541acda22f8ca81ddfe54e7a9ea682a31abf5e1ba2787f2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dffe20e45069ebd85e9ad49e365ac5180c1e43d02340190615a38900f527e432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\
\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m9d46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:36Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.450716 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de460dbc-9080-4eb3-b509-c43e81162de4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5051fa1280c3183976930435fb2b839acf93914df2c1e93cb28ae17755d534ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f4690722fa343ac9bbe83f1591ce49040d2b6cbe966fc31757944ba3befbbce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"res
ource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc5a98df99b3ac300355de8efa6cafb2182a2460152990a89b4125a7aef7b850\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9264e0c637c9449882f25caf523f00112c372fad6125a2b4b1b541a515a897dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:36Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.470010 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:36Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.486935 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:36Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.501054 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7065599d0f9481d5becd2057dc2e557e5fb6f4ef22533c1e431708ee711c5aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:36Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.519671 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://650ed0852746d520fbaa4ec277717c5af3916ba72fb82349af01bdd53ed1916b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5aaa8582764286e5a930fbc4754f253b044e997b856840685dde84db1aef3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:36Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.524818 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.524869 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.524891 4962 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.524920 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.524940 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:36Z","lastTransitionTime":"2026-02-20T09:55:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.541168 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wqwgj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1957ac70-30f9-48c2-a82b-72aa3b7a883a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4f1f32eda1cc801a5b1f6d84207120c48c1bcca494a88bdfa95fb05bf82f661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\"
:\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwxzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wqwgj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:36Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.560547 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2abd2b70-bb78-49a0-b930-cd066384e803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-99b2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:36Z 
is after 2025-08-24T17:21:41Z" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.582719 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63edef29-013b-4e23-bd29-1e62958c425a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652942c0482deea28d2e07207dcdf38e3f5638a0eea5bb58b73671e39410b99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89d70ad241c203849bc1b2aa38409e9d5a16e13831e18a4f06bb2e5012b84910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a24ab72e4a25f2b29927eeb2552b6b091c3384a85ecedea7a4de9df19253ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"
/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9768dcc00aee095460ac27438e2ee85e824a9b21eee8c2fa137cd7de5de792ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55affbe6d7d089b636f77381908573fe22fb7a903776e26db373f50523280cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\"
:\\\"2026-02-20T09:55:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:36Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.598238 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deab583c-05c7-4b7e-a3f6-c01081b17127\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3256d5e9aadc1c300c0388ad5b9682b874dd957f731bde6642c649b044b16ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"amed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed 
loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771581313\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771581313\\\\\\\\\\\\\\\" (2026-02-20 08:55:12 +0000 UTC to 2027-02-20 08:55:12 +0000 UTC (now=2026-02-20 09:55:28.067421835 +0000 UTC))\\\\\\\"\\\\nI0220 09:55:28.067484 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0220 09:55:28.067516 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0220 09:55:28.067549 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067613 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067655 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-161790428/tls.crt::/tmp/serving-cert-161790428/tls.key\\\\\\\"\\\\nI0220 09:55:28.067781 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0220 09:55:28.067827 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067825 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067865 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0220 09:55:28.067840 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0220 09:55:28.067995 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0220 09:55:28.068006 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0220 09:55:28.068582 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:36Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.613494 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0820106bbb2993145f688614128fa51c5778e9953ee5fc00edba04ff06f3f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:36Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.627205 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.627248 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.627259 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.627276 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.627289 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:36Z","lastTransitionTime":"2026-02-20T09:55:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.730343 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.730441 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.730473 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.730509 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.730537 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:36Z","lastTransitionTime":"2026-02-20T09:55:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.835203 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.835659 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.835673 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.835693 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.835707 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:36Z","lastTransitionTime":"2026-02-20T09:55:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.939295 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.939366 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.939403 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.939434 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:36 crc kubenswrapper[4962]: I0220 09:55:36.939459 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:36Z","lastTransitionTime":"2026-02-20T09:55:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.042737 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.042806 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.042819 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.042842 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.042857 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:37Z","lastTransitionTime":"2026-02-20T09:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.068863 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 17:00:57.535686837 +0000 UTC Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.145160 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.145210 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.145225 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.145246 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.145259 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:37Z","lastTransitionTime":"2026-02-20T09:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.248074 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.248129 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.248141 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.248158 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.248174 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:37Z","lastTransitionTime":"2026-02-20T09:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.352140 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.352192 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.352215 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.352251 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.352274 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:37Z","lastTransitionTime":"2026-02-20T09:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.360530 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" event={"ID":"2abd2b70-bb78-49a0-b930-cd066384e803","Type":"ContainerStarted","Data":"1321cb168512a793cb7cee1cd5a9e56cc5d428f8156a690cb6144a8cd78f9b9f"} Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.360833 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.360887 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.367904 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" event={"ID":"ef72d73c-d177-4436-b681-83866e1f6d12","Type":"ContainerStarted","Data":"dfad5fd92783e0af12f28bd81ccc67f1cf757d57723d98f8fea4f02dc0fea8b3"} Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.386681 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deab583c-05c7-4b7e-a3f6-c01081b17127\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3256d5e9aadc1c300c0388ad5b9682b874dd957f731bde6642c649b044b16ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"amed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed 
loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771581313\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771581313\\\\\\\\\\\\\\\" (2026-02-20 08:55:12 +0000 UTC to 2027-02-20 08:55:12 +0000 UTC (now=2026-02-20 09:55:28.067421835 +0000 UTC))\\\\\\\"\\\\nI0220 09:55:28.067484 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0220 09:55:28.067516 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0220 09:55:28.067549 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067613 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067655 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-161790428/tls.crt::/tmp/serving-cert-161790428/tls.key\\\\\\\"\\\\nI0220 09:55:28.067781 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0220 09:55:28.067827 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067825 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067865 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0220 09:55:28.067840 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0220 09:55:28.067995 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0220 09:55:28.068006 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0220 09:55:28.068582 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:37Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.401438 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.406949 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.413356 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0820106bbb2993145f688614128fa51c5778e9953ee5fc00edba04ff06f3f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:37Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.437987 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:37Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.454274 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7065599d0f9481d5becd2057dc2e557e5fb6f4ef22533c1e431708ee711c5aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:37Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.455429 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.455490 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.455515 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.455547 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.455571 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:37Z","lastTransitionTime":"2026-02-20T09:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.472176 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://650ed0852746d520fbaa4ec277717c5af3916ba72fb82349af01bdd53ed1916b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5aaa8582764286e5a930fbc4754f253b044e997b856840685dde84db1aef3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"nam
e\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:37Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.487914 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wqwgj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1957ac70-30f9-48c2-a82b-72aa3b7a883a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4f1f32eda1cc801a5b1f6d84207120c48c1bcca494a88bdfa95fb05bf82f661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPa
th\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwxzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wqwgj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:37Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.510681 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2abd2b70-bb78-49a0-b930-cd066384e803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1321cb168512a793cb7cee1cd5a9e56cc5d428f8
156a690cb6144a8cd78f9b9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccou
nt\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-99b2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:37Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.535966 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"63edef29-013b-4e23-bd29-1e62958c425a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652942c0482deea28d2e07207dcdf38e3f5638a0eea5bb58b73671e39410b99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89d70ad241c203849bc1b2aa38409e9d5a16e13831e18a4f06bb2e5012b84910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a24ab72e4a25f2b29927eeb2552b6b091c3384a85ecedea7a4de9df19253ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9768dcc00aee095460ac27438e2ee85e824a9b2
1eee8c2fa137cd7de5de792ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55affbe6d7d089b636f77381908573fe22fb7a903776e26db373f50523280cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:37Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.552838 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s8xxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a431054f-57c5-41b7-93b2-2d2fbf9949ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15f0d587b054c3050286b2607687a4066cdd413e1623326cf355f9d93b10a2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9fz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\
\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s8xxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:37Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.558220 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.558281 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.558305 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.558351 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.558372 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:37Z","lastTransitionTime":"2026-02-20T09:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.568311 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hxb97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0e53ce-e004-473e-be85-ef4c83e399c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e690a224056186e38fa1d7f83d5380f36a05bfff839ca53a87701a721c0c7c0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c27br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hxb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:37Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.586570 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:37Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.600340 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:37Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.622741 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef72d73c-d177-4436-b681-83866e1f6d12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7hj8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:37Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.638280 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"751d5e0b-919c-4777-8475-ed7214f7647f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6a54fc4a37387541acda22f8ca81ddfe54e7a9ea682a31abf5e1ba2787f2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dffe20e45069ebd85e9ad49e365ac5180c1e43d02340190615a38900f527e432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m9d46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:37Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.657522 4962 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de460dbc-9080-4eb3-b509-c43e81162de4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5051fa1280c3183976930435fb2b839acf93914df2c1e93cb28ae17755d534ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f4690722fa343ac9bbe83f1591ce49040d2b6cbe966fc31757944ba3befbbce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc5a98df99b3ac300355de8efa6cafb2182a2460152990a89b4125a7aef7b850\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9264e0c637c9449882f25caf523
f00112c372fad6125a2b4b1b541a515a897dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:37Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.661244 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.661316 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.661343 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.661380 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.661404 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:37Z","lastTransitionTime":"2026-02-20T09:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.680519 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wqwgj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1957ac70-30f9-48c2-a82b-72aa3b7a883a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4f1f32eda1cc801a5b1f6d84207120c48c1bcca494a88bdfa95fb05bf82f661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwxzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wqwgj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:37Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.712025 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2abd2b70-bb78-49a0-b930-cd066384e803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mou
ntPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\
\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1321cb168512a793cb7cee1cd5a9e56cc5d428f8156a690cb6144a8cd78f9b9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\
\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-99b2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:37Z is 
after 2025-08-24T17:21:41Z" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.740650 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63edef29-013b-4e23-bd29-1e62958c425a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652942c0482deea28d2e07207dcdf38e3f5638a0eea5bb58b73671e39410b99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89d70ad241c203849bc1b2aa38409e9d5a16e13831e18a4f06bb2e5012b84910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a24ab72e4a25f2b29927eeb2552b6b091c3384a85ecedea7a4de9df19253ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/va
r/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9768dcc00aee095460ac27438e2ee85e824a9b21eee8c2fa137cd7de5de792ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55affbe6d7d089b636f77381908573fe22fb7a903776e26db373f50523280cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\
\"2026-02-20T09:55:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:37Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.765078 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.765140 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.765161 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.765187 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.765207 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:37Z","lastTransitionTime":"2026-02-20T09:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.793616 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deab583c-05c7-4b7e-a3f6-c01081b17127\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3256d5e9aadc1c300c0388ad5b9682b874dd957f731bde6642c649b044b16ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"amed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771581313\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771581313\\\\\\\\\\\\\\\" (2026-02-20 08:55:12 +0000 UTC to 2027-02-20 08:55:12 +0000 UTC (now=2026-02-20 09:55:28.067421835 +0000 UTC))\\\\\\\"\\\\nI0220 09:55:28.067484 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0220 09:55:28.067516 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0220 09:55:28.067549 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067613 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067655 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-161790428/tls.crt::/tmp/serving-cert-161790428/tls.key\\\\\\\"\\\\nI0220 09:55:28.067781 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0220 09:55:28.067827 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067825 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067865 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0220 09:55:28.067840 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0220 09:55:28.067995 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0220 09:55:28.068006 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0220 09:55:28.068582 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:37Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.816475 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0820106bbb2993145f688614128fa51c5778e9953ee5fc00edba04ff06f3f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:37Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.830614 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:37Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.843871 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7065599d0f9481d5becd2057dc2e557e5fb6f4ef22533c1e431708ee711c5aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:37Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.858207 4962 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://650ed0852746d520fbaa4ec277717c5af3916ba72fb82349af01bdd53ed1916b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5aaa8582764286e5a930fbc4754f253b044e997b856840685dde84db1aef3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:37Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.867823 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.867856 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:37 crc kubenswrapper[4962]: 
I0220 09:55:37.867864 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.867878 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.867890 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:37Z","lastTransitionTime":"2026-02-20T09:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.872278 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:37Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.883781 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s8xxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a431054f-57c5-41b7-93b2-2d2fbf9949ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15f0d587b054c3050286b2607687a4066cdd413e1623326cf355f9d93b10a2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9fz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s8xxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-20T09:55:37Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.895469 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hxb97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0e53ce-e004-473e-be85-ef4c83e399c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e690a224056186e38fa1d7f83d5380f36a05bfff839ca53a87701a721c0c7c0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c27br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hxb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:37Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.908797 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de460dbc-9080-4eb3-b509-c43e81162de4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5051fa1280c3183976930435fb2b839acf93914df2c1e93cb28ae17755d534ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f4690722fa343ac9bbe83f1591ce49040d2b6cbe966fc31757944ba3befbbce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc5a98df99b3ac300355de8efa6cafb2182a2460152990a89b4125a7aef7b850\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9264e0c637c9449882f25caf523f00112c372fad6125a2b4b1b541a515a897dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:37Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.922409 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:37Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.944682 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef72d73c-d177-4436-b681-83866e1f6d12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfad5fd92783e0af12f28bd81ccc67f1cf757d57723d98f8fea4f02dc0fea8b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"
system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7hj8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:37Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.955904 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"751d5e0b-919c-4777-8475-ed7214f7647f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6a54fc4a37387541acda22f8ca81ddfe54e7a9ea682a31abf5e1ba2787f2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dffe20e45069ebd85e9ad49e365ac5180c1e43d02340190615a38900f527e432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootf
s\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m9d46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:37Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.972734 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.972779 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.972790 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.972810 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:37 crc kubenswrapper[4962]: I0220 09:55:37.972826 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:37Z","lastTransitionTime":"2026-02-20T09:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.069677 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-26 23:36:11.659867331 +0000 UTC Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.075989 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.076041 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.076054 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.076077 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.076096 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:38Z","lastTransitionTime":"2026-02-20T09:55:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.138512 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:55:38 crc kubenswrapper[4962]: E0220 09:55:38.138842 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.139714 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:55:38 crc kubenswrapper[4962]: E0220 09:55:38.139863 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.139963 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:55:38 crc kubenswrapper[4962]: E0220 09:55:38.140070 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.180199 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.180246 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.180254 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.180274 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.180284 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:38Z","lastTransitionTime":"2026-02-20T09:55:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.299150 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.299199 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.299214 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.299236 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.299247 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:38Z","lastTransitionTime":"2026-02-20T09:55:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.380100 4962 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.402235 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.402276 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.402290 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.402328 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.402344 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:38Z","lastTransitionTime":"2026-02-20T09:55:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.505159 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.505203 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.505211 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.505232 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.505246 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:38Z","lastTransitionTime":"2026-02-20T09:55:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.608165 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.608219 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.608230 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.608246 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.608255 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:38Z","lastTransitionTime":"2026-02-20T09:55:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.711189 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.711214 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.711225 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.711240 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.711249 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:38Z","lastTransitionTime":"2026-02-20T09:55:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.813666 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.813768 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.813787 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.813814 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.813831 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:38Z","lastTransitionTime":"2026-02-20T09:55:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.917027 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.917091 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.917109 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.917137 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:38 crc kubenswrapper[4962]: I0220 09:55:38.917155 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:38Z","lastTransitionTime":"2026-02-20T09:55:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.020374 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.020450 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.020478 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.020511 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.020532 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:39Z","lastTransitionTime":"2026-02-20T09:55:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.070208 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 07:18:38.889332968 +0000 UTC Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.123624 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.123673 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.123685 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.123702 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.123713 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:39Z","lastTransitionTime":"2026-02-20T09:55:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.160534 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wqwgj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1957ac70-30f9-48c2-a82b-72aa3b7a883a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4f1f32eda1cc801a5b1f6d84207120c48c1bcca494a88bdfa95fb05bf82f661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\
\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwxzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wqwgj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:39Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.190907 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2abd2b70-bb78-49a0-b930-cd066384e803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1321cb168512a793cb7cee1cd5a9e56cc5d428f8
156a690cb6144a8cd78f9b9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-99b2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:39Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.219212 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"63edef29-013b-4e23-bd29-1e62958c425a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652942c0482deea28d2e07207dcdf38e3f5638a0eea5bb58b73671e39410b99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89d70ad241c203849bc1b2aa38409e9d5a16e13831e18a4f06bb2e5012b84910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a24ab72e4a25f2b29927eeb2552b6b091c3384a85ecedea7a4de9df19253ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9768dcc00aee095460ac27438e2ee85e824a9b2
1eee8c2fa137cd7de5de792ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55affbe6d7d089b636f77381908573fe22fb7a903776e26db373f50523280cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:39Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.226967 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.227078 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.227107 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.227206 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.227304 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:39Z","lastTransitionTime":"2026-02-20T09:55:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.240097 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deab583c-05c7-4b7e-a3f6-c01081b17127\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3256d5e9aadc1c300c0388ad5b9682b874dd957f731bde6642c649b044b16ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"amed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771581313\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771581313\\\\\\\\\\\\\\\" (2026-02-20 08:55:12 +0000 UTC to 2027-02-20 08:55:12 +0000 UTC (now=2026-02-20 09:55:28.067421835 +0000 UTC))\\\\\\\"\\\\nI0220 09:55:28.067484 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0220 09:55:28.067516 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0220 09:55:28.067549 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067613 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067655 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-161790428/tls.crt::/tmp/serving-cert-161790428/tls.key\\\\\\\"\\\\nI0220 09:55:28.067781 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0220 09:55:28.067827 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067825 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067865 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0220 09:55:28.067840 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0220 09:55:28.067995 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0220 09:55:28.068006 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0220 09:55:28.068582 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:39Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.262015 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0820106bbb2993145f688614128fa51c5778e9953ee5fc00edba04ff06f3f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:39Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.284183 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:39Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.306273 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7065599d0f9481d5becd2057dc2e557e5fb6f4ef22533c1e431708ee711c5aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:39Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.326571 4962 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://650ed0852746d520fbaa4ec277717c5af3916ba72fb82349af01bdd53ed1916b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5aaa8582764286e5a930fbc4754f253b044e997b856840685dde84db1aef3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:39Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.330815 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.330884 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:39 crc kubenswrapper[4962]: 
I0220 09:55:39.330906 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.331104 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.331138 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:39Z","lastTransitionTime":"2026-02-20T09:55:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.343440 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:39Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.357519 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s8xxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a431054f-57c5-41b7-93b2-2d2fbf9949ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15f0d587b054c3050286b2607687a4066cdd413e1623326cf355f9d93b10a2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9fz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s8xxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-20T09:55:39Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.373113 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hxb97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0e53ce-e004-473e-be85-ef4c83e399c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e690a224056186e38fa1d7f83d5380f36a05bfff839ca53a87701a721c0c7c0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c27br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hxb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:39Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.383264 4962 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.396272 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de460dbc-9080-4eb3-b509-c43e81162de4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5051fa1280c3183976930435fb2b839acf93914df2c1e93cb28ae17755d534ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f4690722fa343ac9bbe83f1591ce49040d2b6cbe966fc31757944ba3befbbce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc5a98df99b3ac300355de8efa6cafb2182a2460152990a89b4125a7aef7b850\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9264e0c637c9449882f25caf523f00112c372fad6125a2b4b1b541a515a897dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:39Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.416536 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:39Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.434473 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.434521 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.434533 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.434555 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.434567 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:39Z","lastTransitionTime":"2026-02-20T09:55:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.445845 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef72d73c-d177-4436-b681-83866e1f6d12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfad5fd92783e0af12f28bd81ccc67f1cf757d57723d98f8fea4f02dc0fea8b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7hj8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:39Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.468169 4962 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-m9d46" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"751d5e0b-919c-4777-8475-ed7214f7647f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6a54fc4a37387541acda22f8ca81ddfe54e7a9ea682a31abf5e1ba2787f2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dffe20e45069ebd85e9ad49e365ac5180c1e43d02340190615a38900f527e432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m9d46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:39Z is after 2025-08-24T17:21:41Z" Feb 20 
09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.537666 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.537722 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.537733 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.537753 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.537765 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:39Z","lastTransitionTime":"2026-02-20T09:55:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.640860 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.640906 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.640916 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.640934 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.640946 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:39Z","lastTransitionTime":"2026-02-20T09:55:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.743682 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.743731 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.743742 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.743759 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.743773 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:39Z","lastTransitionTime":"2026-02-20T09:55:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.846221 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.846295 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.846312 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.846345 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.846368 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:39Z","lastTransitionTime":"2026-02-20T09:55:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.950020 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.950062 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.950073 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.950092 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:39 crc kubenswrapper[4962]: I0220 09:55:39.950106 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:39Z","lastTransitionTime":"2026-02-20T09:55:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.012331 4962 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.053058 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.053119 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.053192 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.053222 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.053245 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:40Z","lastTransitionTime":"2026-02-20T09:55:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.071157 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 12:43:56.344870697 +0000 UTC Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.138227 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.138303 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.138240 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:55:40 crc kubenswrapper[4962]: E0220 09:55:40.138408 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:55:40 crc kubenswrapper[4962]: E0220 09:55:40.138733 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:55:40 crc kubenswrapper[4962]: E0220 09:55:40.138585 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.155644 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.155716 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.155729 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.155752 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.155769 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:40Z","lastTransitionTime":"2026-02-20T09:55:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.259302 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.259400 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.259419 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.259438 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.259478 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:40Z","lastTransitionTime":"2026-02-20T09:55:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.363115 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.363175 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.363186 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.363207 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.363216 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:40Z","lastTransitionTime":"2026-02-20T09:55:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.467264 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.467330 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.467344 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.467366 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.467379 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:40Z","lastTransitionTime":"2026-02-20T09:55:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.571336 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.571378 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.571409 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.571456 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.571470 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:40Z","lastTransitionTime":"2026-02-20T09:55:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.674647 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.674724 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.674745 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.674775 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.674795 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:40Z","lastTransitionTime":"2026-02-20T09:55:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.779273 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.779387 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.779415 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.779454 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.779482 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:40Z","lastTransitionTime":"2026-02-20T09:55:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.882642 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.882736 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.882762 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.882794 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.882819 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:40Z","lastTransitionTime":"2026-02-20T09:55:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.986567 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.986831 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.986885 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.986961 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:40 crc kubenswrapper[4962]: I0220 09:55:40.986986 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:40Z","lastTransitionTime":"2026-02-20T09:55:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.071653 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 19:26:48.70432122 +0000 UTC Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.091015 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.091112 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.091126 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.091156 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.091172 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:41Z","lastTransitionTime":"2026-02-20T09:55:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.108486 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.136842 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deab583c-05c7-4b7e-a3f6-c01081b17127\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt
\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3256d5e9aadc1c300c0388ad5b9682b874dd957f731bde6642c649b044b16ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"amed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771581313\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771581313\\\\\\\\\\\\\\\" (2026-02-20 08:55:12 +0000 UTC to 2027-02-20 08:55:12 +0000 UTC (now=2026-02-20 09:55:28.067421835 +0000 UTC))\\\\\\\"\\\\nI0220 09:55:28.067484 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0220 09:55:28.067516 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0220 09:55:28.067549 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067613 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067655 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-161790428/tls.crt::/tmp/serving-cert-161790428/tls.key\\\\\\\"\\\\nI0220 09:55:28.067781 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0220 09:55:28.067827 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067825 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067865 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0220 09:55:28.067840 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0220 09:55:28.067995 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0220 09:55:28.068006 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0220 09:55:28.068582 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:41Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.163184 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0820106bbb2993145f688614128fa51c5778e9953ee5fc00edba04ff06f3f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:41Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.185439 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:41Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.193631 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.193666 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.193679 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.193697 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.193709 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:41Z","lastTransitionTime":"2026-02-20T09:55:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.203425 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7065599d0f9481d5becd2057dc2e557e5fb6f4ef22533c1e431708ee711c5aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:41Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.225033 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://650ed0852746d520fbaa4ec277717c5af3916ba72fb82349af01bdd53ed1916b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5aaa8582764286e5a930fbc4754f253b044e997b856840685dde84db1aef3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:41Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.247321 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wqwgj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1957ac70-30f9-48c2-a82b-72aa3b7a883a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4f1f32eda1cc801a5b1f6d84207120c48c1bcca494a88bdfa95fb05bf82f661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwxzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wqwgj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:41Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.280790 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2abd2b70-bb78-49a0-b930-cd066384e803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cb86
4fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1321cb168512a793cb7cee1cd5a9e56cc5d428f8156a690cb6144a8cd78f9b9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\
\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-99b2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:41Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.296958 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 
20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.297028 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.297050 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.297080 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.297099 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:41Z","lastTransitionTime":"2026-02-20T09:55:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.307428 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63edef29-013b-4e23-bd29-1e62958c425a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652942c0482deea28d2e07207dcdf38e3f5638a0eea5bb58b73671e39410b99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89d70ad241c203849bc1b2aa38409e9d5a16e13831e18a4f06bb2e5012b84910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"run
ning\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a24ab72e4a25f2b29927eeb2552b6b091c3384a85ecedea7a4de9df19253ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9768dcc00aee095460ac27438e2ee85e824a9b21eee8c2fa137cd7de5de792ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55affbe6d7d089b636f77381908573fe22fb7a903776e26db373f50523280cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcaf
e88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:41Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.324959 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s8xxr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a431054f-57c5-41b7-93b2-2d2fbf9949ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15f0d587b054c3050286b2607687a4066cdd413e1623326cf355f9d93b10a2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9fz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s8xxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:41Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.338696 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hxb97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0e53ce-e004-473e-be85-ef4c83e399c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e690a224056186e38fa1d7f83d5380f36a05bfff839ca53a87701a721c0c7c0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c27br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hxb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:41Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.362408 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:41Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.385806 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:41Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.397035 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-99b2s_2abd2b70-bb78-49a0-b930-cd066384e803/ovnkube-controller/0.log" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.399543 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.399579 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.399611 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.399630 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.399644 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:41Z","lastTransitionTime":"2026-02-20T09:55:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.401657 4962 generic.go:334] "Generic (PLEG): container finished" podID="2abd2b70-bb78-49a0-b930-cd066384e803" containerID="1321cb168512a793cb7cee1cd5a9e56cc5d428f8156a690cb6144a8cd78f9b9f" exitCode=1 Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.401699 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" event={"ID":"2abd2b70-bb78-49a0-b930-cd066384e803","Type":"ContainerDied","Data":"1321cb168512a793cb7cee1cd5a9e56cc5d428f8156a690cb6144a8cd78f9b9f"} Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.402511 4962 scope.go:117] "RemoveContainer" containerID="1321cb168512a793cb7cee1cd5a9e56cc5d428f8156a690cb6144a8cd78f9b9f" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.419584 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef72d73c-d177-4436-b681-83866e1f6d12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfad5fd92783e0af12f28bd81ccc67f1cf757d57723d98f8fea4f02dc0fea8b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\
",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnl
y\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7hj8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:41Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.442303 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"751d5e0b-919c-4777-8475-ed7214f7647f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6a54fc4a37387541acda22f8ca81ddfe54e7a9ea682a31abf5e1ba2787f2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dffe20e45069ebd85e9ad49e365ac5180c1e43d02340190615a38900f527e432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\
\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m9d46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:41Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.461113 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de460dbc-9080-4eb3-b509-c43e81162de4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5051fa1280c3183976930435fb2b839acf93914df2c1e93cb28ae17755d534ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f4690722fa343ac9bbe83f1591ce49040d2b6cbe966fc31757944ba3befbbce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc5a98df99b3ac300355de8efa6cafb2182a2460152990a89b4125a7aef7b8
50\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9264e0c637c9449882f25caf523f00112c372fad6125a2b4b1b541a515a897dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:41Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.485204 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wqwgj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1957ac70-30f9-48c2-a82b-72aa3b7a883a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4f1f32eda1cc801a5b1f6d84207120c48c1bcca494a88bdfa95fb05bf82f661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwxzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wqwgj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:41Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.503736 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.503796 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.503819 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.503848 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.503867 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:41Z","lastTransitionTime":"2026-02-20T09:55:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.511745 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2abd2b70-bb78-49a0-b930-cd066384e803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1321cb168512a793cb7cee1cd5a9e56cc5d428f8
156a690cb6144a8cd78f9b9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1321cb168512a793cb7cee1cd5a9e56cc5d428f8156a690cb6144a8cd78f9b9f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T09:55:40Z\\\",\\\"message\\\":\\\"55:40.017718 6286 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0220 09:55:40.017554 6286 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0220 09:55:40.017800 6286 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:55:40.017620 6286 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0220 09:55:40.018093 6286 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:55:40.018544 6286 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:55:40.018615 6286 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:55:40.018793 6286 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:55:40.019127 6286 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from 
k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-99b2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:41Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.542467 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63edef29-013b-4e23-bd29-1e62958c425a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652942c0482deea28d2e07207dcdf38e3f5638a0eea5bb58b73671e39410b99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89d70ad241c203849bc1b2aa38409e9d5a16e13831e18
a4f06bb2e5012b84910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a24ab72e4a25f2b29927eeb2552b6b091c3384a85ecedea7a4de9df19253ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9768dcc00aee095460ac27438e2ee85e824a9b21eee8c2fa137cd7de5de792ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55affbe6d7d089b636f77381908573fe22fb7a903776e26db373f50523280cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"image\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:41Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.561470 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"deab583c-05c7-4b7e-a3f6-c01081b17127\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3256d5e9aadc1c300c0388ad5b9682b874dd957f731bde6642c649b044b16ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"amed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771581313\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771581313\\\\\\\\\\\\\\\" (2026-02-20 08:55:12 +0000 UTC to 2027-02-20 08:55:12 +0000 UTC (now=2026-02-20 09:55:28.067421835 +0000 UTC))\\\\\\\"\\\\nI0220 09:55:28.067484 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0220 09:55:28.067516 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0220 09:55:28.067549 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067613 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067655 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-161790428/tls.crt::/tmp/serving-cert-161790428/tls.key\\\\\\\"\\\\nI0220 09:55:28.067781 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0220 09:55:28.067827 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067825 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067865 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0220 09:55:28.067840 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0220 09:55:28.067995 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0220 09:55:28.068006 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0220 09:55:28.068582 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:41Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.577071 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0820106bbb2993145f688614128fa51c5778e9953ee5fc00edba04ff06f3f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:41Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.593128 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:41Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.605584 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7065599d0f9481d5becd2057dc2e557e5fb6f4ef22533c1e431708ee711c5aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:41Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.606159 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.606239 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.606259 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.606285 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.606305 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:41Z","lastTransitionTime":"2026-02-20T09:55:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.616423 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://650ed0852746d520fbaa4ec277717c5af3916ba72fb82349af01bdd53ed1916b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5aaa8582764286e5a930fbc4754f253b044e997b856840685dde84db1aef3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"nam
e\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:41Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.627865 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:41Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.637250 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s8xxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a431054f-57c5-41b7-93b2-2d2fbf9949ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15f0d587b054c3050286b2607687a4066cdd413e1623326cf355f9d93b10a2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9fz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s8xxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-20T09:55:41Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.647195 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hxb97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0e53ce-e004-473e-be85-ef4c83e399c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e690a224056186e38fa1d7f83d5380f36a05bfff839ca53a87701a721c0c7c0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c27br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hxb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:41Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.660259 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de460dbc-9080-4eb3-b509-c43e81162de4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5051fa1280c3183976930435fb2b839acf93914df2c1e93cb28ae17755d534ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f4690722fa343ac9bbe83f1591ce49040d2b6cbe966fc31757944ba3befbbce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc5a98df99b3ac300355de8efa6cafb2182a2460152990a89b4125a7aef7b850\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9264e0c637c9449882f25caf523f00112c372fad6125a2b4b1b541a515a897dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:41Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.671823 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:41Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.684946 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef72d73c-d177-4436-b681-83866e1f6d12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfad5fd92783e0af12f28bd81ccc67f1cf757d57723d98f8fea4f02dc0fea8b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"
system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7hj8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:41Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.699677 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"751d5e0b-919c-4777-8475-ed7214f7647f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6a54fc4a37387541acda22f8ca81ddfe54e7a9ea682a31abf5e1ba2787f2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dffe20e45069ebd85e9ad49e365ac5180c1e43d02340190615a38900f527e432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootf
s\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m9d46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:41Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.709802 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.709846 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.709856 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.709877 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.709889 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:41Z","lastTransitionTime":"2026-02-20T09:55:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.812903 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.812973 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.812994 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.813022 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.813042 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:41Z","lastTransitionTime":"2026-02-20T09:55:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.916184 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.916230 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.916244 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.916266 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:41 crc kubenswrapper[4962]: I0220 09:55:41.916280 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:41Z","lastTransitionTime":"2026-02-20T09:55:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.019460 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.019503 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.019517 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.019538 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.019554 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:42Z","lastTransitionTime":"2026-02-20T09:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.072891 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 22:38:13.928021397 +0000 UTC Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.121985 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.122042 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.122056 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.122078 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.122119 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:42Z","lastTransitionTime":"2026-02-20T09:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.138691 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.138762 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:55:42 crc kubenswrapper[4962]: E0220 09:55:42.138807 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.138691 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:55:42 crc kubenswrapper[4962]: E0220 09:55:42.139003 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:55:42 crc kubenswrapper[4962]: E0220 09:55:42.138922 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.225195 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.225243 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.225255 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.225275 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.225291 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:42Z","lastTransitionTime":"2026-02-20T09:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.328088 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.328136 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.328149 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.328171 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.328184 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:42Z","lastTransitionTime":"2026-02-20T09:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.409613 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-99b2s_2abd2b70-bb78-49a0-b930-cd066384e803/ovnkube-controller/0.log" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.412633 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" event={"ID":"2abd2b70-bb78-49a0-b930-cd066384e803","Type":"ContainerStarted","Data":"cb7d5fb3e51f950f700ffdbd70535728803406441447065be85ac898397761b1"} Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.412799 4962 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.429379 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://650ed0852746d520fbaa4ec277717c5af3916ba72fb82349af01bdd53ed1916b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5aaa8582764286e5a930fbc4754f253b044e997b856840685dde84db1aef3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:42Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.431384 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.431447 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.431466 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.431859 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.431885 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:42Z","lastTransitionTime":"2026-02-20T09:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.445472 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wqwgj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1957ac70-30f9-48c2-a82b-72aa3b7a883a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4f1f32eda1cc801a5b1f6d84207120c48c1bcca494a88bdfa95fb05bf82f661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwxzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wqwgj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:42Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.468905 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2abd2b70-bb78-49a0-b930-cd066384e803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cb86
4fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb7d5fb3e51f950f700ffdbd70535728803406441447065be85ac898397761b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1321cb168512a793cb7cee1cd5a9e56cc5d428f8156a690cb6144a8cd78f9b9f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T09:55:40Z\\\",\\\"message\\\":\\\"55:40.017718 6286 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0220 09:55:40.017554 6286 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0220 09:55:40.017800 6286 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:55:40.017620 6286 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0220 09:55:40.018093 6286 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:55:40.018544 6286 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:55:40.018615 6286 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:55:40.018793 6286 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:55:40.019127 6286 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from 
k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initConta
inerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-99b2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:42Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.489972 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"63edef29-013b-4e23-bd29-1e62958c425a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652942c0482deea28d2e07207dcdf38e3f5638a0eea5bb58b73671e39410b99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89d70ad241c203849bc1b2aa38409e9d5a16e13831e18a4f06bb2e5012b84910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a24ab72e4a25f2b29927eeb2552b6b091c3384a85ecedea7a4de9df19253ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9768dcc00aee095460ac27438e2ee85e824a9b2
1eee8c2fa137cd7de5de792ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55affbe6d7d089b636f77381908573fe22fb7a903776e26db373f50523280cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:42Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.504546 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deab583c-05c7-4b7e-a3f6-c01081b17127\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c5afbd1e10c9db692f39fa51d7cc54
922933431495d1bb3224b830ee773a01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3256d5e9aadc1c300c0388ad5b9682b874dd957f731bde6642c649b044b16ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"amed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771581313\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771581313\\\\\\\\\\\\\\\" (2026-02-20 08:55:12 +0000 UTC to 2027-02-20 08:55:12 +0000 UTC (now=2026-02-20 09:55:28.067421835 +0000 UTC))\\\\\\\"\\\\nI0220 09:55:28.067484 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0220 09:55:28.067516 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0220 09:55:28.067549 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067613 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067655 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-161790428/tls.crt::/tmp/serving-cert-161790428/tls.key\\\\\\\"\\\\nI0220 09:55:28.067781 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0220 09:55:28.067827 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067825 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067865 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0220 09:55:28.067840 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0220 09:55:28.067995 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0220 09:55:28.068006 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0220 09:55:28.068582 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": 
tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:42Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.518071 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0820106bbb2993145f688614128fa51c5778e9953ee5fc00edba04ff06f3f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:42Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.531909 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:42Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.535173 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.535224 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.535239 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.535262 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.535282 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:42Z","lastTransitionTime":"2026-02-20T09:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.553151 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7065599d0f9481d5becd2057dc2e557e5fb6f4ef22533c1e431708ee711c5aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:42Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.570087 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:42Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.583101 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s8xxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a431054f-57c5-41b7-93b2-2d2fbf9949ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15f0d587b054c3050286b2607687a4066cdd413e1623326cf355f9d93b10a2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9fz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.
11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s8xxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:42Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.599079 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hxb97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0e53ce-e004-473e-be85-ef4c83e399c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e690a224056186e38fa1d7f83d5380f36a05bfff839ca53a87701a721c0c7c0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c27br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hxb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:42Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.611863 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de460dbc-9080-4eb3-b509-c43e81162de4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5051fa1280c3183976930435fb2b839acf93914df2c1e93cb28ae17755d534ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f4690722fa343ac9bbe83f1591ce49040d2b6cbe966fc31757944ba3befbbce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc5a98df99b3ac300355de8efa6cafb2182a2460152990a89b4125a7aef7b850\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9264e0c637c9449882f25caf523f00112c372fad6125a2b4b1b541a515a897dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:42Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.624160 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:42Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.637695 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.637738 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.637749 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.637764 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.637775 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:42Z","lastTransitionTime":"2026-02-20T09:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.644236 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef72d73c-d177-4436-b681-83866e1f6d12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfad5fd92783e0af12f28bd81ccc67f1cf757d57723d98f8fea4f02dc0fea8b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7hj8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:42Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.655981 4962 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-m9d46" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"751d5e0b-919c-4777-8475-ed7214f7647f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6a54fc4a37387541acda22f8ca81ddfe54e7a9ea682a31abf5e1ba2787f2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dffe20e45069ebd85e9ad49e365ac5180c1e43d02340190615a38900f527e432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m9d46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:42Z is after 2025-08-24T17:21:41Z" Feb 20 
09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.740657 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.740695 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.740705 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.740720 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.740730 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:42Z","lastTransitionTime":"2026-02-20T09:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.829443 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-htkbf"] Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.830101 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-htkbf" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.832756 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.833675 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.843661 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.843666 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tx5m2\" (UniqueName: \"kubernetes.io/projected/8526746c-450b-4df8-8ea1-f0cbabd13894-kube-api-access-tx5m2\") pod \"ovnkube-control-plane-749d76644c-htkbf\" (UID: \"8526746c-450b-4df8-8ea1-f0cbabd13894\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-htkbf" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.843729 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8526746c-450b-4df8-8ea1-f0cbabd13894-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-htkbf\" (UID: \"8526746c-450b-4df8-8ea1-f0cbabd13894\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-htkbf" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.843694 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.843769 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8526746c-450b-4df8-8ea1-f0cbabd13894-ovnkube-config\") pod 
\"ovnkube-control-plane-749d76644c-htkbf\" (UID: \"8526746c-450b-4df8-8ea1-f0cbabd13894\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-htkbf" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.843787 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.843848 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.844173 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:42Z","lastTransitionTime":"2026-02-20T09:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.843926 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8526746c-450b-4df8-8ea1-f0cbabd13894-env-overrides\") pod \"ovnkube-control-plane-749d76644c-htkbf\" (UID: \"8526746c-450b-4df8-8ea1-f0cbabd13894\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-htkbf" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.853577 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://650ed0852746d520fbaa4ec277717c5af3916ba72fb82349af01bdd53ed1916b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5aaa8582764286e5a930fbc4754f253b044e997b856840685dde84db1aef3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\
\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:42Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.870437 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wqwgj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1957ac70-30f9-48c2-a82b-72aa3b7a883a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4f1f32eda1cc801a5b1f6d84207120c48c1bcca494a88bdfa95fb05bf82f661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\
"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwxzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wqwgj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:42Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.895561 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2abd2b70-bb78-49a0-b930-cd066384e803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb7d5fb3e51f950f700ffdbd7053572880340644
1447065be85ac898397761b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1321cb168512a793cb7cee1cd5a9e56cc5d428f8156a690cb6144a8cd78f9b9f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T09:55:40Z\\\",\\\"message\\\":\\\"55:40.017718 6286 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0220 09:55:40.017554 6286 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0220 09:55:40.017800 6286 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:55:40.017620 6286 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0220 09:55:40.018093 6286 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:55:40.018544 6286 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:55:40.018615 6286 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:55:40.018793 6286 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:55:40.019127 6286 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from 
k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initConta
inerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-99b2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:42Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.932707 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"63edef29-013b-4e23-bd29-1e62958c425a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652942c0482deea28d2e07207dcdf38e3f5638a0eea5bb58b73671e39410b99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89d70ad241c203849bc1b2aa38409e9d5a16e13831e18a4f06bb2e5012b84910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a24ab72e4a25f2b29927eeb2552b6b091c3384a85ecedea7a4de9df19253ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9768dcc00aee095460ac27438e2ee85e824a9b2
1eee8c2fa137cd7de5de792ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55affbe6d7d089b636f77381908573fe22fb7a903776e26db373f50523280cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:42Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.945101 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tx5m2\" (UniqueName: \"kubernetes.io/projected/8526746c-450b-4df8-8ea1-f0cbabd13894-kube-api-access-tx5m2\") pod \"ovnkube-control-plane-749d76644c-htkbf\" (UID: \"8526746c-450b-4df8-8ea1-f0cbabd13894\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-htkbf" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.945161 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8526746c-450b-4df8-8ea1-f0cbabd13894-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-htkbf\" (UID: \"8526746c-450b-4df8-8ea1-f0cbabd13894\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-htkbf" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.945194 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8526746c-450b-4df8-8ea1-f0cbabd13894-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-htkbf\" (UID: \"8526746c-450b-4df8-8ea1-f0cbabd13894\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-htkbf" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.945229 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8526746c-450b-4df8-8ea1-f0cbabd13894-env-overrides\") pod \"ovnkube-control-plane-749d76644c-htkbf\" (UID: \"8526746c-450b-4df8-8ea1-f0cbabd13894\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-htkbf" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.946153 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8526746c-450b-4df8-8ea1-f0cbabd13894-env-overrides\") pod \"ovnkube-control-plane-749d76644c-htkbf\" (UID: 
\"8526746c-450b-4df8-8ea1-f0cbabd13894\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-htkbf" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.946329 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8526746c-450b-4df8-8ea1-f0cbabd13894-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-htkbf\" (UID: \"8526746c-450b-4df8-8ea1-f0cbabd13894\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-htkbf" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.947144 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.947183 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.947223 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.947245 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.947259 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:42Z","lastTransitionTime":"2026-02-20T09:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.953261 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8526746c-450b-4df8-8ea1-f0cbabd13894-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-htkbf\" (UID: \"8526746c-450b-4df8-8ea1-f0cbabd13894\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-htkbf" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.956340 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"deab583c-05c7-4b7e-a3f6-c01081b17127\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3256d5e9aadc1c300c0388ad5b9682b874dd957f731bde6642c649b044b16ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"amed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771581313\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771581313\\\\\\\\\\\\\\\" (2026-02-20 08:55:12 +0000 UTC to 2027-02-20 08:55:12 +0000 UTC (now=2026-02-20 09:55:28.067421835 +0000 UTC))\\\\\\\"\\\\nI0220 09:55:28.067484 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0220 09:55:28.067516 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0220 09:55:28.067549 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067613 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067655 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-161790428/tls.crt::/tmp/serving-cert-161790428/tls.key\\\\\\\"\\\\nI0220 09:55:28.067781 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0220 09:55:28.067827 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067825 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067865 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0220 09:55:28.067840 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0220 09:55:28.067995 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0220 09:55:28.068006 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0220 09:55:28.068582 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:42Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.967761 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tx5m2\" (UniqueName: \"kubernetes.io/projected/8526746c-450b-4df8-8ea1-f0cbabd13894-kube-api-access-tx5m2\") pod \"ovnkube-control-plane-749d76644c-htkbf\" (UID: \"8526746c-450b-4df8-8ea1-f0cbabd13894\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-htkbf" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.970554 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0820106bbb2993145f688614128fa51c5778e9953ee5fc00edba04ff06f3f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:42Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:42 crc kubenswrapper[4962]: I0220 09:55:42.988437 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:42Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.008344 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7065599d0f9481d5becd2057dc2e557e5fb6f4ef22533c1e431708ee711c5aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:43Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.022390 4962 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-htkbf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8526746c-450b-4df8-8ea1-f0cbabd13894\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx5m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx5m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-htkbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:43Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.039969 4962 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:43Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.051402 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.051477 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.051500 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.051933 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.052190 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:43Z","lastTransitionTime":"2026-02-20T09:55:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.059763 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s8xxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a431054f-57c5-41b7-93b2-2d2fbf9949ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15f0d587b054c3050286b2607687a4066cdd413e1623326cf355f9d93b10a2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9fz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s8xxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:43Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.073116 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-11 05:28:54.14387113 +0000 UTC Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.077407 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hxb97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0e53ce-e004-473e-be85-ef4c83e399c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e690a224056186e38fa1d7f83d5380f36a05bfff839ca53a87701a721c0c7c0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c27br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hxb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:43Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.098091 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de460dbc-9080-4eb3-b509-c43e81162de4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5051fa1280c3183976930435fb2b839acf93914df2c1e93cb28ae17755d534ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f4690722fa343ac9bbe83f1591ce49040d2b6cbe966fc31757944ba3befbbce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc5a98df99b3ac300355de8efa6cafb2182a2460152990a89b4125a7aef7b850\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9264e0c637c9449882f25caf523f00112c372fad6125a2b4b1b541a515a897dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:43Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.120019 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:43Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.142070 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef72d73c-d177-4436-b681-83866e1f6d12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfad5fd92783e0af12f28bd81ccc67f1cf757d57723d98f8fea4f02dc0fea8b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"
system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7hj8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:43Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.147290 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-htkbf" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.155473 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.155520 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.155534 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.155555 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.155575 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:43Z","lastTransitionTime":"2026-02-20T09:55:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.161685 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"751d5e0b-919c-4777-8475-ed7214f7647f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6a54fc4a37387541acda22f8ca81ddfe54e7a9ea682a31abf5e1ba2787f2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dffe20e45069ebd85e9ad49e365ac5180c1e43d02340190615a38900f527e432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m9d46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:43Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:43 crc kubenswrapper[4962]: W0220 09:55:43.168710 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8526746c_450b_4df8_8ea1_f0cbabd13894.slice/crio-1f73fe569d4cc78f08fa0abc2c32bc37e40f133cd0b63236d631808fa9e455ae WatchSource:0}: Error finding container 1f73fe569d4cc78f08fa0abc2c32bc37e40f133cd0b63236d631808fa9e455ae: Status 404 returned error can't find the container with id 1f73fe569d4cc78f08fa0abc2c32bc37e40f133cd0b63236d631808fa9e455ae Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.258661 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.258722 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.258738 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.258762 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.258776 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:43Z","lastTransitionTime":"2026-02-20T09:55:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.362102 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.362169 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.362197 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.362230 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.362253 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:43Z","lastTransitionTime":"2026-02-20T09:55:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.419166 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-htkbf" event={"ID":"8526746c-450b-4df8-8ea1-f0cbabd13894","Type":"ContainerStarted","Data":"1f73fe569d4cc78f08fa0abc2c32bc37e40f133cd0b63236d631808fa9e455ae"} Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.422133 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-99b2s_2abd2b70-bb78-49a0-b930-cd066384e803/ovnkube-controller/1.log" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.423171 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-99b2s_2abd2b70-bb78-49a0-b930-cd066384e803/ovnkube-controller/0.log" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.428483 4962 generic.go:334] "Generic (PLEG): container finished" podID="2abd2b70-bb78-49a0-b930-cd066384e803" containerID="cb7d5fb3e51f950f700ffdbd70535728803406441447065be85ac898397761b1" exitCode=1 Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.428543 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" event={"ID":"2abd2b70-bb78-49a0-b930-cd066384e803","Type":"ContainerDied","Data":"cb7d5fb3e51f950f700ffdbd70535728803406441447065be85ac898397761b1"} Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.428648 4962 scope.go:117] "RemoveContainer" containerID="1321cb168512a793cb7cee1cd5a9e56cc5d428f8156a690cb6144a8cd78f9b9f" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.429541 4962 scope.go:117] "RemoveContainer" containerID="cb7d5fb3e51f950f700ffdbd70535728803406441447065be85ac898397761b1" Feb 20 09:55:43 crc kubenswrapper[4962]: E0220 09:55:43.429768 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-99b2s_openshift-ovn-kubernetes(2abd2b70-bb78-49a0-b930-cd066384e803)\"" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.452124 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:43Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.465935 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.466003 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.466024 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.466054 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.466075 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:43Z","lastTransitionTime":"2026-02-20T09:55:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.468785 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7065599d0f9481d5becd2057dc2e557e5fb6f4ef22533c1e431708ee711c5aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:43Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.488145 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://650ed0852746d520fbaa4ec277717c5af3916ba72fb82349af01bdd53ed1916b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5aaa8582764286e5a930fbc4754f253b044e997b856840685dde84db1aef3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:43Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.505789 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wqwgj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1957ac70-30f9-48c2-a82b-72aa3b7a883a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4f1f32eda1cc801a5b1f6d84207120c48c1bcca494a88bdfa95fb05bf82f661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwxzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wqwgj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:43Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.525663 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2abd2b70-bb78-49a0-b930-cd066384e803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cb86
4fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb7d5fb3e51f950f700ffdbd70535728803406441447065be85ac898397761b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1321cb168512a793cb7cee1cd5a9e56cc5d428f8156a690cb6144a8cd78f9b9f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T09:55:40Z\\\",\\\"message\\\":\\\"55:40.017718 6286 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0220 09:55:40.017554 6286 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0220 09:55:40.017800 6286 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:55:40.017620 6286 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0220 09:55:40.018093 6286 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:55:40.018544 6286 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:55:40.018615 6286 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:55:40.018793 6286 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:55:40.019127 6286 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from 
k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb7d5fb3e51f950f700ffdbd70535728803406441447065be85ac898397761b1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T09:55:42Z\\\",\\\"message\\\":\\\"ame:\\\\\\\"Service_openshift-kube-scheduler/scheduler_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-scheduler/scheduler\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.169\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0220 09:55:42.410270 6421 services_controller.go:360] Finished syncing service kubernetes on namespace default for network=default : 2.491089ms\\\\nI0220 09:55:42.410229 6421 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0220 09:55:42.410232 6421 services_controller.go:356] Processing sync for service openshift-machine-api/machine-api-operator-webhook for network=default\\\\nF0220 09:55:42.410409 6421 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added 
to\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20
99482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-99b2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:43Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.550967 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63edef29-013b-4e23-bd29-1e62958c425a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652942c0482deea28d2e07207dcdf38e3f5638a0eea5bb58b73671e39410b99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89d70ad241c203849bc1b2aa38409e9d5a16e13831e18a4f06bb2e5012b84910\\\",\\
\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a24ab72e4a25f2b29927eeb2552b6b091c3384a85ecedea7a4de9df19253ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9768dcc00aee095460ac27438e2ee85e824a9b21eee8c2fa137cd7de5de792ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55affbe6d7d089b636f77381908573fe22fb7a903776e26db373f50523280cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:43Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.568526 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.568730 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.568831 4962 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.568922 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.569003 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:43Z","lastTransitionTime":"2026-02-20T09:55:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.573652 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deab583c-05c7-4b7e-a3f6-c01081b17127\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe7253273ecf
6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3256d5e9aadc1c300c0388ad5b9682b874dd957f731bde6642c649b044b16ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"amed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771581313\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771581313\\\\\\\\\\\\\\\" (2026-02-20 08:55:12 +0000 UTC to 2027-02-20 08:55:12 +0000 UTC (now=2026-02-20 09:55:28.067421835 +0000 UTC))\\\\\\\"\\\\nI0220 09:55:28.067484 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0220 09:55:28.067516 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0220 09:55:28.067549 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067613 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067655 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-161790428/tls.crt::/tmp/serving-cert-161790428/tls.key\\\\\\\"\\\\nI0220 09:55:28.067781 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0220 09:55:28.067827 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067825 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067865 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0220 09:55:28.067840 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0220 09:55:28.067995 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0220 09:55:28.068006 1 envvar.go:172] \\\\\\\"Feature gate 
default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0220 09:55:28.068582 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:43Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.595666 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0820106bbb2993145f688614128fa51c5778e9953ee5fc00edba04ff06f3f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:43Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.618012 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-htkbf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8526746c-450b-4df8-8ea1-f0cbabd13894\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx5m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx5m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-htkbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:43Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.641407 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:43Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.660301 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s8xxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a431054f-57c5-41b7-93b2-2d2fbf9949ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15f0d587b054c3050286b2607687a4066cdd413e1623326cf355f9d93b10a2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9fz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.
11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s8xxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:43Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.673190 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.673286 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.673311 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.673828 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.674066 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:43Z","lastTransitionTime":"2026-02-20T09:55:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.676195 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hxb97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0e53ce-e004-473e-be85-ef4c83e399c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e690a224056186e38fa1d7f83d5380f36a05bfff839ca53a87701a721c0c7c0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c27br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hxb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:43Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.704506 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef72d73c-d177-4436-b681-83866e1f6d12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfad5fd92783e0af12f28bd81ccc67f1cf757d57723d98f8fea4f02dc0fea8b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7hj8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:43Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.722269 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"751d5e0b-919c-4777-8475-ed7214f7647f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6a54fc4a37387541acda22f8ca81ddfe54e7a9ea682a31abf5e1ba2787f2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dffe20e45069ebd85e9ad49e365ac5180c1e43d02340190615a38900f527e432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m9d46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:43Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.744980 4962 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de460dbc-9080-4eb3-b509-c43e81162de4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5051fa1280c3183976930435fb2b839acf93914df2c1e93cb28ae17755d534ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f4690722fa343ac9bbe83f1591ce49040d2b6cbe966fc31757944ba3befbbce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc5a98df99b3ac300355de8efa6cafb2182a2460152990a89b4125a7aef7b850\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9264e0c637c9449882f25caf523
f00112c372fad6125a2b4b1b541a515a897dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:43Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.767841 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:43Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.776890 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.776954 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.776973 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.777005 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.777027 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:43Z","lastTransitionTime":"2026-02-20T09:55:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.880055 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.880091 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.880103 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.880123 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.880134 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:43Z","lastTransitionTime":"2026-02-20T09:55:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.948742 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-5bwk2"] Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.949286 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:55:43 crc kubenswrapper[4962]: E0220 09:55:43.949366 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.956523 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:55:43 crc kubenswrapper[4962]: E0220 09:55:43.956812 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:55:59.956764008 +0000 UTC m=+51.539235894 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.956911 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.957031 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjn55\" (UniqueName: \"kubernetes.io/projected/d590527b-ed56-4fb4-a712-b09781618a76-kube-api-access-jjn55\") pod \"network-metrics-daemon-5bwk2\" (UID: \"d590527b-ed56-4fb4-a712-b09781618a76\") " pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:55:43 crc kubenswrapper[4962]: E0220 09:55:43.957082 4962 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 20 09:55:43 crc kubenswrapper[4962]: E0220 09:55:43.957196 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-02-20 09:55:59.95716031 +0000 UTC m=+51.539632186 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.957086 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:55:43 crc kubenswrapper[4962]: E0220 09:55:43.957311 4962 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.957433 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:55:43 crc kubenswrapper[4962]: E0220 09:55:43.957529 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 20 09:55:43 crc kubenswrapper[4962]: E0220 09:55:43.957539 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-20 09:55:59.957497831 +0000 UTC m=+51.539969867 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 20 09:55:43 crc kubenswrapper[4962]: E0220 09:55:43.957557 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 20 09:55:43 crc kubenswrapper[4962]: E0220 09:55:43.957583 4962 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.957666 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d590527b-ed56-4fb4-a712-b09781618a76-metrics-certs\") pod \"network-metrics-daemon-5bwk2\" (UID: \"d590527b-ed56-4fb4-a712-b09781618a76\") " pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:55:43 crc kubenswrapper[4962]: E0220 09:55:43.957735 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-20 09:55:59.957714468 +0000 UTC m=+51.540186324 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.957812 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:55:43 crc kubenswrapper[4962]: E0220 09:55:43.958011 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 20 09:55:43 crc kubenswrapper[4962]: E0220 09:55:43.958037 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 20 09:55:43 crc kubenswrapper[4962]: E0220 09:55:43.958053 4962 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 09:55:43 crc kubenswrapper[4962]: E0220 09:55:43.958151 4962 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-20 09:55:59.95809943 +0000 UTC m=+51.540571296 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.964319 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s8xxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a431054f-57c5-41b7-93b2-2d2fbf9949ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15f0d587b054c3050286b2607687a4066cdd413e1623326cf355f9d93b10a2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9fz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s8xxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:43Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.983743 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.983790 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.983805 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.983830 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.983847 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:43Z","lastTransitionTime":"2026-02-20T09:55:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:43 crc kubenswrapper[4962]: I0220 09:55:43.985419 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hxb97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0e53ce-e004-473e-be85-ef4c83e399c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e690a224056186e38fa1d7f83d5380f36a05bfff839ca53a87701a721c0c7c0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c27br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hxb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:43Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.019985 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:44Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.038501 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:44Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.058497 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d590527b-ed56-4fb4-a712-b09781618a76-metrics-certs\") pod \"network-metrics-daemon-5bwk2\" (UID: \"d590527b-ed56-4fb4-a712-b09781618a76\") " pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.058583 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjn55\" (UniqueName: \"kubernetes.io/projected/d590527b-ed56-4fb4-a712-b09781618a76-kube-api-access-jjn55\") pod \"network-metrics-daemon-5bwk2\" (UID: \"d590527b-ed56-4fb4-a712-b09781618a76\") " pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:55:44 crc kubenswrapper[4962]: E0220 09:55:44.058745 4962 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 20 09:55:44 crc kubenswrapper[4962]: E0220 09:55:44.058851 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d590527b-ed56-4fb4-a712-b09781618a76-metrics-certs podName:d590527b-ed56-4fb4-a712-b09781618a76 nodeName:}" failed. No retries permitted until 2026-02-20 09:55:44.558824749 +0000 UTC m=+36.141296595 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d590527b-ed56-4fb4-a712-b09781618a76-metrics-certs") pod "network-metrics-daemon-5bwk2" (UID: "d590527b-ed56-4fb4-a712-b09781618a76") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.064571 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef72d73c-d177-4436-b681-83866e1f6d12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfad5fd92783e0af12f28bd81ccc67f1cf757d57723d98f8fea4f02dc0fea8b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211
\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7hj8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:44Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.074059 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 05:26:37.704482081 +0000 UTC Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.077773 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjn55\" (UniqueName: \"kubernetes.io/projected/d590527b-ed56-4fb4-a712-b09781618a76-kube-api-access-jjn55\") pod \"network-metrics-daemon-5bwk2\" (UID: \"d590527b-ed56-4fb4-a712-b09781618a76\") " pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.084562 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"751d5e0b-919c-4777-8475-ed7214f7647f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6a54fc4a37387541acda22f8ca81ddfe54e7a9ea682a31abf5e1ba2787f2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dffe20e45069ebd85e9ad49e365ac5180c1e43d02340190615a38900f527e432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\
\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m9d46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:44Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.086509 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.086564 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.086578 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.086621 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.086638 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:44Z","lastTransitionTime":"2026-02-20T09:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.104393 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de460dbc-9080-4eb3-b509-c43e81162de4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5051fa1280c3183976930435fb2b839acf93914df2c1e93cb28ae17755d534ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f4690722fa343ac9bbe83f1591ce49040d2b6cbe966fc31757944ba3befbbce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc5a98df99b3ac300355de8efa6cafb2182a2460152990a89b4125a7aef7b850\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9264e0c637c9449882f25caf523f00112c372fad6125a2b4b1b541a515a897dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:44Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.117046 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5bwk2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d590527b-ed56-4fb4-a712-b09781618a76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjn55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjn55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:43Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5bwk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:44Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.133763 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"deab583c-05c7-4b7e-a3f6-c01081b17127\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3256d5e9aadc1c300c0388ad5b9682b874dd957f731bde6642c649b044b16ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"amed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771581313\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771581313\\\\\\\\\\\\\\\" (2026-02-20 08:55:12 +0000 UTC to 2027-02-20 08:55:12 +0000 UTC (now=2026-02-20 09:55:28.067421835 +0000 UTC))\\\\\\\"\\\\nI0220 09:55:28.067484 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0220 09:55:28.067516 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0220 09:55:28.067549 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067613 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067655 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-161790428/tls.crt::/tmp/serving-cert-161790428/tls.key\\\\\\\"\\\\nI0220 09:55:28.067781 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0220 09:55:28.067827 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067825 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067865 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0220 09:55:28.067840 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0220 09:55:28.067995 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0220 09:55:28.068006 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0220 09:55:28.068582 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:44Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.138361 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:55:44 crc kubenswrapper[4962]: E0220 09:55:44.138556 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.138397 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:55:44 crc kubenswrapper[4962]: E0220 09:55:44.138699 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.138373 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:55:44 crc kubenswrapper[4962]: E0220 09:55:44.138780 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.150230 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0820106bbb2993145f688614128fa51c5778e9953ee5fc00edba04ff06f3f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-02-20T09:55:44Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.164687 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:44Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.177629 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7065599d0f9481d5becd2057dc2e557e5fb6f4ef22533c1e431708ee711c5aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:44Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.189020 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.189076 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.189090 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.189109 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.189122 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:44Z","lastTransitionTime":"2026-02-20T09:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.193922 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://650ed0852746d520fbaa4ec277717c5af3916ba72fb82349af01bdd53ed1916b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5aaa8582764286e5a930fbc4754f253b044e997b856840685dde84db1aef3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:44Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.207311 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wqwgj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1957ac70-30f9-48c2-a82b-72aa3b7a883a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4f1f32eda1cc801a5b1f6d84207120c48c1bcca494a88bdfa95fb05bf82f661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwxzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wqwgj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:44Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.226065 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2abd2b70-bb78-49a0-b930-cd066384e803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cb86
4fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb7d5fb3e51f950f700ffdbd70535728803406441447065be85ac898397761b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1321cb168512a793cb7cee1cd5a9e56cc5d428f8156a690cb6144a8cd78f9b9f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T09:55:40Z\\\",\\\"message\\\":\\\"55:40.017718 6286 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0220 09:55:40.017554 6286 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0220 09:55:40.017800 6286 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:55:40.017620 6286 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0220 09:55:40.018093 6286 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:55:40.018544 6286 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:55:40.018615 6286 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:55:40.018793 6286 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:55:40.019127 6286 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from 
k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb7d5fb3e51f950f700ffdbd70535728803406441447065be85ac898397761b1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T09:55:42Z\\\",\\\"message\\\":\\\"ame:\\\\\\\"Service_openshift-kube-scheduler/scheduler_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-scheduler/scheduler\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.169\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0220 09:55:42.410270 6421 services_controller.go:360] Finished syncing service kubernetes on namespace default for network=default : 2.491089ms\\\\nI0220 09:55:42.410229 6421 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0220 09:55:42.410232 6421 services_controller.go:356] Processing sync for service openshift-machine-api/machine-api-operator-webhook for network=default\\\\nF0220 09:55:42.410409 6421 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added 
to\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20
99482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-99b2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:44Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.246936 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63edef29-013b-4e23-bd29-1e62958c425a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652942c0482deea28d2e07207dcdf38e3f5638a0eea5bb58b73671e39410b99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89d70ad241c203849bc1b2aa38409e9d5a16e13831e18a4f06bb2e5012b84910\\\",\\
\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a24ab72e4a25f2b29927eeb2552b6b091c3384a85ecedea7a4de9df19253ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9768dcc00aee095460ac27438e2ee85e824a9b21eee8c2fa137cd7de5de792ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55affbe6d7d089b636f77381908573fe22fb7a903776e26db373f50523280cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:44Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.259241 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-htkbf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8526746c-450b-4df8-8ea1-f0cbabd13894\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx5m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx5m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-htkbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:44Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.292910 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:44 crc 
kubenswrapper[4962]: I0220 09:55:44.292956 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.292968 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.292986 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.292997 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:44Z","lastTransitionTime":"2026-02-20T09:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.396096 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.396163 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.396187 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.396223 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.396249 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:44Z","lastTransitionTime":"2026-02-20T09:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.436992 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-99b2s_2abd2b70-bb78-49a0-b930-cd066384e803/ovnkube-controller/1.log" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.444877 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-htkbf" event={"ID":"8526746c-450b-4df8-8ea1-f0cbabd13894","Type":"ContainerStarted","Data":"84d5caed5e5e9e66911552cd6f1b7482cc842f5fc1b59863a208fe32ea87303d"} Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.444960 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-htkbf" event={"ID":"8526746c-450b-4df8-8ea1-f0cbabd13894","Type":"ContainerStarted","Data":"c50de88e0429b7ecb3939db90dc49ca006cd7d071d9cc97beb31ca64028b9f00"} Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.469671 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:44Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.492988 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef72d73c-d177-4436-b681-83866e1f6d12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfad5fd92783e0af12f28bd81ccc67f1cf757d57723d98f8fea4f02dc0fea8b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"
system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7hj8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:44Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.498727 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.498790 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.498799 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.498817 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.498827 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:44Z","lastTransitionTime":"2026-02-20T09:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.511886 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"751d5e0b-919c-4777-8475-ed7214f7647f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6a54fc4a37387541acda22f8ca81ddfe54e7a9ea682a31abf5e1ba2787f2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dffe20e45069ebd85e9ad49e365ac5180c1e43d02340190615a38900f527e432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m9d46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:44Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.534181 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de460dbc-9080-4eb3-b509-c43e81162de4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5051fa1280c3183976930435fb2b839acf93914df2c1e93cb28ae17755d534ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f4690722fa343ac9bbe83f1591ce49040d2b6cbe966fc31757944ba3befbbce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc5a98df99b3ac300355de8efa6cafb2182a2460152990a89b4125a7aef7b850\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026
-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9264e0c637c9449882f25caf523f00112c372fad6125a2b4b1b541a515a897dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:44Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.552657 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5bwk2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d590527b-ed56-4fb4-a712-b09781618a76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjn55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjn55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:43Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5bwk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:44Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.563750 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d590527b-ed56-4fb4-a712-b09781618a76-metrics-certs\") pod \"network-metrics-daemon-5bwk2\" (UID: \"d590527b-ed56-4fb4-a712-b09781618a76\") " pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:55:44 crc kubenswrapper[4962]: E0220 09:55:44.564060 4962 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 20 09:55:44 crc kubenswrapper[4962]: E0220 09:55:44.564241 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d590527b-ed56-4fb4-a712-b09781618a76-metrics-certs podName:d590527b-ed56-4fb4-a712-b09781618a76 nodeName:}" failed. No retries permitted until 2026-02-20 09:55:45.564204846 +0000 UTC m=+37.146676732 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d590527b-ed56-4fb4-a712-b09781618a76-metrics-certs") pod "network-metrics-daemon-5bwk2" (UID: "d590527b-ed56-4fb4-a712-b09781618a76") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.579878 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0820106bbb2993145f688614128fa51c5778e9953ee5fc00edba04ff06f3f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:44Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.587205 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.587268 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.587279 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.587297 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.587309 4962 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:44Z","lastTransitionTime":"2026-02-20T09:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:44 crc kubenswrapper[4962]: E0220 09:55:44.606849 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3742e1d0-bedb-4f62-b44c-22d7df4a090f\\\",\\\"systemUUID\\\":\\\"0de0f937-0896-4e78-90b7-d2d7a1bed2a9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:44Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.607446 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:44Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.611757 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.611816 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.611834 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.611862 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.611883 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:44Z","lastTransitionTime":"2026-02-20T09:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.625378 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7065599d0f9481d5becd2057dc2e557e5fb6f4ef22533c1e431708ee711c5aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:44Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:44 crc kubenswrapper[4962]: E0220 09:55:44.632861 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeByt
es\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3742e1d0-bedb-4f62-b44c-22d7df4a090f\\\",\\\"systemUUID\\\":\\\"0
de0f937-0896-4e78-90b7-d2d7a1bed2a9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:44Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.637654 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.637705 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.637724 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.637751 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.637771 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:44Z","lastTransitionTime":"2026-02-20T09:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.641653 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://650ed0852746d520fbaa4ec277717c5af3916ba72fb82349af01bdd53ed1916b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5aaa8582764286e5a930fbc4754f253b044e997b856840685dde84db1aef3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:44Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.661254 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wqwgj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1957ac70-30f9-48c2-a82b-72aa3b7a883a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4f1f32eda1cc801a5b1f6d84207120c48c1bcca494a88bdfa95fb05bf82f661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwxzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wqwgj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:44Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:44 crc kubenswrapper[4962]: E0220 09:55:44.671434 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3742e1d0-bedb-4f62-b44c-22d7df4a090f\\\",\\\"systemUUID\\\":\\\"0de0f937-0896-4e78-90b7-d2d7a1bed2a9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:44Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.676861 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.676911 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.676925 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.676950 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.676964 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:44Z","lastTransitionTime":"2026-02-20T09:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:44 crc kubenswrapper[4962]: E0220 09:55:44.692422 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3742e1d0-bedb-4f62-b44c-22d7df4a090f\\\",\\\"systemUUID\\\":\\\"0de0f937-0896-4e78-90b7-d2d7a1bed2a9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:44Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.703176 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.703221 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.703230 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.703249 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.703259 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:44Z","lastTransitionTime":"2026-02-20T09:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.705584 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2abd2b70-bb78-49a0-b930-cd066384e803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb7d5fb3e51f950f700ffdbd7053572880340644
1447065be85ac898397761b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1321cb168512a793cb7cee1cd5a9e56cc5d428f8156a690cb6144a8cd78f9b9f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T09:55:40Z\\\",\\\"message\\\":\\\"55:40.017718 6286 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0220 09:55:40.017554 6286 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0220 09:55:40.017800 6286 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:55:40.017620 6286 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0220 09:55:40.018093 6286 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:55:40.018544 6286 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:55:40.018615 6286 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:55:40.018793 6286 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:55:40.019127 6286 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb7d5fb3e51f950f700ffdbd70535728803406441447065be85ac898397761b1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T09:55:42Z\\\",\\\"message\\\":\\\"ame:\\\\\\\"Service_openshift-kube-scheduler/scheduler_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-scheduler/scheduler\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.169\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0220 09:55:42.410270 6421 services_controller.go:360] Finished syncing service kubernetes on namespace default for network=default : 2.491089ms\\\\nI0220 09:55:42.410229 6421 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0220 09:55:42.410232 6421 services_controller.go:356] Processing sync for service openshift-machine-api/machine-api-operator-webhook for network=default\\\\nF0220 09:55:42.410409 6421 ovnkube.go:137] failed to run ovnkube: 
[failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-99b2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:44Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:44 crc kubenswrapper[4962]: E0220 09:55:44.719446 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3742e1d0-bedb-4f62-b44c-22d7df4a090f\\\",\\\"systemUUID\\\":\\\"0de0f937-0896-4e78-90b7-d2d7a1bed2a9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:44Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:44 crc kubenswrapper[4962]: E0220 09:55:44.719632 4962 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.721899 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.721931 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.721940 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.721957 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.721969 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:44Z","lastTransitionTime":"2026-02-20T09:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.729054 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63edef29-013b-4e23-bd29-1e62958c425a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652942c0482deea28d2e07207dcdf38e3f5638a0eea5bb58b73671e39410b99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89d70ad241c203849bc1b2aa38409e9d5a16e13831e18a4f06bb2e5012b84910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"sta
rted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a24ab72e4a25f2b29927eeb2552b6b091c3384a85ecedea7a4de9df19253ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9768dcc00aee095460ac27438e2ee85e824a9b21eee8c2fa137cd7de5de792ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55affbe6d7d089b636f77381908573fe22fb7a903776e26db373f50523280cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":
\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:44Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.748803 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"deab583c-05c7-4b7e-a3f6-c01081b17127\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3256d5e9aadc1c300c0388ad5b9682b874dd957f731bde6642c649b044b16ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"amed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771581313\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771581313\\\\\\\\\\\\\\\" (2026-02-20 08:55:12 +0000 UTC to 2027-02-20 08:55:12 +0000 UTC (now=2026-02-20 09:55:28.067421835 +0000 UTC))\\\\\\\"\\\\nI0220 09:55:28.067484 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0220 09:55:28.067516 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0220 09:55:28.067549 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067613 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067655 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-161790428/tls.crt::/tmp/serving-cert-161790428/tls.key\\\\\\\"\\\\nI0220 09:55:28.067781 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0220 09:55:28.067827 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067825 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067865 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0220 09:55:28.067840 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0220 09:55:28.067995 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0220 09:55:28.068006 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0220 09:55:28.068582 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:44Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.762442 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-htkbf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8526746c-450b-4df8-8ea1-f0cbabd13894\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c50de88e0429b7ecb3939db90dc49ca006cd7d071d9cc97beb31ca64028b9f00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx5m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d5caed5e5e9e66911552cd6f1b7482cc842f5fc1b59863a208fe32ea87303d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx5m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-htkbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:44Z is after 2025-08-24T17:21:41Z" Feb 20 
09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.775129 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hxb97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0e53ce-e004-473e-be85-ef4c83e399c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e690a224056186e38fa1d7f83d5380f36a05bfff839ca53a87701a721c0c7c0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c27br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hxb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:44Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.793871 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:44Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.806366 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s8xxr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a431054f-57c5-41b7-93b2-2d2fbf9949ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15f0d587b054c3050286b2607687a4066cdd413e1623326cf355f9d93b10a2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9fz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s8xxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:44Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.824367 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.824404 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.824415 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.824436 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.824450 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:44Z","lastTransitionTime":"2026-02-20T09:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.927910 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.927945 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.927954 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.927969 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:44 crc kubenswrapper[4962]: I0220 09:55:44.927979 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:44Z","lastTransitionTime":"2026-02-20T09:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.030518 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.030557 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.030567 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.030583 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.030609 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:45Z","lastTransitionTime":"2026-02-20T09:55:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.074590 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 16:17:30.384275256 +0000 UTC Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.135118 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.135179 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.135198 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.135228 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.135250 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:45Z","lastTransitionTime":"2026-02-20T09:55:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.238505 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.238554 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.238567 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.238586 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.238640 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:45Z","lastTransitionTime":"2026-02-20T09:55:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.341570 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.341633 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.341649 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.341668 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.341678 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:45Z","lastTransitionTime":"2026-02-20T09:55:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.446378 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.446510 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.446530 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.447201 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.447343 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:45Z","lastTransitionTime":"2026-02-20T09:55:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.551725 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.552664 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.552690 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.552708 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.552718 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:45Z","lastTransitionTime":"2026-02-20T09:55:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.573652 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d590527b-ed56-4fb4-a712-b09781618a76-metrics-certs\") pod \"network-metrics-daemon-5bwk2\" (UID: \"d590527b-ed56-4fb4-a712-b09781618a76\") " pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:55:45 crc kubenswrapper[4962]: E0220 09:55:45.573900 4962 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 20 09:55:45 crc kubenswrapper[4962]: E0220 09:55:45.573976 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d590527b-ed56-4fb4-a712-b09781618a76-metrics-certs podName:d590527b-ed56-4fb4-a712-b09781618a76 nodeName:}" failed. No retries permitted until 2026-02-20 09:55:47.573953518 +0000 UTC m=+39.156425394 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d590527b-ed56-4fb4-a712-b09781618a76-metrics-certs") pod "network-metrics-daemon-5bwk2" (UID: "d590527b-ed56-4fb4-a712-b09781618a76") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.657154 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.657241 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.657261 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.657288 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.657309 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:45Z","lastTransitionTime":"2026-02-20T09:55:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.760569 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.760693 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.760721 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.760760 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.760778 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:45Z","lastTransitionTime":"2026-02-20T09:55:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.864562 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.864674 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.864702 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.864733 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.864754 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:45Z","lastTransitionTime":"2026-02-20T09:55:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.967128 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.967418 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.967506 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.967583 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:45 crc kubenswrapper[4962]: I0220 09:55:45.967680 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:45Z","lastTransitionTime":"2026-02-20T09:55:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.069550 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.069639 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.069657 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.069682 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.069699 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:46Z","lastTransitionTime":"2026-02-20T09:55:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.075703 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 14:41:32.421434868 +0000 UTC Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.137798 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.137815 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.137834 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.137923 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:55:46 crc kubenswrapper[4962]: E0220 09:55:46.138569 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:55:46 crc kubenswrapper[4962]: E0220 09:55:46.138390 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:55:46 crc kubenswrapper[4962]: E0220 09:55:46.138200 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:55:46 crc kubenswrapper[4962]: E0220 09:55:46.138883 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76" Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.178558 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.178658 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.178679 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.178710 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.178744 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:46Z","lastTransitionTime":"2026-02-20T09:55:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.282250 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.282338 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.282364 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.282405 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.282433 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:46Z","lastTransitionTime":"2026-02-20T09:55:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.386858 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.386932 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.386953 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.386982 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.387002 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:46Z","lastTransitionTime":"2026-02-20T09:55:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.490543 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.490919 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.491122 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.491288 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.491481 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:46Z","lastTransitionTime":"2026-02-20T09:55:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.595341 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.595410 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.595431 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.595458 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.595480 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:46Z","lastTransitionTime":"2026-02-20T09:55:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.699178 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.699243 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.699263 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.699291 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.699312 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:46Z","lastTransitionTime":"2026-02-20T09:55:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.802864 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.802959 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.802983 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.803026 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.803052 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:46Z","lastTransitionTime":"2026-02-20T09:55:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.907178 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.907259 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.907276 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.907309 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:46 crc kubenswrapper[4962]: I0220 09:55:46.907328 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:46Z","lastTransitionTime":"2026-02-20T09:55:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.011648 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.011723 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.011745 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.011775 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.011797 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:47Z","lastTransitionTime":"2026-02-20T09:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.076118 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 18:52:26.030689126 +0000 UTC Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.115170 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.115228 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.115246 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.115271 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.115291 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:47Z","lastTransitionTime":"2026-02-20T09:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.219290 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.219342 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.219362 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.219390 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.219409 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:47Z","lastTransitionTime":"2026-02-20T09:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.322355 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.322423 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.322440 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.322463 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.322483 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:47Z","lastTransitionTime":"2026-02-20T09:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.404556 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.406011 4962 scope.go:117] "RemoveContainer" containerID="cb7d5fb3e51f950f700ffdbd70535728803406441447065be85ac898397761b1" Feb 20 09:55:47 crc kubenswrapper[4962]: E0220 09:55:47.406299 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-99b2s_openshift-ovn-kubernetes(2abd2b70-bb78-49a0-b930-cd066384e803)\"" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.425731 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:47Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.425827 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.425885 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.425902 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.425929 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.425950 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:47Z","lastTransitionTime":"2026-02-20T09:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.439712 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s8xxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a431054f-57c5-41b7-93b2-2d2fbf9949ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15f0d587b054c3050286b2607687a4066cdd413e1623326cf355f9d93b10a2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9fz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s8xxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:47Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.458397 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hxb97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0e53ce-e004-473e-be85-ef4c83e399c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e690a224056186e38fa1d7f83d5380f36a05bfff839ca53a87701a721c0c7c0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c27br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hxb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:47Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.481363 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de460dbc-9080-4eb3-b509-c43e81162de4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5051fa1280c3183976930435fb2b839acf93914df2c1e93cb28ae17755d534ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f4690722fa343ac9bbe83f1591ce49040d2b6cbe966fc31757944ba3befbbce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc5a98df99b3ac300355de8efa6cafb2182a2460152990a89b4125a7aef7b850\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9264e0c637c9449882f25caf523f00112c372fad6125a2b4b1b541a515a897dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:47Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.503641 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:47Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.529121 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.529188 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.529206 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.529234 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.529074 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef72d73c-d177-4436-b681-83866e1f6d12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfad5fd92783e0af12f28bd81ccc67f1cf757d57723d98f8fea4f02dc0fea8b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7hj8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:47Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.529253 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:47Z","lastTransitionTime":"2026-02-20T09:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.545485 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"751d5e0b-919c-4777-8475-ed7214f7647f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6a54fc4a37387541acda22f8ca81ddfe54e7a9ea682a31abf5e1ba2787f2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dffe20e45069ebd85e9ad49e365ac5180c1e43d02340190615a38900f527e432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m9d46\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:47Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.563433 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5bwk2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d590527b-ed56-4fb4-a712-b09781618a76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjn55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjn55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:43Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5bwk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:47Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:47 crc 
kubenswrapper[4962]: I0220 09:55:47.600003 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63edef29-013b-4e23-bd29-1e62958c425a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652942c0482deea28d2e07207dcdf38e3f5638a0eea5bb58b73671e39410b99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89d70ad241c203849bc1b2aa38409e9d5a16e13831e18a4f06bb2e5012b84910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a24ab72e4a25f2b29927eeb2552b6b091c3384a85ecedea7a4de9df19253ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\
\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9768dcc00aee095460ac27438e2ee85e824a9b21eee8c2fa137cd7de5de792ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55affbe6d7d089b636f77381908573fe22fb7a903776e26db373f50523280cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:47Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.600259 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d590527b-ed56-4fb4-a712-b09781618a76-metrics-certs\") pod \"network-metrics-daemon-5bwk2\" (UID: \"d590527b-ed56-4fb4-a712-b09781618a76\") " pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:55:47 crc kubenswrapper[4962]: E0220 09:55:47.600464 4962 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 20 09:55:47 crc kubenswrapper[4962]: E0220 09:55:47.600566 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d590527b-ed56-4fb4-a712-b09781618a76-metrics-certs podName:d590527b-ed56-4fb4-a712-b09781618a76 nodeName:}" failed. No retries permitted until 2026-02-20 09:55:51.6005354 +0000 UTC m=+43.183007276 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d590527b-ed56-4fb4-a712-b09781618a76-metrics-certs") pod "network-metrics-daemon-5bwk2" (UID: "d590527b-ed56-4fb4-a712-b09781618a76") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.624895 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deab583c-05c7-4b7e-a3f6-c01081b17127\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3256d5e9aadc1c300c0388ad5b9682b874dd957f731bde6642c649b044b16ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"amed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771581313\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771581313\\\\\\\\\\\\\\\" (2026-02-20 08:55:12 +0000 UTC to 2027-02-20 08:55:12 +0000 UTC (now=2026-02-20 09:55:28.067421835 +0000 UTC))\\\\\\\"\\\\nI0220 09:55:28.067484 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0220 09:55:28.067516 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0220 09:55:28.067549 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067613 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067655 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-161790428/tls.crt::/tmp/serving-cert-161790428/tls.key\\\\\\\"\\\\nI0220 09:55:28.067781 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0220 09:55:28.067827 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067825 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067865 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0220 09:55:28.067840 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0220 09:55:28.067995 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0220 09:55:28.068006 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0220 09:55:28.068582 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:47Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.633023 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.633113 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.633140 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.633175 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.633206 4962 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:47Z","lastTransitionTime":"2026-02-20T09:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.650397 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0820106bbb2993145f688614128fa51c5778e9953ee5fc00edba04ff06f3f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:47Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.674333 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:47Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.694215 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7065599d0f9481d5becd2057dc2e557e5fb6f4ef22533c1e431708ee711c5aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:47Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.715436 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://650ed0852746d520fbaa4ec277717c5af3916ba72fb82349af01bdd53ed1916b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5aaa8582764286e5a930fbc4754f253b044e997b856840685dde84db1aef3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:47Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.736298 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.736358 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.736378 4962 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.736404 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.736423 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:47Z","lastTransitionTime":"2026-02-20T09:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.740684 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wqwgj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1957ac70-30f9-48c2-a82b-72aa3b7a883a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4f1f32eda1cc801a5b1f6d84207120c48c1bcca494a88bdfa95fb05bf82f661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"
},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwxzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wqwgj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:47Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.771952 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2abd2b70-bb78-49a0-b930-cd066384e803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb7d5fb3e51f950f700ffdbd7053572880340644
1447065be85ac898397761b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb7d5fb3e51f950f700ffdbd70535728803406441447065be85ac898397761b1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T09:55:42Z\\\",\\\"message\\\":\\\"ame:\\\\\\\"Service_openshift-kube-scheduler/scheduler_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-scheduler/scheduler\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.169\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0220 09:55:42.410270 6421 services_controller.go:360] Finished syncing service kubernetes on namespace default for network=default : 2.491089ms\\\\nI0220 09:55:42.410229 6421 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0220 09:55:42.410232 6421 services_controller.go:356] Processing sync for service openshift-machine-api/machine-api-operator-webhook for network=default\\\\nF0220 09:55:42.410409 6421 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-99b2s_openshift-ovn-kubernetes(2abd2b70-bb78-49a0-b930-cd066384e803)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-99b2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:47Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.784975 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-htkbf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8526746c-450b-4df8-8ea1-f0cbabd13894\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c50de88e0429b7ecb3939db90dc49ca006cd7d071d9cc97beb31ca64028b9f00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx5m2
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d5caed5e5e9e66911552cd6f1b7482cc842f5fc1b59863a208fe32ea87303d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx5m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-htkbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:47Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.840903 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.840967 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.840985 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.841037 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.841058 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:47Z","lastTransitionTime":"2026-02-20T09:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.943840 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.943888 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.943899 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.943918 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:47 crc kubenswrapper[4962]: I0220 09:55:47.943929 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:47Z","lastTransitionTime":"2026-02-20T09:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.046827 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.046880 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.046894 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.046918 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.046933 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:48Z","lastTransitionTime":"2026-02-20T09:55:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.077165 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 21:40:54.047399832 +0000 UTC Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.138791 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.138828 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:55:48 crc kubenswrapper[4962]: E0220 09:55:48.139036 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76" Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.139078 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.139385 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:55:48 crc kubenswrapper[4962]: E0220 09:55:48.139495 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:55:48 crc kubenswrapper[4962]: E0220 09:55:48.139370 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:55:48 crc kubenswrapper[4962]: E0220 09:55:48.139636 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.149561 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.149714 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.149796 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.149871 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.149960 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:48Z","lastTransitionTime":"2026-02-20T09:55:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.252662 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.252977 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.253136 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.253352 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.253496 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:48Z","lastTransitionTime":"2026-02-20T09:55:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.357558 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.357623 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.357632 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.357650 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.357661 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:48Z","lastTransitionTime":"2026-02-20T09:55:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.460469 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.460529 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.460544 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.460569 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.460581 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:48Z","lastTransitionTime":"2026-02-20T09:55:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.564120 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.564171 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.564182 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.564200 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.564213 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:48Z","lastTransitionTime":"2026-02-20T09:55:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.667407 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.667470 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.667488 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.667514 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.667536 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:48Z","lastTransitionTime":"2026-02-20T09:55:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.770808 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.770869 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.770888 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.770915 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.770936 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:48Z","lastTransitionTime":"2026-02-20T09:55:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.874113 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.874181 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.874208 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.874241 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.874264 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:48Z","lastTransitionTime":"2026-02-20T09:55:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.977249 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.977332 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.977352 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.977382 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:48 crc kubenswrapper[4962]: I0220 09:55:48.977406 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:48Z","lastTransitionTime":"2026-02-20T09:55:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.077923 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 14:42:31.265147247 +0000 UTC Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.080212 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.080279 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.080301 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.080329 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.080355 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:49Z","lastTransitionTime":"2026-02-20T09:55:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.164443 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deab583c-05c7-4b7e-a3f6-c01081b17127\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01\\\",\\\"image\\\":
\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3256d5e9aadc1c300c0388ad5b9682b874dd957f731bde6642c649b044b16ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"amed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771581313\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771581313\\\\\\\\\\\\\\\" (2026-02-20 08:55:12 +0000 UTC to 2027-02-20 08:55:12 +0000 UTC (now=2026-02-20 09:55:28.067421835 +0000 UTC))\\\\\\\"\\\\nI0220 09:55:28.067484 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0220 09:55:28.067516 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0220 09:55:28.067549 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067613 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067655 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-161790428/tls.crt::/tmp/serving-cert-161790428/tls.key\\\\\\\"\\\\nI0220 09:55:28.067781 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0220 09:55:28.067827 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067825 1 
configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067865 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0220 09:55:28.067840 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0220 09:55:28.067995 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0220 09:55:28.068006 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0220 09:55:28.068582 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:49Z is after 
2025-08-24T17:21:41Z" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.183306 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.183364 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.183379 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.183409 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.183428 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:49Z","lastTransitionTime":"2026-02-20T09:55:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.186873 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0820106bbb2993145f688614128fa51c5778e9953ee5fc00edba04ff06f3f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:49Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:49 
crc kubenswrapper[4962]: I0220 09:55:49.206649 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:49Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.224509 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7065599d0f9481d5becd2057dc2e557e5fb6f4ef22533c1e431708ee711c5aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:49Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.241343 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://650ed0852746d520fbaa4ec277717c5af3916ba72fb82349af01bdd53ed1916b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5aaa8582764286e5a930fbc4754f253b044e997b856840685dde84db1aef3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:49Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.259938 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wqwgj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1957ac70-30f9-48c2-a82b-72aa3b7a883a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4f1f32eda1cc801a5b1f6d84207120c48c1bcca494a88bdfa95fb05bf82f661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwxzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wqwgj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:49Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.286020 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.286065 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.286081 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.286105 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.286124 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:49Z","lastTransitionTime":"2026-02-20T09:55:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.292373 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2abd2b70-bb78-49a0-b930-cd066384e803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb7d5fb3e51f950f700ffdbd7053572880340644
1447065be85ac898397761b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb7d5fb3e51f950f700ffdbd70535728803406441447065be85ac898397761b1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T09:55:42Z\\\",\\\"message\\\":\\\"ame:\\\\\\\"Service_openshift-kube-scheduler/scheduler_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-scheduler/scheduler\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.169\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0220 09:55:42.410270 6421 services_controller.go:360] Finished syncing service kubernetes on namespace default for network=default : 2.491089ms\\\\nI0220 09:55:42.410229 6421 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0220 09:55:42.410232 6421 services_controller.go:356] Processing sync for service openshift-machine-api/machine-api-operator-webhook for network=default\\\\nF0220 09:55:42.410409 6421 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-99b2s_openshift-ovn-kubernetes(2abd2b70-bb78-49a0-b930-cd066384e803)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-99b2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:49Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.328376 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63edef29-013b-4e23-bd29-1e62958c425a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652942c0482deea28d2e07207dcdf38e3f5638a0eea5bb58b73671e39410b99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89d70ad241c203849bc1b2aa
38409e9d5a16e13831e18a4f06bb2e5012b84910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a24ab72e4a25f2b29927eeb2552b6b091c3384a85ecedea7a4de9df19253ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9768dcc00aee095460ac27438e2ee85e824a9b21eee8c2fa137cd7de5de792ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55affbe6d7d089b636f77381908573fe22fb7a903776e26db373f50523280cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:49Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.346215 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-htkbf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8526746c-450b-4df8-8ea1-f0cbabd13894\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c50de88e0429b7ecb3939db90dc49ca006cd7d071d9cc97beb31ca64028b9f00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx5m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d5caed5e5e9e66911552cd6f1b7482cc842f5fc1b59863a208fe32ea87303d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx5m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-htkbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:49Z is after 2025-08-24T17:21:41Z" Feb 20 
09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.361802 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s8xxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a431054f-57c5-41b7-93b2-2d2fbf9949ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15f0d587b054c3050286b2607687a4066cdd413e1623326cf355f9d93b10a2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9fz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s8xxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:49Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.374703 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hxb97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0e53ce-e004-473e-be85-ef4c83e399c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e690a224056186e38fa1d7f83d5380f36a05bfff839ca53a87701a721c0c7c0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c27br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hxb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:49Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.389183 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.389230 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.389239 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.389257 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.389270 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:49Z","lastTransitionTime":"2026-02-20T09:55:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.390152 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:49Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.402189 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:49Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.415856 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef72d73c-d177-4436-b681-83866e1f6d12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfad5fd92783e0af12f28bd81ccc67f1cf757d57723d98f8fea4f02dc0fea8b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"container
ID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMo
unts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true
,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7hj8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:49Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.428341 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"751d5e0b-919c-4777-8475-ed7214f7647f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6a54fc4a37387541acda22f8ca81ddfe54e7a9ea682a31abf5e1ba2787f2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dffe20e45069ebd85e9ad49e365ac5180c1e43d02340190615a38900f527e432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m9d46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:49Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.440389 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de460dbc-9080-4eb3-b509-c43e81162de4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5051fa1280c3183976930435fb2b839acf93914df2c1e93cb28ae17755d534ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f4690722fa343ac9bbe83f1591ce49040d2b6cbe966fc31757944ba3befbbce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc5a98df99b3ac300355de8efa6cafb2182a2460152990a89b4125a7aef7b850\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9264e0c637c9449882f25caf523f00112c372fad6125a2b4b1b541a515a897dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:49Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.454315 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5bwk2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d590527b-ed56-4fb4-a712-b09781618a76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjn55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjn55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:43Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5bwk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:49Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.491578 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.491656 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.491670 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.491691 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.491705 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:49Z","lastTransitionTime":"2026-02-20T09:55:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.595047 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.595108 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.595128 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.595157 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.595175 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:49Z","lastTransitionTime":"2026-02-20T09:55:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.700034 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.700102 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.700123 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.700153 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.700174 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:49Z","lastTransitionTime":"2026-02-20T09:55:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.802997 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.803075 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.803101 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.803137 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.803164 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:49Z","lastTransitionTime":"2026-02-20T09:55:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.906871 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.906940 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.906960 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.907000 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:49 crc kubenswrapper[4962]: I0220 09:55:49.907023 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:49Z","lastTransitionTime":"2026-02-20T09:55:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.011377 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.011439 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.011462 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.011490 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.011511 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:50Z","lastTransitionTime":"2026-02-20T09:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.078218 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 06:31:45.69149639 +0000 UTC Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.114008 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.114228 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.114289 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.114352 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.114413 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:50Z","lastTransitionTime":"2026-02-20T09:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.138721 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.138868 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.138781 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.138779 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:55:50 crc kubenswrapper[4962]: E0220 09:55:50.139141 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:55:50 crc kubenswrapper[4962]: E0220 09:55:50.139334 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:55:50 crc kubenswrapper[4962]: E0220 09:55:50.139485 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:55:50 crc kubenswrapper[4962]: E0220 09:55:50.139665 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76" Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.218008 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.218068 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.218085 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.218114 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.218135 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:50Z","lastTransitionTime":"2026-02-20T09:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.320068 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.320108 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.320119 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.320133 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.320141 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:50Z","lastTransitionTime":"2026-02-20T09:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.423210 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.423287 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.423300 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.423328 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.423342 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:50Z","lastTransitionTime":"2026-02-20T09:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.527046 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.527121 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.527145 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.527178 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.527200 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:50Z","lastTransitionTime":"2026-02-20T09:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.630673 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.630741 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.630761 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.630789 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.630807 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:50Z","lastTransitionTime":"2026-02-20T09:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.733561 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.733623 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.733635 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.733680 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.733695 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:50Z","lastTransitionTime":"2026-02-20T09:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.836899 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.836997 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.837018 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.837047 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.837067 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:50Z","lastTransitionTime":"2026-02-20T09:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.940666 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.940748 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.940767 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.940795 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:50 crc kubenswrapper[4962]: I0220 09:55:50.940816 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:50Z","lastTransitionTime":"2026-02-20T09:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.044193 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.044513 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.044634 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.044764 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.044857 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:51Z","lastTransitionTime":"2026-02-20T09:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.079534 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 18:13:35.979005054 +0000 UTC Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.147938 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.148119 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.148221 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.148324 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.148429 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:51Z","lastTransitionTime":"2026-02-20T09:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.251127 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.251195 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.251210 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.251232 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.251253 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:51Z","lastTransitionTime":"2026-02-20T09:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.354412 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.354909 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.354938 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.354969 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.355004 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:51Z","lastTransitionTime":"2026-02-20T09:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.459899 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.459968 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.459986 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.460016 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.460035 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:51Z","lastTransitionTime":"2026-02-20T09:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.564179 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.564253 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.564271 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.564301 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.564340 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:51Z","lastTransitionTime":"2026-02-20T09:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.649111 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d590527b-ed56-4fb4-a712-b09781618a76-metrics-certs\") pod \"network-metrics-daemon-5bwk2\" (UID: \"d590527b-ed56-4fb4-a712-b09781618a76\") " pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:55:51 crc kubenswrapper[4962]: E0220 09:55:51.649371 4962 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 20 09:55:51 crc kubenswrapper[4962]: E0220 09:55:51.649522 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d590527b-ed56-4fb4-a712-b09781618a76-metrics-certs podName:d590527b-ed56-4fb4-a712-b09781618a76 nodeName:}" failed. No retries permitted until 2026-02-20 09:55:59.649488121 +0000 UTC m=+51.231960007 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d590527b-ed56-4fb4-a712-b09781618a76-metrics-certs") pod "network-metrics-daemon-5bwk2" (UID: "d590527b-ed56-4fb4-a712-b09781618a76") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.668122 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.668241 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.668268 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.668303 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.668326 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:51Z","lastTransitionTime":"2026-02-20T09:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.771267 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.771338 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.771356 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.771383 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.771406 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:51Z","lastTransitionTime":"2026-02-20T09:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.875179 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.875272 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.875296 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.875332 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.875360 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:51Z","lastTransitionTime":"2026-02-20T09:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.979480 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.979545 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.979562 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.979587 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:51 crc kubenswrapper[4962]: I0220 09:55:51.979638 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:51Z","lastTransitionTime":"2026-02-20T09:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.080792 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 01:22:06.20025467 +0000 UTC Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.083202 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.083240 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.083252 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.083272 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.083287 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:52Z","lastTransitionTime":"2026-02-20T09:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.138464 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:55:52 crc kubenswrapper[4962]: E0220 09:55:52.138722 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.138757 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.138915 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:55:52 crc kubenswrapper[4962]: E0220 09:55:52.139121 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.139180 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:55:52 crc kubenswrapper[4962]: E0220 09:55:52.139238 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76" Feb 20 09:55:52 crc kubenswrapper[4962]: E0220 09:55:52.139301 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.186367 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.186424 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.186441 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.186479 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.186498 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:52Z","lastTransitionTime":"2026-02-20T09:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.290061 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.290125 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.290147 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.290171 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.290190 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:52Z","lastTransitionTime":"2026-02-20T09:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.394158 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.394493 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.394627 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.394734 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.394841 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:52Z","lastTransitionTime":"2026-02-20T09:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.497540 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.498104 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.498131 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.498155 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.498170 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:52Z","lastTransitionTime":"2026-02-20T09:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.601562 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.601660 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.601680 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.601706 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.601728 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:52Z","lastTransitionTime":"2026-02-20T09:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.705327 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.705389 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.705408 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.705427 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.705441 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:52Z","lastTransitionTime":"2026-02-20T09:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.808334 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.808473 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.808500 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.808536 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.808562 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:52Z","lastTransitionTime":"2026-02-20T09:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.915927 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.916654 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.916742 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.916788 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:52 crc kubenswrapper[4962]: I0220 09:55:52.916812 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:52Z","lastTransitionTime":"2026-02-20T09:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.020225 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.020311 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.020332 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.020363 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.020384 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:53Z","lastTransitionTime":"2026-02-20T09:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.081488 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 15:23:05.928269323 +0000 UTC Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.123388 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.123450 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.123470 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.123495 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.123512 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:53Z","lastTransitionTime":"2026-02-20T09:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.227417 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.227490 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.227505 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.227531 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.227551 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:53Z","lastTransitionTime":"2026-02-20T09:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.330497 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.330532 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.330542 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.330558 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.330568 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:53Z","lastTransitionTime":"2026-02-20T09:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.433357 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.433411 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.433427 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.433464 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.433515 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:53Z","lastTransitionTime":"2026-02-20T09:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.536953 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.537030 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.537053 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.537089 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.537113 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:53Z","lastTransitionTime":"2026-02-20T09:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.641447 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.641529 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.641549 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.641584 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.641649 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:53Z","lastTransitionTime":"2026-02-20T09:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.744689 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.744760 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.744779 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.744809 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.744830 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:53Z","lastTransitionTime":"2026-02-20T09:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.848867 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.848939 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.848958 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.848987 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.849005 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:53Z","lastTransitionTime":"2026-02-20T09:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.952717 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.952792 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.952818 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.952854 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:53 crc kubenswrapper[4962]: I0220 09:55:53.952877 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:53Z","lastTransitionTime":"2026-02-20T09:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.057019 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.057115 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.057141 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.057176 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.057198 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:54Z","lastTransitionTime":"2026-02-20T09:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.082469 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 07:04:25.687432287 +0000 UTC Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.138201 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.138348 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.138381 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:55:54 crc kubenswrapper[4962]: E0220 09:55:54.138405 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.138348 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:55:54 crc kubenswrapper[4962]: E0220 09:55:54.138679 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:55:54 crc kubenswrapper[4962]: E0220 09:55:54.138972 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:55:54 crc kubenswrapper[4962]: E0220 09:55:54.139105 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.160756 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.160815 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.160835 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.160928 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.160952 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:54Z","lastTransitionTime":"2026-02-20T09:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.264522 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.264641 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.264658 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.264678 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.264692 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:54Z","lastTransitionTime":"2026-02-20T09:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.368695 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.368744 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.368755 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.368778 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.368790 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:54Z","lastTransitionTime":"2026-02-20T09:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.471931 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.471975 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.471992 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.472014 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.472027 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:54Z","lastTransitionTime":"2026-02-20T09:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.574745 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.574812 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.574832 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.574857 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.574881 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:54Z","lastTransitionTime":"2026-02-20T09:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.679366 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.679452 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.679473 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.679510 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.679536 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:54Z","lastTransitionTime":"2026-02-20T09:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.783744 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.784267 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.784355 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.784474 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.784575 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:54Z","lastTransitionTime":"2026-02-20T09:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.888474 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.888526 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.888543 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.888567 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.888584 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:54Z","lastTransitionTime":"2026-02-20T09:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.948565 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.948673 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.948692 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.948722 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.948742 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:54Z","lastTransitionTime":"2026-02-20T09:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:54 crc kubenswrapper[4962]: E0220 09:55:54.973327 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3742e1d0-bedb-4f62-b44c-22d7df4a090f\\\",\\\"systemUUID\\\":\\\"0de0f937-0896-4e78-90b7-d2d7a1bed2a9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:54Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.986183 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.986250 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.986272 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.986299 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:54 crc kubenswrapper[4962]: I0220 09:55:54.986321 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:54Z","lastTransitionTime":"2026-02-20T09:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:55 crc kubenswrapper[4962]: E0220 09:55:55.011332 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3742e1d0-bedb-4f62-b44c-22d7df4a090f\\\",\\\"systemUUID\\\":\\\"0de0f937-0896-4e78-90b7-d2d7a1bed2a9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:55Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.017120 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.017191 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.017218 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.017252 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.017279 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:55Z","lastTransitionTime":"2026-02-20T09:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:55 crc kubenswrapper[4962]: E0220 09:55:55.041262 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3742e1d0-bedb-4f62-b44c-22d7df4a090f\\\",\\\"systemUUID\\\":\\\"0de0f937-0896-4e78-90b7-d2d7a1bed2a9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:55Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.047722 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.047795 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.047816 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.047847 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.047870 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:55Z","lastTransitionTime":"2026-02-20T09:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:55 crc kubenswrapper[4962]: E0220 09:55:55.068339 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3742e1d0-bedb-4f62-b44c-22d7df4a090f\\\",\\\"systemUUID\\\":\\\"0de0f937-0896-4e78-90b7-d2d7a1bed2a9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:55Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.074495 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.074549 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.074573 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.074630 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.074648 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:55Z","lastTransitionTime":"2026-02-20T09:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.083631 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 09:19:47.94018627 +0000 UTC Feb 20 09:55:55 crc kubenswrapper[4962]: E0220 09:55:55.094396 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:55:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3742e1d0-bedb-4f62-b44c-22d7df4a090f\\\",\\\"systemUUID\\\":\\\"0de0f937-0896-4e78-90b7-d2d7a1bed2a9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:55Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:55 crc kubenswrapper[4962]: E0220 09:55:55.094662 4962 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.097722 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.097795 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.097838 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.097882 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.097912 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:55Z","lastTransitionTime":"2026-02-20T09:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.201445 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.201511 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.201538 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.201570 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.201626 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:55Z","lastTransitionTime":"2026-02-20T09:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.305765 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.305832 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.305854 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.305888 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.305907 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:55Z","lastTransitionTime":"2026-02-20T09:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.409192 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.409273 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.409292 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.409321 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.409341 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:55Z","lastTransitionTime":"2026-02-20T09:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.511906 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.511993 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.512012 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.512042 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.512062 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:55Z","lastTransitionTime":"2026-02-20T09:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.615480 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.615535 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.615546 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.615566 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.615580 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:55Z","lastTransitionTime":"2026-02-20T09:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.719055 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.719118 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.719128 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.719149 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.719161 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:55Z","lastTransitionTime":"2026-02-20T09:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.823443 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.823525 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.823543 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.823573 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.823617 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:55Z","lastTransitionTime":"2026-02-20T09:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.927437 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.927506 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.927528 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.927564 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:55 crc kubenswrapper[4962]: I0220 09:55:55.927631 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:55Z","lastTransitionTime":"2026-02-20T09:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.031653 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.031727 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.031746 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.031775 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.031800 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:56Z","lastTransitionTime":"2026-02-20T09:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.084561 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 23:55:31.053817176 +0000 UTC Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.135564 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.135669 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.135689 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.135716 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.135736 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:56Z","lastTransitionTime":"2026-02-20T09:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.137887 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.138020 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.137889 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:55:56 crc kubenswrapper[4962]: E0220 09:55:56.138109 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.138033 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:55:56 crc kubenswrapper[4962]: E0220 09:55:56.138310 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76" Feb 20 09:55:56 crc kubenswrapper[4962]: E0220 09:55:56.138469 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:55:56 crc kubenswrapper[4962]: E0220 09:55:56.138657 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.239243 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.239315 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.239336 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.239363 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.239386 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:56Z","lastTransitionTime":"2026-02-20T09:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.343695 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.343768 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.343786 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.343815 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.343835 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:56Z","lastTransitionTime":"2026-02-20T09:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.448099 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.448184 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.448205 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.448239 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.448262 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:56Z","lastTransitionTime":"2026-02-20T09:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.557938 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.558099 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.558128 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.558161 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.558182 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:56Z","lastTransitionTime":"2026-02-20T09:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.662039 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.662115 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.662133 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.662161 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.662180 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:56Z","lastTransitionTime":"2026-02-20T09:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.765299 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.765373 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.765390 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.765419 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.765441 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:56Z","lastTransitionTime":"2026-02-20T09:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.869881 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.869944 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.869961 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.869984 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.870002 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:56Z","lastTransitionTime":"2026-02-20T09:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.973145 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.973208 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.973225 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.973252 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:56 crc kubenswrapper[4962]: I0220 09:55:56.973270 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:56Z","lastTransitionTime":"2026-02-20T09:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:57 crc kubenswrapper[4962]: I0220 09:55:57.077069 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:57 crc kubenswrapper[4962]: I0220 09:55:57.077146 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:57 crc kubenswrapper[4962]: I0220 09:55:57.077165 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:57 crc kubenswrapper[4962]: I0220 09:55:57.077214 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:57 crc kubenswrapper[4962]: I0220 09:55:57.077240 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:57Z","lastTransitionTime":"2026-02-20T09:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:57 crc kubenswrapper[4962]: I0220 09:55:57.085261 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 13:00:36.829955292 +0000 UTC Feb 20 09:55:57 crc kubenswrapper[4962]: I0220 09:55:57.180991 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:57 crc kubenswrapper[4962]: I0220 09:55:57.181103 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:57 crc kubenswrapper[4962]: I0220 09:55:57.181136 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:57 crc kubenswrapper[4962]: I0220 09:55:57.181180 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:57 crc kubenswrapper[4962]: I0220 09:55:57.181208 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:57Z","lastTransitionTime":"2026-02-20T09:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:57 crc kubenswrapper[4962]: I0220 09:55:57.285155 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:57 crc kubenswrapper[4962]: I0220 09:55:57.285289 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:57 crc kubenswrapper[4962]: I0220 09:55:57.285369 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:57 crc kubenswrapper[4962]: I0220 09:55:57.285398 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:57 crc kubenswrapper[4962]: I0220 09:55:57.285419 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:57Z","lastTransitionTime":"2026-02-20T09:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:57 crc kubenswrapper[4962]: I0220 09:55:57.388978 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:57 crc kubenswrapper[4962]: I0220 09:55:57.389048 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:57 crc kubenswrapper[4962]: I0220 09:55:57.389066 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:57 crc kubenswrapper[4962]: I0220 09:55:57.389093 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:57 crc kubenswrapper[4962]: I0220 09:55:57.389112 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:57Z","lastTransitionTime":"2026-02-20T09:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:57 crc kubenswrapper[4962]: I0220 09:55:57.493013 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:57 crc kubenswrapper[4962]: I0220 09:55:57.493097 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:57 crc kubenswrapper[4962]: I0220 09:55:57.493116 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:57 crc kubenswrapper[4962]: I0220 09:55:57.493155 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:57 crc kubenswrapper[4962]: I0220 09:55:57.493194 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:57Z","lastTransitionTime":"2026-02-20T09:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:57 crc kubenswrapper[4962]: I0220 09:55:57.596420 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:57 crc kubenswrapper[4962]: I0220 09:55:57.596514 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:57 crc kubenswrapper[4962]: I0220 09:55:57.596539 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:57 crc kubenswrapper[4962]: I0220 09:55:57.596569 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:57 crc kubenswrapper[4962]: I0220 09:55:57.596589 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:57Z","lastTransitionTime":"2026-02-20T09:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:57 crc kubenswrapper[4962]: I0220 09:55:57.700298 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:57 crc kubenswrapper[4962]: I0220 09:55:57.700348 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:57 crc kubenswrapper[4962]: I0220 09:55:57.700361 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:57 crc kubenswrapper[4962]: I0220 09:55:57.700381 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:57 crc kubenswrapper[4962]: I0220 09:55:57.700396 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:57Z","lastTransitionTime":"2026-02-20T09:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:57 crc kubenswrapper[4962]: I0220 09:55:57.803029 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:57 crc kubenswrapper[4962]: I0220 09:55:57.803099 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:57 crc kubenswrapper[4962]: I0220 09:55:57.803120 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:57 crc kubenswrapper[4962]: I0220 09:55:57.803150 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:57 crc kubenswrapper[4962]: I0220 09:55:57.803171 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:57Z","lastTransitionTime":"2026-02-20T09:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:57 crc kubenswrapper[4962]: I0220 09:55:57.905998 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:57 crc kubenswrapper[4962]: I0220 09:55:57.906081 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:57 crc kubenswrapper[4962]: I0220 09:55:57.906104 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:57 crc kubenswrapper[4962]: I0220 09:55:57.906135 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:57 crc kubenswrapper[4962]: I0220 09:55:57.906155 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:57Z","lastTransitionTime":"2026-02-20T09:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.010517 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.010583 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.010641 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.010676 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.010698 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:58Z","lastTransitionTime":"2026-02-20T09:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.085918 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 23:44:19.482540438 +0000 UTC Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.114380 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.114455 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.114481 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.114523 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.114550 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:58Z","lastTransitionTime":"2026-02-20T09:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.138933 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.138933 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.138998 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.139167 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:55:58 crc kubenswrapper[4962]: E0220 09:55:58.139263 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:55:58 crc kubenswrapper[4962]: E0220 09:55:58.139448 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:55:58 crc kubenswrapper[4962]: E0220 09:55:58.139659 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:55:58 crc kubenswrapper[4962]: E0220 09:55:58.139808 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76" Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.219371 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.219500 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.219558 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.219642 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.219677 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:58Z","lastTransitionTime":"2026-02-20T09:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.324080 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.324147 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.324164 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.324189 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.324210 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:58Z","lastTransitionTime":"2026-02-20T09:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.427356 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.427433 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.427456 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.427487 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.427507 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:58Z","lastTransitionTime":"2026-02-20T09:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.530452 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.530541 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.530565 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.530642 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.530668 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:58Z","lastTransitionTime":"2026-02-20T09:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.634469 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.634553 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.634572 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.634674 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.634711 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:58Z","lastTransitionTime":"2026-02-20T09:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.738375 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.738436 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.738454 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.738487 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.738510 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:58Z","lastTransitionTime":"2026-02-20T09:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.841791 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.841847 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.841861 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.841884 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.841901 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:58Z","lastTransitionTime":"2026-02-20T09:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.945073 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.945142 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.945156 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.945178 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:58 crc kubenswrapper[4962]: I0220 09:55:58.945192 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:58Z","lastTransitionTime":"2026-02-20T09:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.048201 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.048275 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.048299 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.048331 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.048352 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:59Z","lastTransitionTime":"2026-02-20T09:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.086638 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 16:58:53.337580805 +0000 UTC Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.152401 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.152462 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.152483 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.152509 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.152530 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:59Z","lastTransitionTime":"2026-02-20T09:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.161195 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s8xxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a431054f-57c5-41b7-93b2-2d2fbf9949ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15f0d587b054c3050286b2607687a4066cdd413e1623326cf355f9d93b10a2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9fz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\
\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s8xxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:59Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.180490 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hxb97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0e53ce-e004-473e-be85-ef4c83e399c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e690a224056186e38fa1d7f83d5380f36a05bfff839ca53a87701a721c0c7c0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c27br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hxb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:59Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.207498 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:59Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.234230 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:59Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.254945 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.255028 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.255056 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.255092 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.255119 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:59Z","lastTransitionTime":"2026-02-20T09:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.258679 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef72d73c-d177-4436-b681-83866e1f6d12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfad5fd92783e0af12f28bd81ccc67f1cf757d57723d98f8fea4f02dc0fea8b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7hj8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:59Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.278647 4962 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-m9d46" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"751d5e0b-919c-4777-8475-ed7214f7647f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6a54fc4a37387541acda22f8ca81ddfe54e7a9ea682a31abf5e1ba2787f2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dffe20e45069ebd85e9ad49e365ac5180c1e43d02340190615a38900f527e432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m9d46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:59Z is after 2025-08-24T17:21:41Z" Feb 20 
09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.300070 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de460dbc-9080-4eb3-b509-c43e81162de4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5051fa1280c3183976930435fb2b839acf93914df2c1e93cb28ae17755d534ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f4690722fa343ac9bbe83f1591ce49040d2b6cbe966fc31757944ba3befbbce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc5a98df99b3ac300355de8efa6cafb2182a2460152990a89b4125a7aef7b850\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\
",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9264e0c637c9449882f25caf523f00112c372fad6125a2b4b1b541a515a897dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:59Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.315476 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5bwk2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d590527b-ed56-4fb4-a712-b09781618a76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjn55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjn55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:43Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5bwk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:59Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.338925 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"deab583c-05c7-4b7e-a3f6-c01081b17127\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3256d5e9aadc1c300c0388ad5b9682b874dd957f731bde6642c649b044b16ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"amed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771581313\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771581313\\\\\\\\\\\\\\\" (2026-02-20 08:55:12 +0000 UTC to 2027-02-20 08:55:12 +0000 UTC (now=2026-02-20 09:55:28.067421835 +0000 UTC))\\\\\\\"\\\\nI0220 09:55:28.067484 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0220 09:55:28.067516 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0220 09:55:28.067549 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067613 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067655 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-161790428/tls.crt::/tmp/serving-cert-161790428/tls.key\\\\\\\"\\\\nI0220 09:55:28.067781 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0220 09:55:28.067827 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067825 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067865 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0220 09:55:28.067840 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0220 09:55:28.067995 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0220 09:55:28.068006 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0220 09:55:28.068582 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:59Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.358318 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.358369 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.358382 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.358427 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.358438 4962 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:59Z","lastTransitionTime":"2026-02-20T09:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.361903 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0820106bbb2993145f688614128fa51c5778e9953ee5fc00edba04ff06f3f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:59Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.376796 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:59Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.407098 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7065599d0f9481d5becd2057dc2e557e5fb6f4ef22533c1e431708ee711c5aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:59Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.445186 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://650ed0852746d520fbaa4ec277717c5af3916ba72fb82349af01bdd53ed1916b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5aaa8582764286e5a930fbc4754f253b044e997b856840685dde84db1aef3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:59Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.461717 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.461773 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.461789 4962 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.461820 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.461833 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:59Z","lastTransitionTime":"2026-02-20T09:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.473788 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wqwgj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1957ac70-30f9-48c2-a82b-72aa3b7a883a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4f1f32eda1cc801a5b1f6d84207120c48c1bcca494a88bdfa95fb05bf82f661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"
},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwxzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wqwgj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:59Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.497328 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2abd2b70-bb78-49a0-b930-cd066384e803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb7d5fb3e51f950f700ffdbd7053572880340644
1447065be85ac898397761b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb7d5fb3e51f950f700ffdbd70535728803406441447065be85ac898397761b1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T09:55:42Z\\\",\\\"message\\\":\\\"ame:\\\\\\\"Service_openshift-kube-scheduler/scheduler_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-scheduler/scheduler\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.169\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0220 09:55:42.410270 6421 services_controller.go:360] Finished syncing service kubernetes on namespace default for network=default : 2.491089ms\\\\nI0220 09:55:42.410229 6421 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0220 09:55:42.410232 6421 services_controller.go:356] Processing sync for service openshift-machine-api/machine-api-operator-webhook for network=default\\\\nF0220 09:55:42.410409 6421 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-99b2s_openshift-ovn-kubernetes(2abd2b70-bb78-49a0-b930-cd066384e803)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-99b2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:59Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.523190 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63edef29-013b-4e23-bd29-1e62958c425a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652942c0482deea28d2e07207dcdf38e3f5638a0eea5bb58b73671e39410b99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89d70ad241c203849bc1b2aa
38409e9d5a16e13831e18a4f06bb2e5012b84910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a24ab72e4a25f2b29927eeb2552b6b091c3384a85ecedea7a4de9df19253ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9768dcc00aee095460ac27438e2ee85e824a9b21eee8c2fa137cd7de5de792ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55affbe6d7d089b636f77381908573fe22fb7a903776e26db373f50523280cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:59Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.538283 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-htkbf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8526746c-450b-4df8-8ea1-f0cbabd13894\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c50de88e0429b7ecb3939db90dc49ca006cd7d071d9cc97beb31ca64028b9f00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx5m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d5caed5e5e9e66911552cd6f1b7482cc842f5fc1b59863a208fe32ea87303d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx5m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-htkbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:59Z is after 2025-08-24T17:21:41Z" Feb 20 
09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.565091 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.565157 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.565172 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.565198 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.565215 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:59Z","lastTransitionTime":"2026-02-20T09:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.652695 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d590527b-ed56-4fb4-a712-b09781618a76-metrics-certs\") pod \"network-metrics-daemon-5bwk2\" (UID: \"d590527b-ed56-4fb4-a712-b09781618a76\") " pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:55:59 crc kubenswrapper[4962]: E0220 09:55:59.652911 4962 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 20 09:55:59 crc kubenswrapper[4962]: E0220 09:55:59.653016 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d590527b-ed56-4fb4-a712-b09781618a76-metrics-certs podName:d590527b-ed56-4fb4-a712-b09781618a76 nodeName:}" failed. No retries permitted until 2026-02-20 09:56:15.652987722 +0000 UTC m=+67.235459598 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d590527b-ed56-4fb4-a712-b09781618a76-metrics-certs") pod "network-metrics-daemon-5bwk2" (UID: "d590527b-ed56-4fb4-a712-b09781618a76") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.669160 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.669224 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.669262 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.669299 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.669320 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:59Z","lastTransitionTime":"2026-02-20T09:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.773264 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.773337 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.773348 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.773367 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.773382 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:59Z","lastTransitionTime":"2026-02-20T09:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.817103 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.831288 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5bwk2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d590527b-ed56-4fb4-a712-b09781618a76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjn55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjn55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:43Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5bwk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:59Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 
09:55:59.831367 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.862012 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2abd2b70-bb78-49a0-b930-cd066384e803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174
f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb7d5fb3e51f950f700ffdbd70535728803406441447065be85ac898397761b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb7d5fb3e51f950f700ffdbd70535728803406441447065be85ac898397761b1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T09:55:42Z\\\",\\\"message\\\":\\\"ame:\\\\\\\"Service_openshift-kube-scheduler/scheduler_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-scheduler/scheduler\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.169\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0220 09:55:42.410270 6421 services_controller.go:360] Finished syncing service kubernetes on namespace default for network=default : 2.491089ms\\\\nI0220 09:55:42.410229 6421 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0220 09:55:42.410232 6421 services_controller.go:356] Processing sync for service openshift-machine-api/machine-api-operator-webhook for network=default\\\\nF0220 09:55:42.410409 6421 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added 
to\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-99b2s_openshift-ovn-kubernetes(2abd2b70-bb78-49a0-b930-cd066384e803)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveR
eadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-99b2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:59Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.875993 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.876041 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.876055 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.876084 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.876100 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:59Z","lastTransitionTime":"2026-02-20T09:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.897318 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63edef29-013b-4e23-bd29-1e62958c425a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652942c0482deea28d2e07207dcdf38e3f5638a0eea5bb58b73671e39410b99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89d70ad241c203849bc1b2aa38409e9d5a16e13831e18a4f06bb2e5012b84910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a24ab72e4a25f2b29927eeb2552b6b091c3384a85ecedea7a4de9df19253ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9768dcc00aee095460ac27438e2ee85e824a9b21eee8c2fa137cd7de5de792ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55affbe6d7d089b636f77381908573fe22fb7a903776e26db373f50523280cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:59Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.920857 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"deab583c-05c7-4b7e-a3f6-c01081b17127\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3256d5e9aadc1c300c0388ad5b9682b874dd957f731bde6642c649b044b16ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"amed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771581313\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771581313\\\\\\\\\\\\\\\" (2026-02-20 08:55:12 +0000 UTC to 2027-02-20 08:55:12 +0000 UTC (now=2026-02-20 09:55:28.067421835 +0000 UTC))\\\\\\\"\\\\nI0220 09:55:28.067484 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0220 09:55:28.067516 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0220 09:55:28.067549 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067613 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067655 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-161790428/tls.crt::/tmp/serving-cert-161790428/tls.key\\\\\\\"\\\\nI0220 09:55:28.067781 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0220 09:55:28.067827 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067825 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067865 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0220 09:55:28.067840 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0220 09:55:28.067995 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0220 09:55:28.068006 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0220 09:55:28.068582 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:59Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.939997 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0820106bbb2993145f688614128fa51c5778e9953ee5fc00edba04ff06f3f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:59Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.957916 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.958138 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:55:59 crc kubenswrapper[4962]: E0220 09:55:59.958241 4962 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 20 09:55:59 crc kubenswrapper[4962]: E0220 09:55:59.958260 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-20 09:56:31.958210134 +0000 UTC m=+83.540682010 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:55:59 crc kubenswrapper[4962]: E0220 09:55:59.958328 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-20 09:56:31.958303167 +0000 UTC m=+83.540775043 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.958429 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.958514 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:55:59 crc kubenswrapper[4962]: E0220 09:55:59.958737 4962 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 20 09:55:59 crc kubenswrapper[4962]: E0220 09:55:59.958799 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 20 09:55:59 crc kubenswrapper[4962]: E0220 09:55:59.958856 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 20 09:55:59 crc kubenswrapper[4962]: E0220 09:55:59.958873 4962 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 09:55:59 crc kubenswrapper[4962]: E0220 09:55:59.958915 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-02-20 09:56:31.958870665 +0000 UTC m=+83.541342571 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 20 09:55:59 crc kubenswrapper[4962]: E0220 09:55:59.958954 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-20 09:56:31.958938817 +0000 UTC m=+83.541410923 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.961457 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:59Z is after 2025-08-24T17:21:41Z" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.980216 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.980278 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.980297 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.980324 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.980343 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:55:59Z","lastTransitionTime":"2026-02-20T09:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:55:59 crc kubenswrapper[4962]: I0220 09:55:59.984742 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7065599d0f9481d5becd2057dc2e557e5fb6f4ef22533c1e431708ee711c5aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:55:59Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.004677 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://650ed0852746d520fbaa4ec277717c5af3916ba72fb82349af01bdd53ed1916b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5aaa8582764286e5a930fbc4754f253b044e997b856840685dde84db1aef3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:00Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.024046 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wqwgj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1957ac70-30f9-48c2-a82b-72aa3b7a883a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4f1f32eda1cc801a5b1f6d84207120c48c1bcca494a88bdfa95fb05bf82f661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwxzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wqwgj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:00Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.041938 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-htkbf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8526746c-450b-4df8-8ea1-f0cbabd13894\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c50de88e0429b7ecb3939db90dc49ca006cd7d071d9cc97beb31ca64028b9f00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx5m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d5caed5e5e9e66911552cd6f1b7482cc842f5fc1b59863a208fe32ea87303d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx5m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-htkbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:00Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.057789 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:00Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.059775 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:56:00 crc kubenswrapper[4962]: E0220 09:56:00.060102 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 20 09:56:00 crc kubenswrapper[4962]: E0220 09:56:00.060171 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 20 09:56:00 crc kubenswrapper[4962]: E0220 09:56:00.060200 4962 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 09:56:00 crc kubenswrapper[4962]: E0220 09:56:00.060309 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-20 09:56:32.060275955 +0000 UTC m=+83.642747841 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.070672 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s8xxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a431054f-57c5-41b7-93b2-2d2fbf9949ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15f0d587b054c3050286b2607687a4066cdd413e1623326cf355f9d93b10a2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9fz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s8xxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:00Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.082582 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hxb97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0e53ce-e004-473e-be85-ef4c83e399c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e690a224056186e38fa1d7f83d5380f36a05bfff839ca53a87701a721c0c7c0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c27br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hxb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:00Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.083637 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.083684 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.083699 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.083720 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.083734 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:00Z","lastTransitionTime":"2026-02-20T09:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.087250 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 16:04:06.111427565 +0000 UTC Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.108400 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de460dbc-9080-4eb3-b509-c43e81162de4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5051fa1280c3183976930435fb2b839acf93914df2c1e93cb28ae17755d534ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f4690722fa343ac9bbe83f1591ce49040d2b6cbe966fc31757944ba3befbbce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc5a98df99b3ac300355de8efa6cafb2182a2460152990a89b4125a7aef7b850\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0
bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9264e0c637c9449882f25caf523f00112c372fad6125a2b4b1b541a515a897dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:00Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.128766 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:00Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.138525 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:56:00 crc kubenswrapper[4962]: E0220 09:56:00.138705 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.138816 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.138978 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:56:00 crc kubenswrapper[4962]: E0220 09:56:00.139041 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.139117 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:56:00 crc kubenswrapper[4962]: E0220 09:56:00.139207 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:56:00 crc kubenswrapper[4962]: E0220 09:56:00.139360 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.140584 4962 scope.go:117] "RemoveContainer" containerID="cb7d5fb3e51f950f700ffdbd70535728803406441447065be85ac898397761b1" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.153131 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef72d73c-d177-4436-b681-83866e1f6d12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfad5fd92783e0af12f28bd81ccc67f1cf757d57723d98f8fea4f02dc0fea8b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\
\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0
d211\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7hj8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:00Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.173038 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"751d5e0b-919c-4777-8475-ed7214f7647f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6a54fc4a37387541acda22f8ca81ddfe54e7a9ea682a31abf5e1ba2787f2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dffe20e45069ebd85e9ad49e365ac5180c1e43d02340190615a38900f527e432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m9d46\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:00Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.186833 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.186911 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.186930 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.186960 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.186984 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:00Z","lastTransitionTime":"2026-02-20T09:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.290385 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.290537 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.290675 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.291000 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.291165 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:00Z","lastTransitionTime":"2026-02-20T09:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.397220 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.397273 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.397287 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.397307 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.397320 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:00Z","lastTransitionTime":"2026-02-20T09:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.501793 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.501846 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.501864 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.501897 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.501911 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:00Z","lastTransitionTime":"2026-02-20T09:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.521388 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-99b2s_2abd2b70-bb78-49a0-b930-cd066384e803/ovnkube-controller/1.log" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.525842 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" event={"ID":"2abd2b70-bb78-49a0-b930-cd066384e803","Type":"ContainerStarted","Data":"3926877d9025cb33d86d3651716fce6ab1e9aa6467a094f0f6b463db0564bf0a"} Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.527002 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.553233 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03611a46-7966-4587-950e-1d1f967c48c4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7a7a4444fd319ad977e6ac955aeb09b7fa9bc300586a60ba42b1ddb7c823b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b7038b1d182118313e4f3a5d272559ce949ae0b69f819883a0ce752314855b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633d729c617b18ae2841e008b42a6a039b11
8aef8d357c2bbfaba8a445a417c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee212904f6af57b0dc28dc3da6bf037fb0dfb92937bd5789b8dcb03ea820f62f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee212904f6af57b0dc28dc3da6bf037fb0dfb92937bd5789b8dcb03ea820f62f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:00Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.575550 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5bwk2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d590527b-ed56-4fb4-a712-b09781618a76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjn55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjn55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:43Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5bwk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:00Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.599185 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://650ed0852746d520fbaa4ec277717c5af3916ba72fb82349af01bdd53ed1916b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5aaa8582764286e5a930fbc4754f253b044e997b856840685dde84db1aef3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:00Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.605258 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.605319 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.605338 4962 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.605365 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.605383 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:00Z","lastTransitionTime":"2026-02-20T09:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.622936 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wqwgj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1957ac70-30f9-48c2-a82b-72aa3b7a883a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4f1f32eda1cc801a5b1f6d84207120c48c1bcca494a88bdfa95fb05bf82f661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"
},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwxzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wqwgj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:00Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.672155 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2abd2b70-bb78-49a0-b930-cd066384e803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3926877d9025cb33d86d3651716fce6ab1e9aa64
67a094f0f6b463db0564bf0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb7d5fb3e51f950f700ffdbd70535728803406441447065be85ac898397761b1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T09:55:42Z\\\",\\\"message\\\":\\\"ame:\\\\\\\"Service_openshift-kube-scheduler/scheduler_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-scheduler/scheduler\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.169\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0220 09:55:42.410270 6421 services_controller.go:360] Finished syncing service kubernetes on namespace default for network=default : 2.491089ms\\\\nI0220 09:55:42.410229 6421 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0220 09:55:42.410232 6421 services_controller.go:356] Processing sync for service openshift-machine-api/machine-api-operator-webhook for network=default\\\\nF0220 09:55:42.410409 6421 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added 
to\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"con
tainerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-99b2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:00Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.702886 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"63edef29-013b-4e23-bd29-1e62958c425a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652942c0482deea28d2e07207dcdf38e3f5638a0eea5bb58b73671e39410b99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89d70ad241c203849bc1b2aa38409e9d5a16e13831e18a4f06bb2e5012b84910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a24ab72e4a25f2b29927eeb2552b6b091c3384a85ecedea7a4de9df19253ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9768dcc00aee095460ac27438e2ee85e824a9b2
1eee8c2fa137cd7de5de792ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55affbe6d7d089b636f77381908573fe22fb7a903776e26db373f50523280cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:00Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.709130 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.709172 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.709188 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.709212 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.709223 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:00Z","lastTransitionTime":"2026-02-20T09:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.724426 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deab583c-05c7-4b7e-a3f6-c01081b17127\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3256d5e9aadc1c300c0388ad5b9682b874dd957f731bde6642c649b044b16ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"amed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771581313\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771581313\\\\\\\\\\\\\\\" (2026-02-20 08:55:12 +0000 UTC to 2027-02-20 08:55:12 +0000 UTC (now=2026-02-20 09:55:28.067421835 +0000 UTC))\\\\\\\"\\\\nI0220 09:55:28.067484 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0220 09:55:28.067516 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0220 09:55:28.067549 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067613 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067655 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-161790428/tls.crt::/tmp/serving-cert-161790428/tls.key\\\\\\\"\\\\nI0220 09:55:28.067781 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0220 09:55:28.067827 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067825 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067865 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0220 09:55:28.067840 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0220 09:55:28.067995 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0220 09:55:28.068006 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0220 09:55:28.068582 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:00Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.741364 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0820106bbb2993145f688614128fa51c5778e9953ee5fc00edba04ff06f3f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:00Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.757265 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:00Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.771653 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7065599d0f9481d5becd2057dc2e557e5fb6f4ef22533c1e431708ee711c5aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:00Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.785661 4962 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-htkbf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8526746c-450b-4df8-8ea1-f0cbabd13894\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c50de88e0429b7ecb3939db90dc49ca006cd7d071d9cc97beb31ca64028b9f00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx5m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d5caed5e5e9e66911552cd6f1b7482cc842f5fc1b59863a208fe32ea87303d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx5m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-htkbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-20T09:56:00Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.803166 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:00Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.812139 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.812220 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.812243 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.812279 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.812304 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:00Z","lastTransitionTime":"2026-02-20T09:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.823741 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s8xxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a431054f-57c5-41b7-93b2-2d2fbf9949ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15f0d587b054c3050286b2607687a4066cdd413e1623326cf355f9d93b10a2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9fz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s8xxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:00Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.837003 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hxb97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0e53ce-e004-473e-be85-ef4c83e399c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e690a224056186e38fa1d7f83d5380f36a05bfff839ca53a87701a721c0c7c0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c27br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hxb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:00Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.850917 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de460dbc-9080-4eb3-b509-c43e81162de4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5051fa1280c3183976930435fb2b839acf93914df2c1e93cb28ae17755d534ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f4690722fa343ac9bbe83f1591ce49040d2b6cbe966fc31757944ba3befbbce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc5a98df99b3ac300355de8efa6cafb2182a2460152990a89b4125a7aef7b850\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9264e0c637c9449882f25caf523f00112c372fad6125a2b4b1b541a515a897dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:00Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.867470 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:00Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.883741 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef72d73c-d177-4436-b681-83866e1f6d12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfad5fd92783e0af12f28bd81ccc67f1cf757d57723d98f8fea4f02dc0fea8b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"
system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7hj8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:00Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.896579 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"751d5e0b-919c-4777-8475-ed7214f7647f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6a54fc4a37387541acda22f8ca81ddfe54e7a9ea682a31abf5e1ba2787f2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dffe20e45069ebd85e9ad49e365ac5180c1e43d02340190615a38900f527e432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootf
s\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m9d46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:00Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.914947 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.915002 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.915016 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.915036 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:00 crc kubenswrapper[4962]: I0220 09:56:00.915051 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:00Z","lastTransitionTime":"2026-02-20T09:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.019137 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.019201 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.019226 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.019244 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.019256 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:01Z","lastTransitionTime":"2026-02-20T09:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.088259 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-08 12:40:06.042929018 +0000 UTC Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.122012 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.122088 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.122108 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.122138 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.122160 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:01Z","lastTransitionTime":"2026-02-20T09:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.224660 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.224709 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.224719 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.224740 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.224757 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:01Z","lastTransitionTime":"2026-02-20T09:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.328702 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.328768 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.328793 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.328826 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.328847 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:01Z","lastTransitionTime":"2026-02-20T09:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.433498 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.433577 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.433625 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.433652 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.433674 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:01Z","lastTransitionTime":"2026-02-20T09:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.534670 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-99b2s_2abd2b70-bb78-49a0-b930-cd066384e803/ovnkube-controller/2.log" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.535834 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-99b2s_2abd2b70-bb78-49a0-b930-cd066384e803/ovnkube-controller/1.log" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.537031 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.537107 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.537129 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.537159 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.537179 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:01Z","lastTransitionTime":"2026-02-20T09:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.541462 4962 generic.go:334] "Generic (PLEG): container finished" podID="2abd2b70-bb78-49a0-b930-cd066384e803" containerID="3926877d9025cb33d86d3651716fce6ab1e9aa6467a094f0f6b463db0564bf0a" exitCode=1 Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.541536 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" event={"ID":"2abd2b70-bb78-49a0-b930-cd066384e803","Type":"ContainerDied","Data":"3926877d9025cb33d86d3651716fce6ab1e9aa6467a094f0f6b463db0564bf0a"} Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.541652 4962 scope.go:117] "RemoveContainer" containerID="cb7d5fb3e51f950f700ffdbd70535728803406441447065be85ac898397761b1" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.542919 4962 scope.go:117] "RemoveContainer" containerID="3926877d9025cb33d86d3651716fce6ab1e9aa6467a094f0f6b463db0564bf0a" Feb 20 09:56:01 crc kubenswrapper[4962]: E0220 09:56:01.543252 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-99b2s_openshift-ovn-kubernetes(2abd2b70-bb78-49a0-b930-cd066384e803)\"" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.568218 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de460dbc-9080-4eb3-b509-c43e81162de4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5051fa1280c3183976930435fb2b839acf93914df2c1e93cb28ae17755d534ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f4690722fa343ac9bbe83f1591ce49040d2b6cbe966fc31757944ba3befbbce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc5a98df99b3ac300355de8efa6cafb2182a2460152990a89b4125a7aef7b850\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9264e0c637c9449882f25caf523f00112c372fad6125a2b4b1b541a515a897dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:01Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.592544 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:01Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.616403 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef72d73c-d177-4436-b681-83866e1f6d12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfad5fd92783e0af12f28bd81ccc67f1cf757d57723d98f8fea4f02dc0fea8b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"
system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7hj8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:01Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.637662 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"751d5e0b-919c-4777-8475-ed7214f7647f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6a54fc4a37387541acda22f8ca81ddfe54e7a9ea682a31abf5e1ba2787f2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dffe20e45069ebd85e9ad49e365ac5180c1e43d02340190615a38900f527e432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootf
s\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m9d46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:01Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.640243 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.640449 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.640664 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.640850 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.641007 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:01Z","lastTransitionTime":"2026-02-20T09:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.659052 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03611a46-7966-4587-950e-1d1f967c48c4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7a7a4444fd319ad977e6ac955aeb09b7fa9bc300586a60ba42b1ddb7c823b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b7038b1d182118313e4f3a5d272559ce949ae0b69f819883a0ce752314855b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633d729c617b18ae2841e008b42a6a039b118aef8d357c2bbfaba8a445a417c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"
cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee212904f6af57b0dc28dc3da6bf037fb0dfb92937bd5789b8dcb03ea820f62f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee212904f6af57b0dc28dc3da6bf037fb0dfb92937bd5789b8dcb03ea820f62f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:01Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.677681 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5bwk2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d590527b-ed56-4fb4-a712-b09781618a76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjn55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjn55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:43Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5bwk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:01Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.719914 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"63edef29-013b-4e23-bd29-1e62958c425a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652942c0482deea28d2e07207dcdf38e3f5638a0eea5bb58b73671e39410b99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89d70ad241c203849bc1b2aa38409e9d5a16e13831e18a4f06bb2e5012b84910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a24ab72e4a25f2b29927eeb2552b6b091c3384a85ecedea7a4de9df19253ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9768dcc00aee095460ac27438e2ee85e824a9b2
1eee8c2fa137cd7de5de792ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55affbe6d7d089b636f77381908573fe22fb7a903776e26db373f50523280cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:01Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.744886 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.744968 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.744997 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.745034 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.745059 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:01Z","lastTransitionTime":"2026-02-20T09:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.749393 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deab583c-05c7-4b7e-a3f6-c01081b17127\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3256d5e9aadc1c300c0388ad5b9682b874dd957f731bde6642c649b044b16ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"amed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771581313\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771581313\\\\\\\\\\\\\\\" (2026-02-20 08:55:12 +0000 UTC to 2027-02-20 08:55:12 +0000 UTC (now=2026-02-20 09:55:28.067421835 +0000 UTC))\\\\\\\"\\\\nI0220 09:55:28.067484 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0220 09:55:28.067516 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0220 09:55:28.067549 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067613 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067655 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-161790428/tls.crt::/tmp/serving-cert-161790428/tls.key\\\\\\\"\\\\nI0220 09:55:28.067781 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0220 09:55:28.067827 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067825 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067865 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0220 09:55:28.067840 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0220 09:55:28.067995 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0220 09:55:28.068006 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0220 09:55:28.068582 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:01Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.778922 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0820106bbb2993145f688614128fa51c5778e9953ee5fc00edba04ff06f3f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:01Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.800995 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:01Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.819719 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7065599d0f9481d5becd2057dc2e557e5fb6f4ef22533c1e431708ee711c5aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:01Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.839517 4962 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://650ed0852746d520fbaa4ec277717c5af3916ba72fb82349af01bdd53ed1916b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5aaa8582764286e5a930fbc4754f253b044e997b856840685dde84db1aef3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:01Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.848905 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.848979 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:01 crc kubenswrapper[4962]: 
I0220 09:56:01.849003 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.849031 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.849052 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:01Z","lastTransitionTime":"2026-02-20T09:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.862214 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wqwgj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1957ac70-30f9-48c2-a82b-72aa3b7a883a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4f1f32eda1cc801a5b1f6d84207120c48c1bcca494a88bdfa95fb05bf82f661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\
\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwxzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wqwgj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:01Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.896631 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2abd2b70-bb78-49a0-b930-cd066384e803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3926877d9025cb33d86d3651716fce6ab1e9aa64
67a094f0f6b463db0564bf0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb7d5fb3e51f950f700ffdbd70535728803406441447065be85ac898397761b1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T09:55:42Z\\\",\\\"message\\\":\\\"ame:\\\\\\\"Service_openshift-kube-scheduler/scheduler_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-scheduler/scheduler\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.169\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0220 09:55:42.410270 6421 services_controller.go:360] Finished syncing service kubernetes on namespace default for network=default : 2.491089ms\\\\nI0220 09:55:42.410229 6421 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0220 09:55:42.410232 6421 services_controller.go:356] Processing sync for service openshift-machine-api/machine-api-operator-webhook for network=default\\\\nF0220 09:55:42.410409 6421 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3926877d9025cb33d86d3651716fce6ab1e9aa6467a094f0f6b463db0564bf0a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T09:56:01Z\\\",\\\"message\\\":\\\") from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:56:01.196026 6624 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:56:01.196302 6624 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:56:01.196674 6624 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:56:01.196758 6624 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:56:01.197158 6624 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0220 09:56:01.197199 6624 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0220 09:56:01.197204 6624 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0220 09:56:01.197228 6624 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0220 09:56:01.197225 6624 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0220 
09:56:01.197265 6624 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0220 09:56:01.197299 6624 factory.go:656] Stopping watch factory\\\\nI0220 09:56:01.197317 6624 ovnkube.go:599] Stopped ovnkube\\\\nI0220 09:56:01.197319 6624 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0220 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\"
:\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-99b2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:01Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.916513 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-htkbf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8526746c-450b-4df8-8ea1-f0cbabd13894\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c50de88e0429b7ecb3939db90dc49ca006cd7d071d9cc97beb31ca64028b9f00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx5m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d5caed5e5e9e66911552cd6f1b7482cc842f5fc1b59863a208fe32ea87303d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx5m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-htkbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:01Z is after 2025-08-24T17:21:41Z" Feb 20 
09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.938794 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:01Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.952408 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.952477 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.952498 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.952541 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.952565 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:01Z","lastTransitionTime":"2026-02-20T09:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.958256 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s8xxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a431054f-57c5-41b7-93b2-2d2fbf9949ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15f0d587b054c3050286b2607687a4066cdd413e1623326cf355f9d93b10a2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9fz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s8xxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:01Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:01 crc kubenswrapper[4962]: I0220 09:56:01.975839 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hxb97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0e53ce-e004-473e-be85-ef4c83e399c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e690a224056186e38fa1d7f83d5380f36a05bfff839ca53a87701a721c0c7c0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c27br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hxb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:01Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.057119 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.057202 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.057220 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.057249 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.057270 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:02Z","lastTransitionTime":"2026-02-20T09:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.089261 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-23 04:24:20.884442287 +0000 UTC Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.138372 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.138501 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.138375 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.138529 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:56:02 crc kubenswrapper[4962]: E0220 09:56:02.138542 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:56:02 crc kubenswrapper[4962]: E0220 09:56:02.139060 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:56:02 crc kubenswrapper[4962]: E0220 09:56:02.139102 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:56:02 crc kubenswrapper[4962]: E0220 09:56:02.139215 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.161409 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.161475 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.161495 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.161523 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.161545 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:02Z","lastTransitionTime":"2026-02-20T09:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.265100 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.265153 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.265167 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.265188 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.265206 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:02Z","lastTransitionTime":"2026-02-20T09:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.369098 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.369180 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.369200 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.369226 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.369243 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:02Z","lastTransitionTime":"2026-02-20T09:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.472892 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.472943 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.472955 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.472974 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.472989 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:02Z","lastTransitionTime":"2026-02-20T09:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.548440 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-99b2s_2abd2b70-bb78-49a0-b930-cd066384e803/ovnkube-controller/2.log" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.554153 4962 scope.go:117] "RemoveContainer" containerID="3926877d9025cb33d86d3651716fce6ab1e9aa6467a094f0f6b463db0564bf0a" Feb 20 09:56:02 crc kubenswrapper[4962]: E0220 09:56:02.554451 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-99b2s_openshift-ovn-kubernetes(2abd2b70-bb78-49a0-b930-cd066384e803)\"" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.574998 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03611a46-7966-4587-950e-1d1f967c48c4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7a7a4444fd319ad977e6ac955aeb09b7fa9bc300586a60ba42b1ddb7c823b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b7038b1d182118313e4f3a5d272559ce949ae0b69f819883a0ce752314855b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuberne
tes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633d729c617b18ae2841e008b42a6a039b118aef8d357c2bbfaba8a445a417c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee212904f6af57b0dc28dc3da6bf037fb0dfb92937bd5789b8dcb03ea820f62f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee212904f6af57b0dc28dc3da6bf037fb0dfb92937bd5789b8dcb03ea820f62f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:02Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.576847 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.576902 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.576915 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.576938 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.576953 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:02Z","lastTransitionTime":"2026-02-20T09:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.597911 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5bwk2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d590527b-ed56-4fb4-a712-b09781618a76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjn55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjn55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:43Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5bwk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:02Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.634560 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"63edef29-013b-4e23-bd29-1e62958c425a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652942c0482deea28d2e07207dcdf38e3f5638a0eea5bb58b73671e39410b99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89d70ad241c203849bc1b2aa38409e9d5a16e13831e18a4f06bb2e5012b84910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a24ab72e4a25f2b29927eeb2552b6b091c3384a85ecedea7a4de9df19253ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9768dcc00aee095460ac27438e2ee85e824a9b2
1eee8c2fa137cd7de5de792ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55affbe6d7d089b636f77381908573fe22fb7a903776e26db373f50523280cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:02Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.664830 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deab583c-05c7-4b7e-a3f6-c01081b17127\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c5afbd1e10c9db692f39fa51d7cc54
922933431495d1bb3224b830ee773a01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3256d5e9aadc1c300c0388ad5b9682b874dd957f731bde6642c649b044b16ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"amed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771581313\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771581313\\\\\\\\\\\\\\\" (2026-02-20 08:55:12 +0000 UTC to 2027-02-20 08:55:12 +0000 UTC (now=2026-02-20 09:55:28.067421835 +0000 UTC))\\\\\\\"\\\\nI0220 09:55:28.067484 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0220 09:55:28.067516 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0220 09:55:28.067549 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067613 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067655 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-161790428/tls.crt::/tmp/serving-cert-161790428/tls.key\\\\\\\"\\\\nI0220 09:55:28.067781 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0220 09:55:28.067827 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067825 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067865 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0220 09:55:28.067840 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0220 09:55:28.067995 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0220 09:55:28.068006 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0220 09:55:28.068582 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": 
tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:02Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.681091 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.681150 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.681167 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.681196 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.681217 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:02Z","lastTransitionTime":"2026-02-20T09:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.685125 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0820106bbb2993145f688614128fa51c5778e9953ee5fc00edba04ff06f3f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:02Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.706819 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:02Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.726318 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7065599d0f9481d5becd2057dc2e557e5fb6f4ef22533c1e431708ee711c5aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:02Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.745353 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://650ed0852746d520fbaa4ec277717c5af3916ba72fb82349af01bdd53ed1916b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5aaa8582764286e5a930fbc4754f253b044e997b856840685dde84db1aef3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:02Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.763907 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wqwgj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1957ac70-30f9-48c2-a82b-72aa3b7a883a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4f1f32eda1cc801a5b1f6d84207120c48c1bcca494a88bdfa95fb05bf82f661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwxzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wqwgj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:02Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.784224 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.784283 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.784304 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.784334 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.784354 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:02Z","lastTransitionTime":"2026-02-20T09:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.791868 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2abd2b70-bb78-49a0-b930-cd066384e803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3926877d9025cb33d86d3651716fce6ab1e9aa64
67a094f0f6b463db0564bf0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3926877d9025cb33d86d3651716fce6ab1e9aa6467a094f0f6b463db0564bf0a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T09:56:01Z\\\",\\\"message\\\":\\\") from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:56:01.196026 6624 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:56:01.196302 6624 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:56:01.196674 6624 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:56:01.196758 6624 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:56:01.197158 6624 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0220 09:56:01.197199 6624 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0220 09:56:01.197204 6624 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0220 09:56:01.197228 6624 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0220 09:56:01.197225 6624 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0220 09:56:01.197265 6624 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0220 09:56:01.197299 6624 factory.go:656] Stopping watch factory\\\\nI0220 09:56:01.197317 6624 ovnkube.go:599] Stopped ovnkube\\\\nI0220 09:56:01.197319 6624 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0220 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:56:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-99b2s_openshift-ovn-kubernetes(2abd2b70-bb78-49a0-b930-cd066384e803)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-99b2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:02Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.807031 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-htkbf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8526746c-450b-4df8-8ea1-f0cbabd13894\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c50de88e0429b7ecb3939db90dc49ca006cd7d071d9cc97beb31ca64028b9f00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx5m2
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d5caed5e5e9e66911552cd6f1b7482cc842f5fc1b59863a208fe32ea87303d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx5m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-htkbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:02Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.822496 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:02Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.836093 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s8xxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a431054f-57c5-41b7-93b2-2d2fbf9949ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15f0d587b054c3050286b2607687a4066cdd413e1623326cf355f9d93b10a2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9fz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s8xxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-20T09:56:02Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.848507 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hxb97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0e53ce-e004-473e-be85-ef4c83e399c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e690a224056186e38fa1d7f83d5380f36a05bfff839ca53a87701a721c0c7c0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c27br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hxb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:02Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.867233 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de460dbc-9080-4eb3-b509-c43e81162de4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5051fa1280c3183976930435fb2b839acf93914df2c1e93cb28ae17755d534ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f4690722fa343ac9bbe83f1591ce49040d2b6cbe966fc31757944ba3befbbce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc5a98df99b3ac300355de8efa6cafb2182a2460152990a89b4125a7aef7b850\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9264e0c637c9449882f25caf523f00112c372fad6125a2b4b1b541a515a897dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:02Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.884236 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:02Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.892890 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.893004 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.893039 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.893089 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.893120 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:02Z","lastTransitionTime":"2026-02-20T09:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.908044 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef72d73c-d177-4436-b681-83866e1f6d12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfad5fd92783e0af12f28bd81ccc67f1cf757d57723d98f8fea4f02dc0fea8b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7hj8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:02Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.925846 4962 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-m9d46" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"751d5e0b-919c-4777-8475-ed7214f7647f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6a54fc4a37387541acda22f8ca81ddfe54e7a9ea682a31abf5e1ba2787f2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dffe20e45069ebd85e9ad49e365ac5180c1e43d02340190615a38900f527e432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m9d46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:02Z is after 2025-08-24T17:21:41Z" Feb 20 
09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.996889 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.996936 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.996950 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.996980 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:02 crc kubenswrapper[4962]: I0220 09:56:02.996997 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:02Z","lastTransitionTime":"2026-02-20T09:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:03 crc kubenswrapper[4962]: I0220 09:56:03.090169 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 12:24:29.16089228 +0000 UTC Feb 20 09:56:03 crc kubenswrapper[4962]: I0220 09:56:03.100414 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:03 crc kubenswrapper[4962]: I0220 09:56:03.100483 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:03 crc kubenswrapper[4962]: I0220 09:56:03.100503 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:03 crc kubenswrapper[4962]: I0220 09:56:03.100532 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:03 crc kubenswrapper[4962]: I0220 09:56:03.100558 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:03Z","lastTransitionTime":"2026-02-20T09:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:03 crc kubenswrapper[4962]: I0220 09:56:03.204010 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:03 crc kubenswrapper[4962]: I0220 09:56:03.204109 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:03 crc kubenswrapper[4962]: I0220 09:56:03.204136 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:03 crc kubenswrapper[4962]: I0220 09:56:03.204175 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:03 crc kubenswrapper[4962]: I0220 09:56:03.204196 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:03Z","lastTransitionTime":"2026-02-20T09:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:03 crc kubenswrapper[4962]: I0220 09:56:03.308528 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:03 crc kubenswrapper[4962]: I0220 09:56:03.308637 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:03 crc kubenswrapper[4962]: I0220 09:56:03.308666 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:03 crc kubenswrapper[4962]: I0220 09:56:03.308705 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:03 crc kubenswrapper[4962]: I0220 09:56:03.308735 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:03Z","lastTransitionTime":"2026-02-20T09:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:03 crc kubenswrapper[4962]: I0220 09:56:03.415452 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:03 crc kubenswrapper[4962]: I0220 09:56:03.415541 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:03 crc kubenswrapper[4962]: I0220 09:56:03.415567 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:03 crc kubenswrapper[4962]: I0220 09:56:03.415638 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:03 crc kubenswrapper[4962]: I0220 09:56:03.415667 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:03Z","lastTransitionTime":"2026-02-20T09:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:03 crc kubenswrapper[4962]: I0220 09:56:03.519130 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:03 crc kubenswrapper[4962]: I0220 09:56:03.519211 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:03 crc kubenswrapper[4962]: I0220 09:56:03.519231 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:03 crc kubenswrapper[4962]: I0220 09:56:03.519266 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:03 crc kubenswrapper[4962]: I0220 09:56:03.519291 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:03Z","lastTransitionTime":"2026-02-20T09:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:03 crc kubenswrapper[4962]: I0220 09:56:03.622925 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:03 crc kubenswrapper[4962]: I0220 09:56:03.622985 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:03 crc kubenswrapper[4962]: I0220 09:56:03.623002 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:03 crc kubenswrapper[4962]: I0220 09:56:03.623027 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:03 crc kubenswrapper[4962]: I0220 09:56:03.623046 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:03Z","lastTransitionTime":"2026-02-20T09:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:03 crc kubenswrapper[4962]: I0220 09:56:03.726696 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:03 crc kubenswrapper[4962]: I0220 09:56:03.726766 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:03 crc kubenswrapper[4962]: I0220 09:56:03.726812 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:03 crc kubenswrapper[4962]: I0220 09:56:03.726841 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:03 crc kubenswrapper[4962]: I0220 09:56:03.726864 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:03Z","lastTransitionTime":"2026-02-20T09:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:03 crc kubenswrapper[4962]: I0220 09:56:03.830675 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:03 crc kubenswrapper[4962]: I0220 09:56:03.830748 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:03 crc kubenswrapper[4962]: I0220 09:56:03.830766 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:03 crc kubenswrapper[4962]: I0220 09:56:03.830820 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:03 crc kubenswrapper[4962]: I0220 09:56:03.830840 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:03Z","lastTransitionTime":"2026-02-20T09:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:03 crc kubenswrapper[4962]: I0220 09:56:03.934234 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:03 crc kubenswrapper[4962]: I0220 09:56:03.934292 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:03 crc kubenswrapper[4962]: I0220 09:56:03.934311 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:03 crc kubenswrapper[4962]: I0220 09:56:03.934337 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:03 crc kubenswrapper[4962]: I0220 09:56:03.934356 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:03Z","lastTransitionTime":"2026-02-20T09:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.039639 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.039706 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.039724 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.039751 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.039770 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:04Z","lastTransitionTime":"2026-02-20T09:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.091018 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 23:18:20.472944933 +0000 UTC Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.138330 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.138359 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.138585 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:56:04 crc kubenswrapper[4962]: E0220 09:56:04.138665 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:56:04 crc kubenswrapper[4962]: E0220 09:56:04.138891 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76" Feb 20 09:56:04 crc kubenswrapper[4962]: E0220 09:56:04.139091 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.139304 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:56:04 crc kubenswrapper[4962]: E0220 09:56:04.139744 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.142930 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.143052 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.143133 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.143248 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.143348 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:04Z","lastTransitionTime":"2026-02-20T09:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.247211 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.247274 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.247292 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.247320 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.247343 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:04Z","lastTransitionTime":"2026-02-20T09:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.350314 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.350376 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.350396 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.350422 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.350440 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:04Z","lastTransitionTime":"2026-02-20T09:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.454068 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.454132 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.454141 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.454164 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.454179 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:04Z","lastTransitionTime":"2026-02-20T09:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.558653 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.558745 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.558755 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.558773 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.558782 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:04Z","lastTransitionTime":"2026-02-20T09:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.662015 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.662068 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.662080 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.662106 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.662121 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:04Z","lastTransitionTime":"2026-02-20T09:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.766466 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.766538 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.766567 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.766625 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.766646 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:04Z","lastTransitionTime":"2026-02-20T09:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.870225 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.870297 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.870314 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.870340 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.870360 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:04Z","lastTransitionTime":"2026-02-20T09:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.979737 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.979818 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.979839 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.979868 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:04 crc kubenswrapper[4962]: I0220 09:56:04.979888 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:04Z","lastTransitionTime":"2026-02-20T09:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.084293 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.084358 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.084375 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.084402 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.084421 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:05Z","lastTransitionTime":"2026-02-20T09:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.091549 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 20:27:14.20778628 +0000 UTC Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.187328 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.187387 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.187404 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.187429 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.187449 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:05Z","lastTransitionTime":"2026-02-20T09:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.291406 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.291473 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.291495 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.291525 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.291545 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:05Z","lastTransitionTime":"2026-02-20T09:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.354548 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.354689 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.354715 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.354744 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.354764 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:05Z","lastTransitionTime":"2026-02-20T09:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:05 crc kubenswrapper[4962]: E0220 09:56:05.375706 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3742e1d0-bedb-4f62-b44c-22d7df4a090f\\\",\\\"systemUUID\\\":\\\"0de0f937-0896-4e78-90b7-d2d7a1bed2a9\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:05Z is after 
2025-08-24T17:21:41Z" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.382890 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.382982 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.383004 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.383032 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.383057 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:05Z","lastTransitionTime":"2026-02-20T09:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:05 crc kubenswrapper[4962]: E0220 09:56:05.401866 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3742e1d0-bedb-4f62-b44c-22d7df4a090f\\\",\\\"systemUUID\\\":\\\"0de0f937-0896-4e78-90b7-d2d7a1bed2a9\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:05Z is after 
2025-08-24T17:21:41Z" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.407632 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.407715 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.407735 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.407765 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.407789 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:05Z","lastTransitionTime":"2026-02-20T09:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:05 crc kubenswrapper[4962]: E0220 09:56:05.427209 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
2025-08-24T17:21:41Z" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.431399 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.431431 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.431444 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.431462 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.431475 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:05Z","lastTransitionTime":"2026-02-20T09:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:05 crc kubenswrapper[4962]: E0220 09:56:05.450341 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
2025-08-24T17:21:41Z" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.456074 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.456118 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.456133 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.456152 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.456165 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:05Z","lastTransitionTime":"2026-02-20T09:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:05 crc kubenswrapper[4962]: E0220 09:56:05.473669 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
2025-08-24T17:21:41Z" Feb 20 09:56:05 crc kubenswrapper[4962]: E0220 09:56:05.473826 4962 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.476148 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.476214 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.476233 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.476264 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.476283 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:05Z","lastTransitionTime":"2026-02-20T09:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.578671 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.578717 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.578729 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.578748 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.578762 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:05Z","lastTransitionTime":"2026-02-20T09:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.681694 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.681757 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.681775 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.681800 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.681817 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:05Z","lastTransitionTime":"2026-02-20T09:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.785503 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.785576 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.785632 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.785664 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.785682 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:05Z","lastTransitionTime":"2026-02-20T09:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.889052 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.889120 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.889139 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.889165 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.889185 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:05Z","lastTransitionTime":"2026-02-20T09:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.993387 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.993488 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.993514 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.993549 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:05 crc kubenswrapper[4962]: I0220 09:56:05.993572 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:05Z","lastTransitionTime":"2026-02-20T09:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.092372 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 17:52:43.399918635 +0000 UTC Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.097426 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.097496 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.097510 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.097538 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.097554 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:06Z","lastTransitionTime":"2026-02-20T09:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.138025 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.138120 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.138186 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.138318 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:56:06 crc kubenswrapper[4962]: E0220 09:56:06.138326 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:56:06 crc kubenswrapper[4962]: E0220 09:56:06.138485 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:56:06 crc kubenswrapper[4962]: E0220 09:56:06.138664 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76" Feb 20 09:56:06 crc kubenswrapper[4962]: E0220 09:56:06.138784 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.201524 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.201584 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.201630 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.201655 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.201673 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:06Z","lastTransitionTime":"2026-02-20T09:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.305783 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.305839 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.305857 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.305882 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.305901 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:06Z","lastTransitionTime":"2026-02-20T09:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.409667 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.410020 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.410092 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.410178 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.410251 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:06Z","lastTransitionTime":"2026-02-20T09:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.514024 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.514086 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.514103 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.514128 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.514150 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:06Z","lastTransitionTime":"2026-02-20T09:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.617582 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.617712 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.617731 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.617765 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.617788 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:06Z","lastTransitionTime":"2026-02-20T09:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.721514 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.721649 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.721679 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.721719 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.721746 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:06Z","lastTransitionTime":"2026-02-20T09:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.824549 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.824660 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.824679 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.824712 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.824773 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:06Z","lastTransitionTime":"2026-02-20T09:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.927716 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.927785 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.927805 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.927830 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:06 crc kubenswrapper[4962]: I0220 09:56:06.927850 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:06Z","lastTransitionTime":"2026-02-20T09:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.032069 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.032134 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.032151 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.032176 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.032195 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:07Z","lastTransitionTime":"2026-02-20T09:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.093349 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 23:59:05.065333238 +0000 UTC Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.135638 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.135751 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.135782 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.135813 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.135831 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:07Z","lastTransitionTime":"2026-02-20T09:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.240365 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.240435 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.240453 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.240479 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.240498 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:07Z","lastTransitionTime":"2026-02-20T09:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.344288 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.344361 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.344384 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.344419 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.344443 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:07Z","lastTransitionTime":"2026-02-20T09:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.448576 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.448673 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.448732 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.448762 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.448784 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:07Z","lastTransitionTime":"2026-02-20T09:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.552063 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.552092 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.552102 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.552118 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.552128 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:07Z","lastTransitionTime":"2026-02-20T09:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.655424 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.655475 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.655490 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.655511 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.655527 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:07Z","lastTransitionTime":"2026-02-20T09:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.758159 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.758221 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.758239 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.758271 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.758290 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:07Z","lastTransitionTime":"2026-02-20T09:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.861493 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.861544 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.861557 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.861578 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.861616 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:07Z","lastTransitionTime":"2026-02-20T09:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.965278 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.965348 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.965364 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.965392 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:07 crc kubenswrapper[4962]: I0220 09:56:07.965410 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:07Z","lastTransitionTime":"2026-02-20T09:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.069167 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.069248 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.069267 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.069296 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.069310 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:08Z","lastTransitionTime":"2026-02-20T09:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.094530 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 23:45:16.702331302 +0000 UTC Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.138334 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.138485 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:56:08 crc kubenswrapper[4962]: E0220 09:56:08.138532 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.138495 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.138485 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:56:08 crc kubenswrapper[4962]: E0220 09:56:08.138764 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76" Feb 20 09:56:08 crc kubenswrapper[4962]: E0220 09:56:08.138884 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:56:08 crc kubenswrapper[4962]: E0220 09:56:08.139064 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.173066 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.173118 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.173131 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.173153 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.173168 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:08Z","lastTransitionTime":"2026-02-20T09:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.282692 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.282785 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.282817 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.282851 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.282874 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:08Z","lastTransitionTime":"2026-02-20T09:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.388720 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.389072 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.389335 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.389514 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.389752 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:08Z","lastTransitionTime":"2026-02-20T09:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.493810 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.493887 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.493907 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.493945 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.493968 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:08Z","lastTransitionTime":"2026-02-20T09:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.597320 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.597384 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.597402 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.597433 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.597458 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:08Z","lastTransitionTime":"2026-02-20T09:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.701335 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.701396 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.701414 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.701439 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.701457 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:08Z","lastTransitionTime":"2026-02-20T09:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.805490 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.805567 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.805586 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.805645 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.805665 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:08Z","lastTransitionTime":"2026-02-20T09:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.909214 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.909260 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.909269 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.909286 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:08 crc kubenswrapper[4962]: I0220 09:56:08.909299 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:08Z","lastTransitionTime":"2026-02-20T09:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.013021 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.013092 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.013111 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.013138 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.013157 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:09Z","lastTransitionTime":"2026-02-20T09:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.095781 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 04:21:55.159768185 +0000 UTC Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.117028 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.117075 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.117086 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.117108 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.117121 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:09Z","lastTransitionTime":"2026-02-20T09:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.160068 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03611a46-7966-4587-950e-1d1f967c48c4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7a7a4444fd319ad977e6ac955aeb09b7fa9bc300586a60ba42b1ddb7c823b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b7038b1d182118313e4f3a5d272559ce949ae0b69f819883a0ce752314855b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440
c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633d729c617b18ae2841e008b42a6a039b118aef8d357c2bbfaba8a445a417c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee212904f6af57b0dc28dc3da6bf037fb0dfb92937bd5789b8dcb03ea820f62f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee212904f6af57b0dc28dc3da6bf037fb0dfb92937bd5789b8dcb03ea820f62f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:09Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.177364 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5bwk2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d590527b-ed56-4fb4-a712-b09781618a76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjn55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjn55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:43Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5bwk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:09Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.197667 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://650ed0852746d520fbaa4ec277717c5af3916ba72fb82349af01bdd53ed1916b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5aaa8582764286e5a930fbc4754f253b044e997b856840685dde84db1aef3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:09Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.220072 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wqwgj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1957ac70-30f9-48c2-a82b-72aa3b7a883a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4f1f32eda1cc801a5b1f6d84207120c48c1bcca494a88bdfa95fb05bf82f661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwxzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wqwgj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:09Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.221315 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.221346 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.221357 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.221378 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.221391 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:09Z","lastTransitionTime":"2026-02-20T09:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.255245 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2abd2b70-bb78-49a0-b930-cd066384e803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3926877d9025cb33d86d3651716fce6ab1e9aa64
67a094f0f6b463db0564bf0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3926877d9025cb33d86d3651716fce6ab1e9aa6467a094f0f6b463db0564bf0a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T09:56:01Z\\\",\\\"message\\\":\\\") from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:56:01.196026 6624 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:56:01.196302 6624 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:56:01.196674 6624 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:56:01.196758 6624 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:56:01.197158 6624 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0220 09:56:01.197199 6624 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0220 09:56:01.197204 6624 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0220 09:56:01.197228 6624 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0220 09:56:01.197225 6624 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0220 09:56:01.197265 6624 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0220 09:56:01.197299 6624 factory.go:656] Stopping watch factory\\\\nI0220 09:56:01.197317 6624 ovnkube.go:599] Stopped ovnkube\\\\nI0220 09:56:01.197319 6624 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0220 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:56:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-99b2s_openshift-ovn-kubernetes(2abd2b70-bb78-49a0-b930-cd066384e803)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-99b2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:09Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.293302 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63edef29-013b-4e23-bd29-1e62958c425a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652942c0482deea28d2e07207dcdf38e3f5638a0eea5bb58b73671e39410b99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89d70ad241c203849bc1b2aa
38409e9d5a16e13831e18a4f06bb2e5012b84910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a24ab72e4a25f2b29927eeb2552b6b091c3384a85ecedea7a4de9df19253ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9768dcc00aee095460ac27438e2ee85e824a9b21eee8c2fa137cd7de5de792ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55affbe6d7d089b636f77381908573fe22fb7a903776e26db373f50523280cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:09Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.311555 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"deab583c-05c7-4b7e-a3f6-c01081b17127\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3256d5e9aadc1c300c0388ad5b9682b874dd957f731bde6642c649b044b16ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"amed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771581313\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771581313\\\\\\\\\\\\\\\" (2026-02-20 08:55:12 +0000 UTC to 2027-02-20 08:55:12 +0000 UTC (now=2026-02-20 09:55:28.067421835 +0000 UTC))\\\\\\\"\\\\nI0220 09:55:28.067484 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0220 09:55:28.067516 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0220 09:55:28.067549 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067613 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067655 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-161790428/tls.crt::/tmp/serving-cert-161790428/tls.key\\\\\\\"\\\\nI0220 09:55:28.067781 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0220 09:55:28.067827 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067825 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067865 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0220 09:55:28.067840 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0220 09:55:28.067995 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0220 09:55:28.068006 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0220 09:55:28.068582 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:09Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.325336 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.325423 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.325451 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.325491 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.325518 4962 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:09Z","lastTransitionTime":"2026-02-20T09:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.328270 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0820106bbb2993145f688614128fa51c5778e9953ee5fc00edba04ff06f3f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:09Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.346382 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:09Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.362910 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7065599d0f9481d5becd2057dc2e557e5fb6f4ef22533c1e431708ee711c5aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:09Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.378529 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-htkbf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8526746c-450b-4df8-8ea1-f0cbabd13894\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c50de88e0429b7ecb3939db90dc49ca006cd7d071d9cc97beb31ca64028b9f00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx5m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d5caed5e5e9e66911552cd6f1b7482cc842f5fc1b59863a208fe32ea87303d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx5m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-htkbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:09Z is after 2025-08-24T17:21:41Z" Feb 20 
09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.394481 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:09Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.412093 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s8xxr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a431054f-57c5-41b7-93b2-2d2fbf9949ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15f0d587b054c3050286b2607687a4066cdd413e1623326cf355f9d93b10a2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9fz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s8xxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:09Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.427958 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hxb97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0e53ce-e004-473e-be85-ef4c83e399c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e690a224056186e38fa1d7f83d5380f36a05bfff839ca53a87701a721c0c7c0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c27br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hxb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:09Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.429457 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.429517 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.429541 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.429576 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.429635 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:09Z","lastTransitionTime":"2026-02-20T09:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.444837 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de460dbc-9080-4eb3-b509-c43e81162de4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5051fa1280c3183976930435fb2b839acf93914df2c1e93cb28ae17755d534ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f4690722fa343ac9bbe83f1591ce49040d2b6cbe966fc31757944ba3befbbce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc5a98df99b3ac300355de8efa6cafb2182a2460152990a89b4125a7aef7b850\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10
Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9264e0c637c9449882f25caf523f00112c372fad6125a2b4b1b541a515a897dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:09Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.466272 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:09Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.489369 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef72d73c-d177-4436-b681-83866e1f6d12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfad5fd92783e0af12f28bd81ccc67f1cf757d57723d98f8fea4f02dc0fea8b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"
system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7hj8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:09Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.506732 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"751d5e0b-919c-4777-8475-ed7214f7647f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6a54fc4a37387541acda22f8ca81ddfe54e7a9ea682a31abf5e1ba2787f2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dffe20e45069ebd85e9ad49e365ac5180c1e43d02340190615a38900f527e432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootf
s\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m9d46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:09Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.532980 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.533019 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.533035 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.533059 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.533074 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:09Z","lastTransitionTime":"2026-02-20T09:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.636551 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.636696 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.636730 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.636767 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.636791 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:09Z","lastTransitionTime":"2026-02-20T09:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.740549 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.740655 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.740731 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.740768 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.740789 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:09Z","lastTransitionTime":"2026-02-20T09:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.844356 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.844426 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.844449 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.844480 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.844502 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:09Z","lastTransitionTime":"2026-02-20T09:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.949311 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.949385 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.949408 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.949441 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:09 crc kubenswrapper[4962]: I0220 09:56:09.949464 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:09Z","lastTransitionTime":"2026-02-20T09:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.053142 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.053198 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.053219 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.053245 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.053263 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:10Z","lastTransitionTime":"2026-02-20T09:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.096729 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 20:20:51.506824202 +0000 UTC Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.138486 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.138528 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:56:10 crc kubenswrapper[4962]: E0220 09:56:10.138773 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:56:10 crc kubenswrapper[4962]: E0220 09:56:10.138907 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.139124 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:56:10 crc kubenswrapper[4962]: E0220 09:56:10.139273 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.139565 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:56:10 crc kubenswrapper[4962]: E0220 09:56:10.139880 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76" Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.156940 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.157001 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.157020 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.157054 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.157078 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:10Z","lastTransitionTime":"2026-02-20T09:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.260722 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.260780 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.260800 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.260829 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.260848 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:10Z","lastTransitionTime":"2026-02-20T09:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.363822 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.363936 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.363954 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.363980 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.364059 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:10Z","lastTransitionTime":"2026-02-20T09:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.467864 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.467921 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.467937 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.467964 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.467982 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:10Z","lastTransitionTime":"2026-02-20T09:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.572180 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.572253 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.572270 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.572296 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.572318 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:10Z","lastTransitionTime":"2026-02-20T09:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.675253 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.675333 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.675353 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.675382 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.675402 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:10Z","lastTransitionTime":"2026-02-20T09:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.779933 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.780001 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.780018 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.780043 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.780059 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:10Z","lastTransitionTime":"2026-02-20T09:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.882895 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.882966 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.882981 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.883361 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.883383 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:10Z","lastTransitionTime":"2026-02-20T09:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.987535 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.987661 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.987679 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.987736 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:10 crc kubenswrapper[4962]: I0220 09:56:10.987755 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:10Z","lastTransitionTime":"2026-02-20T09:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:11 crc kubenswrapper[4962]: I0220 09:56:11.091770 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:11 crc kubenswrapper[4962]: I0220 09:56:11.091852 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:11 crc kubenswrapper[4962]: I0220 09:56:11.091870 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:11 crc kubenswrapper[4962]: I0220 09:56:11.091925 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:11 crc kubenswrapper[4962]: I0220 09:56:11.091944 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:11Z","lastTransitionTime":"2026-02-20T09:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:11 crc kubenswrapper[4962]: I0220 09:56:11.097256 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 04:44:22.325489916 +0000 UTC Feb 20 09:56:11 crc kubenswrapper[4962]: I0220 09:56:11.195295 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:11 crc kubenswrapper[4962]: I0220 09:56:11.195362 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:11 crc kubenswrapper[4962]: I0220 09:56:11.195385 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:11 crc kubenswrapper[4962]: I0220 09:56:11.195412 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:11 crc kubenswrapper[4962]: I0220 09:56:11.195433 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:11Z","lastTransitionTime":"2026-02-20T09:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:11 crc kubenswrapper[4962]: I0220 09:56:11.297851 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:11 crc kubenswrapper[4962]: I0220 09:56:11.297907 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:11 crc kubenswrapper[4962]: I0220 09:56:11.297925 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:11 crc kubenswrapper[4962]: I0220 09:56:11.297955 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:11 crc kubenswrapper[4962]: I0220 09:56:11.297974 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:11Z","lastTransitionTime":"2026-02-20T09:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:11 crc kubenswrapper[4962]: I0220 09:56:11.400549 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:11 crc kubenswrapper[4962]: I0220 09:56:11.400624 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:11 crc kubenswrapper[4962]: I0220 09:56:11.400673 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:11 crc kubenswrapper[4962]: I0220 09:56:11.400700 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:11 crc kubenswrapper[4962]: I0220 09:56:11.400718 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:11Z","lastTransitionTime":"2026-02-20T09:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:11 crc kubenswrapper[4962]: I0220 09:56:11.504142 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:11 crc kubenswrapper[4962]: I0220 09:56:11.504189 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:11 crc kubenswrapper[4962]: I0220 09:56:11.504207 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:11 crc kubenswrapper[4962]: I0220 09:56:11.504234 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:11 crc kubenswrapper[4962]: I0220 09:56:11.504251 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:11Z","lastTransitionTime":"2026-02-20T09:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:11 crc kubenswrapper[4962]: I0220 09:56:11.607255 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:11 crc kubenswrapper[4962]: I0220 09:56:11.607303 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:11 crc kubenswrapper[4962]: I0220 09:56:11.607320 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:11 crc kubenswrapper[4962]: I0220 09:56:11.607346 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:11 crc kubenswrapper[4962]: I0220 09:56:11.607363 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:11Z","lastTransitionTime":"2026-02-20T09:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:11 crc kubenswrapper[4962]: I0220 09:56:11.710301 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:11 crc kubenswrapper[4962]: I0220 09:56:11.710379 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:11 crc kubenswrapper[4962]: I0220 09:56:11.710395 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:11 crc kubenswrapper[4962]: I0220 09:56:11.710420 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:11 crc kubenswrapper[4962]: I0220 09:56:11.710439 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:11Z","lastTransitionTime":"2026-02-20T09:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:11 crc kubenswrapper[4962]: I0220 09:56:11.814525 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:11 crc kubenswrapper[4962]: I0220 09:56:11.814642 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:11 crc kubenswrapper[4962]: I0220 09:56:11.814671 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:11 crc kubenswrapper[4962]: I0220 09:56:11.814705 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:11 crc kubenswrapper[4962]: I0220 09:56:11.814730 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:11Z","lastTransitionTime":"2026-02-20T09:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:11 crc kubenswrapper[4962]: I0220 09:56:11.918008 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:11 crc kubenswrapper[4962]: I0220 09:56:11.918087 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:11 crc kubenswrapper[4962]: I0220 09:56:11.918125 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:11 crc kubenswrapper[4962]: I0220 09:56:11.918152 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:11 crc kubenswrapper[4962]: I0220 09:56:11.918173 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:11Z","lastTransitionTime":"2026-02-20T09:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.021080 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.021161 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.021186 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.021256 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.021313 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:12Z","lastTransitionTime":"2026-02-20T09:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.097907 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 12:57:41.425293567 +0000 UTC Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.124179 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.124239 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.124258 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.124300 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.124319 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:12Z","lastTransitionTime":"2026-02-20T09:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.138646 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.138713 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.138738 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.138678 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:56:12 crc kubenswrapper[4962]: E0220 09:56:12.138858 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:56:12 crc kubenswrapper[4962]: E0220 09:56:12.139029 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:56:12 crc kubenswrapper[4962]: E0220 09:56:12.139290 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76" Feb 20 09:56:12 crc kubenswrapper[4962]: E0220 09:56:12.139322 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.227228 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.227288 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.227307 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.227331 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.227348 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:12Z","lastTransitionTime":"2026-02-20T09:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.330488 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.330540 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.330557 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.330583 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.330633 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:12Z","lastTransitionTime":"2026-02-20T09:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.433924 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.433989 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.434007 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.434034 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.434052 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:12Z","lastTransitionTime":"2026-02-20T09:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.540355 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.540431 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.540445 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.540464 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.540478 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:12Z","lastTransitionTime":"2026-02-20T09:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.643200 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.643236 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.643246 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.643264 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.643274 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:12Z","lastTransitionTime":"2026-02-20T09:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.745456 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.745505 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.745516 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.745535 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.745545 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:12Z","lastTransitionTime":"2026-02-20T09:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.847716 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.847750 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.847758 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.847773 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.847782 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:12Z","lastTransitionTime":"2026-02-20T09:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.950045 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.950080 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.950090 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.950110 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:12 crc kubenswrapper[4962]: I0220 09:56:12.950124 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:12Z","lastTransitionTime":"2026-02-20T09:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.052255 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.052294 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.052308 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.052327 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.052337 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:13Z","lastTransitionTime":"2026-02-20T09:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.099046 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 20:31:06.910593073 +0000 UTC Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.154202 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.154224 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.154232 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.154244 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.154253 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:13Z","lastTransitionTime":"2026-02-20T09:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.257432 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.257470 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.257481 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.257499 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.257512 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:13Z","lastTransitionTime":"2026-02-20T09:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.360715 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.360770 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.360788 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.360816 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.360834 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:13Z","lastTransitionTime":"2026-02-20T09:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.463661 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.463706 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.463715 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.463734 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.463746 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:13Z","lastTransitionTime":"2026-02-20T09:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.567464 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.567533 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.567552 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.567578 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.567624 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:13Z","lastTransitionTime":"2026-02-20T09:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.670387 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.670421 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.670431 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.670449 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.670458 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:13Z","lastTransitionTime":"2026-02-20T09:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.773360 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.773404 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.773430 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.773449 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.773462 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:13Z","lastTransitionTime":"2026-02-20T09:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.876902 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.876964 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.876982 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.877003 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.877016 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:13Z","lastTransitionTime":"2026-02-20T09:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.979530 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.979578 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.979601 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.979621 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:13 crc kubenswrapper[4962]: I0220 09:56:13.979638 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:13Z","lastTransitionTime":"2026-02-20T09:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.082322 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.082386 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.082396 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.082414 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.082425 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:14Z","lastTransitionTime":"2026-02-20T09:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.099574 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 02:11:27.900707408 +0000 UTC Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.138683 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.138730 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.138760 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.138694 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:56:14 crc kubenswrapper[4962]: E0220 09:56:14.138841 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76" Feb 20 09:56:14 crc kubenswrapper[4962]: E0220 09:56:14.138928 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:56:14 crc kubenswrapper[4962]: E0220 09:56:14.139020 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:56:14 crc kubenswrapper[4962]: E0220 09:56:14.139192 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.184335 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.184375 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.184385 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.184404 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.184417 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:14Z","lastTransitionTime":"2026-02-20T09:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.288097 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.288143 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.288153 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.288172 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.288182 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:14Z","lastTransitionTime":"2026-02-20T09:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.391405 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.391447 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.391458 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.391476 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.391491 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:14Z","lastTransitionTime":"2026-02-20T09:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.493953 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.494009 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.494022 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.494043 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.494058 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:14Z","lastTransitionTime":"2026-02-20T09:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.596454 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.596510 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.596523 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.596545 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.596559 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:14Z","lastTransitionTime":"2026-02-20T09:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.698656 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.698711 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.698724 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.698746 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.698761 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:14Z","lastTransitionTime":"2026-02-20T09:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.804767 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.804917 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.804962 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.805004 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.805030 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:14Z","lastTransitionTime":"2026-02-20T09:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.908814 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.908864 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.908903 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.908923 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:14 crc kubenswrapper[4962]: I0220 09:56:14.908931 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:14Z","lastTransitionTime":"2026-02-20T09:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.011805 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.011842 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.011853 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.011891 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.011901 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:15Z","lastTransitionTime":"2026-02-20T09:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.099817 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 08:29:47.819138235 +0000 UTC Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.114867 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.114924 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.114936 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.114974 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.114989 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:15Z","lastTransitionTime":"2026-02-20T09:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.139285 4962 scope.go:117] "RemoveContainer" containerID="3926877d9025cb33d86d3651716fce6ab1e9aa6467a094f0f6b463db0564bf0a" Feb 20 09:56:15 crc kubenswrapper[4962]: E0220 09:56:15.139461 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-99b2s_openshift-ovn-kubernetes(2abd2b70-bb78-49a0-b930-cd066384e803)\"" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.218498 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.218572 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.218590 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.218640 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.218658 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:15Z","lastTransitionTime":"2026-02-20T09:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.321217 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.321261 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.321272 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.321292 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.321302 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:15Z","lastTransitionTime":"2026-02-20T09:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.424419 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.424460 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.424471 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.424491 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.424504 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:15Z","lastTransitionTime":"2026-02-20T09:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.527206 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.527244 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.527252 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.527275 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.527287 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:15Z","lastTransitionTime":"2026-02-20T09:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.630253 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.630301 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.630327 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.630349 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.630360 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:15Z","lastTransitionTime":"2026-02-20T09:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.661115 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d590527b-ed56-4fb4-a712-b09781618a76-metrics-certs\") pod \"network-metrics-daemon-5bwk2\" (UID: \"d590527b-ed56-4fb4-a712-b09781618a76\") " pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:56:15 crc kubenswrapper[4962]: E0220 09:56:15.661396 4962 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 20 09:56:15 crc kubenswrapper[4962]: E0220 09:56:15.661471 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d590527b-ed56-4fb4-a712-b09781618a76-metrics-certs podName:d590527b-ed56-4fb4-a712-b09781618a76 nodeName:}" failed. No retries permitted until 2026-02-20 09:56:47.661449782 +0000 UTC m=+99.243921668 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d590527b-ed56-4fb4-a712-b09781618a76-metrics-certs") pod "network-metrics-daemon-5bwk2" (UID: "d590527b-ed56-4fb4-a712-b09781618a76") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.691674 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.691794 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.691813 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.691844 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.691888 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:15Z","lastTransitionTime":"2026-02-20T09:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:15 crc kubenswrapper[4962]: E0220 09:56:15.705697 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3742e1d0-bedb-4f62-b44c-22d7df4a090f\\\",\\\"systemUUID\\\":\\\"0de0f937-0896-4e78-90b7-d2d7a1bed2a9\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:15Z is after 
2025-08-24T17:21:41Z" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.709985 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.710041 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.710059 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.710085 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.710102 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:15Z","lastTransitionTime":"2026-02-20T09:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:15 crc kubenswrapper[4962]: E0220 09:56:15.722749 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3742e1d0-bedb-4f62-b44c-22d7df4a090f\\\",\\\"systemUUID\\\":\\\"0de0f937-0896-4e78-90b7-d2d7a1bed2a9\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:15Z is after 
2025-08-24T17:21:41Z" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.726806 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.726869 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.726903 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.726950 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.726968 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:15Z","lastTransitionTime":"2026-02-20T09:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:15 crc kubenswrapper[4962]: E0220 09:56:15.739337 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3742e1d0-bedb-4f62-b44c-22d7df4a090f\\\",\\\"systemUUID\\\":\\\"0de0f937-0896-4e78-90b7-d2d7a1bed2a9\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:15Z is after 
2025-08-24T17:21:41Z" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.744459 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.744539 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.744559 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.744586 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.744630 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:15Z","lastTransitionTime":"2026-02-20T09:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:15 crc kubenswrapper[4962]: E0220 09:56:15.758681 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3742e1d0-bedb-4f62-b44c-22d7df4a090f\\\",\\\"systemUUID\\\":\\\"0de0f937-0896-4e78-90b7-d2d7a1bed2a9\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:15Z is after 
2025-08-24T17:21:41Z" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.762256 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.762294 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.762311 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.762333 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.762350 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:15Z","lastTransitionTime":"2026-02-20T09:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:15 crc kubenswrapper[4962]: E0220 09:56:15.776389 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3742e1d0-bedb-4f62-b44c-22d7df4a090f\\\",\\\"systemUUID\\\":\\\"0de0f937-0896-4e78-90b7-d2d7a1bed2a9\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:15Z is after 
2025-08-24T17:21:41Z" Feb 20 09:56:15 crc kubenswrapper[4962]: E0220 09:56:15.776649 4962 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.778705 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.778740 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.778750 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.778770 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.778781 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:15Z","lastTransitionTime":"2026-02-20T09:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.881542 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.881576 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.881586 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.881619 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.881630 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:15Z","lastTransitionTime":"2026-02-20T09:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.985471 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.985537 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.985560 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.985642 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:15 crc kubenswrapper[4962]: I0220 09:56:15.985672 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:15Z","lastTransitionTime":"2026-02-20T09:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.098520 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.098556 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.098568 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.098587 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.098617 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:16Z","lastTransitionTime":"2026-02-20T09:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.100250 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 19:48:30.592522417 +0000 UTC Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.138682 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:56:16 crc kubenswrapper[4962]: E0220 09:56:16.138832 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.139035 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:56:16 crc kubenswrapper[4962]: E0220 09:56:16.139086 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.139186 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:56:16 crc kubenswrapper[4962]: E0220 09:56:16.139234 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.139345 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:56:16 crc kubenswrapper[4962]: E0220 09:56:16.139401 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76" Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.201235 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.201312 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.201323 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.201343 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.201366 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:16Z","lastTransitionTime":"2026-02-20T09:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.302930 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.302967 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.302976 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.302993 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.303003 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:16Z","lastTransitionTime":"2026-02-20T09:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.405928 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.406016 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.406032 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.406054 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.406073 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:16Z","lastTransitionTime":"2026-02-20T09:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.508275 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.508316 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.508328 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.508353 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.508364 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:16Z","lastTransitionTime":"2026-02-20T09:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.610479 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.610567 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.610583 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.610616 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.610630 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:16Z","lastTransitionTime":"2026-02-20T09:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.713681 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.713729 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.713742 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.713760 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.713772 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:16Z","lastTransitionTime":"2026-02-20T09:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.816174 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.816230 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.816243 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.816264 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.816275 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:16Z","lastTransitionTime":"2026-02-20T09:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.918777 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.918824 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.918835 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.918853 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:16 crc kubenswrapper[4962]: I0220 09:56:16.918868 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:16Z","lastTransitionTime":"2026-02-20T09:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.021387 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.021444 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.021457 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.021473 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.021485 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:17Z","lastTransitionTime":"2026-02-20T09:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.101243 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 12:53:44.286619126 +0000 UTC Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.124761 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.124801 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.124816 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.124834 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.124844 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:17Z","lastTransitionTime":"2026-02-20T09:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.227151 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.227219 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.227238 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.227264 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.227281 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:17Z","lastTransitionTime":"2026-02-20T09:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.329822 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.329899 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.329916 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.329938 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.329972 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:17Z","lastTransitionTime":"2026-02-20T09:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.432673 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.432719 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.432731 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.432750 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.432766 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:17Z","lastTransitionTime":"2026-02-20T09:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.535324 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.535409 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.535437 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.535469 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.535493 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:17Z","lastTransitionTime":"2026-02-20T09:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.623276 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wqwgj_1957ac70-30f9-48c2-a82b-72aa3b7a883a/kube-multus/0.log" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.623336 4962 generic.go:334] "Generic (PLEG): container finished" podID="1957ac70-30f9-48c2-a82b-72aa3b7a883a" containerID="e4f1f32eda1cc801a5b1f6d84207120c48c1bcca494a88bdfa95fb05bf82f661" exitCode=1 Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.623377 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wqwgj" event={"ID":"1957ac70-30f9-48c2-a82b-72aa3b7a883a","Type":"ContainerDied","Data":"e4f1f32eda1cc801a5b1f6d84207120c48c1bcca494a88bdfa95fb05bf82f661"} Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.623937 4962 scope.go:117] "RemoveContainer" containerID="e4f1f32eda1cc801a5b1f6d84207120c48c1bcca494a88bdfa95fb05bf82f661" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.636739 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de460dbc-9080-4eb3-b509-c43e81162de4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5051fa1280c3183976930435fb2b839acf93914df2c1e93cb28ae17755d534ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f4690722fa343ac9bbe83f1591ce49040d2b6cbe966fc31757944ba3befbbce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc5a98df99b3ac300355de8efa6cafb2182a2460152990a89b4125a7aef7b850\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9264e0c637c9449882f25caf523f00112c372fad6125a2b4b1b541a515a897dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:17Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.638572 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.638846 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.638865 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.638893 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.638909 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:17Z","lastTransitionTime":"2026-02-20T09:56:17Z","reason":"KubeletNotReady","message":"container runtime 
network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.651225 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:17Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.665181 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef72d73c-d177-4436-b681-83866e1f6d12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfad5fd92783e0af12f28bd81ccc67f1cf757d57723d98f8fea4f02dc0fea8b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7hj8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:17Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.676805 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"751d5e0b-919c-4777-8475-ed7214f7647f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6a54fc4a37387541acda22f8ca81ddfe54e7a9ea682a31abf5e1ba2787f2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dffe20e45069ebd85e9ad49e365ac5180c1e43d02340190615a38900f527e432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m9d46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:17Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.690895 4962 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03611a46-7966-4587-950e-1d1f967c48c4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7a7a4444fd319ad977e6ac955aeb09b7fa9bc300586a60ba42b1ddb7c823b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b7038b1d182118313e4f3a5d272559ce949ae0b69f819883a0ce752314855b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633d729c617b18ae2841e008b42a6a039b118aef8d357c2bbfaba8a445a417c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":
[{\\\"containerID\\\":\\\"cri-o://ee212904f6af57b0dc28dc3da6bf037fb0dfb92937bd5789b8dcb03ea820f62f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee212904f6af57b0dc28dc3da6bf037fb0dfb92937bd5789b8dcb03ea820f62f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:17Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.700987 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5bwk2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d590527b-ed56-4fb4-a712-b09781618a76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjn55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjn55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:43Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5bwk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:17Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.714351 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://650ed0852746d520fbaa4ec277717c5af3916ba72fb82349af01bdd53ed1916b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5aaa8582764286e5a930fbc4754f253b044e997b856840685dde84db1aef3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:17Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.725294 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wqwgj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1957ac70-30f9-48c2-a82b-72aa3b7a883a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4f1f32eda1cc801a5b1f6d84207120c48c1bcca494a88bdfa95fb05bf82f661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4f1f32eda1cc801a5b1f6d84207120c48c1bcca494a88bdfa95fb05bf82f661\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T09:56:17Z\\\",\\\"message\\\":\\\"2026-02-20T09:55:32+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b3f85854-3e0b-4926-bbae-cfacc0ecac44\\\\n2026-02-20T09:55:32+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b3f85854-3e0b-4926-bbae-cfacc0ecac44 to /host/opt/cni/bin/\\\\n2026-02-20T09:55:32Z [verbose] multus-daemon started\\\\n2026-02-20T09:55:32Z [verbose] Readiness Indicator file check\\\\n2026-02-20T09:56:17Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwxzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wqwgj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:17Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.741240 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.741293 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.741305 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.741325 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.741342 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:17Z","lastTransitionTime":"2026-02-20T09:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.745774 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2abd2b70-bb78-49a0-b930-cd066384e803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3926877d9025cb33d86d3651716fce6ab1e9aa6467a094f0f6b463db0564bf0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3926877d9025cb33d86d3651716fce6ab1e9aa6467a094f0f6b463db0564bf0a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T09:56:01Z\\\",\\\"message\\\":\\\") from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:56:01.196026 6624 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:56:01.196302 6624 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:56:01.196674 6624 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:56:01.196758 6624 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:56:01.197158 6624 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0220 09:56:01.197199 6624 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0220 09:56:01.197204 6624 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0220 09:56:01.197228 6624 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0220 09:56:01.197225 6624 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0220 09:56:01.197265 6624 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0220 09:56:01.197299 6624 factory.go:656] Stopping watch factory\\\\nI0220 09:56:01.197317 6624 ovnkube.go:599] Stopped ovnkube\\\\nI0220 09:56:01.197319 6624 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0220 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:56:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-99b2s_openshift-ovn-kubernetes(2abd2b70-bb78-49a0-b930-cd066384e803)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-99b2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:17Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.766633 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63edef29-013b-4e23-bd29-1e62958c425a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652942c0482deea28d2e07207dcdf38e3f5638a0eea5bb58b73671e39410b99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89d70ad241c203849bc1b2aa
38409e9d5a16e13831e18a4f06bb2e5012b84910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a24ab72e4a25f2b29927eeb2552b6b091c3384a85ecedea7a4de9df19253ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9768dcc00aee095460ac27438e2ee85e824a9b21eee8c2fa137cd7de5de792ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55affbe6d7d089b636f77381908573fe22fb7a903776e26db373f50523280cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:17Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.780629 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"deab583c-05c7-4b7e-a3f6-c01081b17127\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3256d5e9aadc1c300c0388ad5b9682b874dd957f731bde6642c649b044b16ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"amed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771581313\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771581313\\\\\\\\\\\\\\\" (2026-02-20 08:55:12 +0000 UTC to 2027-02-20 08:55:12 +0000 UTC (now=2026-02-20 09:55:28.067421835 +0000 UTC))\\\\\\\"\\\\nI0220 09:55:28.067484 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0220 09:55:28.067516 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0220 09:55:28.067549 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067613 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067655 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-161790428/tls.crt::/tmp/serving-cert-161790428/tls.key\\\\\\\"\\\\nI0220 09:55:28.067781 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0220 09:55:28.067827 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067825 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067865 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0220 09:55:28.067840 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0220 09:55:28.067995 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0220 09:55:28.068006 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0220 09:55:28.068582 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:17Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.792579 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0820106bbb2993145f688614128fa51c5778e9953ee5fc00edba04ff06f3f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:17Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.806647 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:17Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.820864 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7065599d0f9481d5becd2057dc2e557e5fb6f4ef22533c1e431708ee711c5aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:17Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.831376 4962 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-htkbf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8526746c-450b-4df8-8ea1-f0cbabd13894\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c50de88e0429b7ecb3939db90dc49ca006cd7d071d9cc97beb31ca64028b9f00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx5m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d5caed5e5e9e66911552cd6f1b7482cc842f5fc1b59863a208fe32ea87303d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx5m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-htkbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-20T09:56:17Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.844426 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.844706 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.844804 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.844898 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.844994 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:17Z","lastTransitionTime":"2026-02-20T09:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.847193 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:17Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.859098 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s8xxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a431054f-57c5-41b7-93b2-2d2fbf9949ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15f0d587b054c3050286b2607687a4066cdd413e1623326cf355f9d93b10a2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9fz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s8xxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-20T09:56:17Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.870765 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hxb97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0e53ce-e004-473e-be85-ef4c83e399c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e690a224056186e38fa1d7f83d5380f36a05bfff839ca53a87701a721c0c7c0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c27br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hxb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:17Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.948232 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.948575 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.948692 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.948791 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:17 crc kubenswrapper[4962]: I0220 09:56:17.948885 4962 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:17Z","lastTransitionTime":"2026-02-20T09:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.051911 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.051947 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.051959 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.051979 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.051990 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:18Z","lastTransitionTime":"2026-02-20T09:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.101838 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 19:33:47.892218583 +0000 UTC Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.138227 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:56:18 crc kubenswrapper[4962]: E0220 09:56:18.138666 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.138314 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.138344 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.138315 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:56:18 crc kubenswrapper[4962]: E0220 09:56:18.138988 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:56:18 crc kubenswrapper[4962]: E0220 09:56:18.139185 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:56:18 crc kubenswrapper[4962]: E0220 09:56:18.139305 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.154361 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.154429 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.154449 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.154476 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.154497 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:18Z","lastTransitionTime":"2026-02-20T09:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.257157 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.257209 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.257223 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.257246 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.257260 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:18Z","lastTransitionTime":"2026-02-20T09:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.360734 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.360820 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.360847 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.360885 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.360911 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:18Z","lastTransitionTime":"2026-02-20T09:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.463924 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.463999 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.464024 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.464059 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.464081 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:18Z","lastTransitionTime":"2026-02-20T09:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.567131 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.567191 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.567212 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.567238 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.567256 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:18Z","lastTransitionTime":"2026-02-20T09:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.630028 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wqwgj_1957ac70-30f9-48c2-a82b-72aa3b7a883a/kube-multus/0.log" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.630149 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wqwgj" event={"ID":"1957ac70-30f9-48c2-a82b-72aa3b7a883a","Type":"ContainerStarted","Data":"330fcac483de40973468483bb1e7d1a3978f3e5fb4144bc0efaa58cf02e30e67"} Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.650983 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"751d5e0b-919c-4777-8475-ed7214f7647f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6a54fc4a37387541acda22f8ca81ddfe54e7a9ea682a31abf5e1ba2787f2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dffe20e45069ebd85e9ad49e365ac5180c1e43d02340190615a38900f527e432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Dis
abled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m9d46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:18Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.665662 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de460dbc-9080-4eb3-b509-c43e81162de4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5051fa1280c3183976930435fb2b839acf93914df2c1e93cb28ae17755d534ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f4690722fa343ac9bbe83f1591ce49040d2b6cbe966fc31757944ba3befbbce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc5a98df99b3ac300355de8efa6cafb2182a2460152990a89b4125a7aef7b850\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-clust
er-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9264e0c637c9449882f25caf523f00112c372fad6125a2b4b1b541a515a897dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:18Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.670429 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.670458 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.670466 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.670485 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.670495 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:18Z","lastTransitionTime":"2026-02-20T09:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.680158 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:18Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.698977 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef72d73c-d177-4436-b681-83866e1f6d12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfad5fd92783e0af12f28bd81ccc67f1cf757d57723d98f8fea4f02dc0fea8b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7hj8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:18Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.712051 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03611a46-7966-4587-950e-1d1f967c48c4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7a7a4444fd319ad977e6ac955aeb09b7fa9bc300586a60ba42b1ddb7c823b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b7038b1d182118313e4f3a5d272559ce949ae0b69f819883a0ce752314855b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633d729c617b18ae2841e008b42a6a039b118aef8d357c2bbfaba8a445a417c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee212904f6af57b0dc28dc3da6bf037fb0dfb92937bd5789b8dcb03ea820f62f\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee212904f6af57b0dc28dc3da6bf037fb0dfb92937bd5789b8dcb03ea820f62f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:18Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.722404 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5bwk2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d590527b-ed56-4fb4-a712-b09781618a76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjn55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjn55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:43Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5bwk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:18Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.734369 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7065599d0f9481d5becd2057dc2e557e5fb6f4ef22533c1e431708ee711c5aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:18Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.752554 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://650ed0852746d520fbaa4ec277717c5af3916ba72fb82349af01bdd53ed1916b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5aaa8582764286e5a930fbc4754f253b044e997b856840685dde84db1aef3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:18Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.765643 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wqwgj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1957ac70-30f9-48c2-a82b-72aa3b7a883a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://330fcac483de40973468483bb1e7d1a3978f3e5fb4144bc0efaa58cf02e30e67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4f1f32eda1cc801a5b1f6d84207120c48c1bcca494a88bdfa95fb05bf82f661\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T09:56:17Z\\\",\\\"message\\\":\\\"2026-02-20T09:55:32+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b3f85854-3e0b-4926-bbae-cfacc0ecac44\\\\n2026-02-20T09:55:32+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b3f85854-3e0b-4926-bbae-cfacc0ecac44 to /host/opt/cni/bin/\\\\n2026-02-20T09:55:32Z [verbose] multus-daemon started\\\\n2026-02-20T09:55:32Z [verbose] Readiness Indicator file check\\\\n2026-02-20T09:56:17Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwxzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wqwgj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:18Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.773130 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.773169 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.773183 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.773206 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.773219 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:18Z","lastTransitionTime":"2026-02-20T09:56:18Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.786678 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2abd2b70-bb78-49a0-b930-cd066384e803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/s
ecrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117\\
\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3926877d9025cb33d86d3651716fce6ab1e9aa6467a094f0f6b463db0564bf0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3926877d9025cb33d86d3651716fce6ab1e9aa6467a094f0f6b463db0564bf0a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T09:56:01Z\\\",\\\"message\\\":\\\") from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:56:01.196026 6624 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:56:01.196302 6624 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:56:01.196674 6624 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:56:01.196758 6624 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:56:01.197158 6624 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0220 09:56:01.197199 6624 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0220 09:56:01.197204 6624 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0220 09:56:01.197228 6624 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0220 09:56:01.197225 6624 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0220 09:56:01.197265 6624 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0220 09:56:01.197299 6624 factory.go:656] Stopping watch factory\\\\nI0220 09:56:01.197317 6624 ovnkube.go:599] Stopped ovnkube\\\\nI0220 09:56:01.197319 6624 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0220 
0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:56:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-99b2s_openshift-ovn-kubernetes(2abd2b70-bb78-49a0-b930-cd066384e803)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveRe
adOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-99b2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:18Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.814164 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"63edef29-013b-4e23-bd29-1e62958c425a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652942c0482deea28d2e07207dcdf38e3f5638a0eea5bb58b73671e39410b99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89d70ad241c203849bc1b2aa38409e9d5a16e13831e18a4f06bb2e5012b84910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a24ab72e4a25f2b29927eeb2552b6b091c3384a85ecedea7a4de9df19253ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9768dcc00aee095460ac27438e2ee85e824a9b2
1eee8c2fa137cd7de5de792ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55affbe6d7d089b636f77381908573fe22fb7a903776e26db373f50523280cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:18Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.830184 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deab583c-05c7-4b7e-a3f6-c01081b17127\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c5afbd1e10c9db692f39fa51d7cc54
922933431495d1bb3224b830ee773a01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3256d5e9aadc1c300c0388ad5b9682b874dd957f731bde6642c649b044b16ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"amed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771581313\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771581313\\\\\\\\\\\\\\\" (2026-02-20 08:55:12 +0000 UTC to 2027-02-20 08:55:12 +0000 UTC (now=2026-02-20 09:55:28.067421835 +0000 UTC))\\\\\\\"\\\\nI0220 09:55:28.067484 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0220 09:55:28.067516 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0220 09:55:28.067549 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067613 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067655 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-161790428/tls.crt::/tmp/serving-cert-161790428/tls.key\\\\\\\"\\\\nI0220 09:55:28.067781 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0220 09:55:28.067827 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067825 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067865 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0220 09:55:28.067840 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0220 09:55:28.067995 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0220 09:55:28.068006 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0220 09:55:28.068582 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": 
tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:18Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.843993 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0820106bbb2993145f688614128fa51c5778e9953ee5fc00edba04ff06f3f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:18Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.855565 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:18Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.867544 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-htkbf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8526746c-450b-4df8-8ea1-f0cbabd13894\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c50de88e0429b7ecb3939db90dc49ca006cd7d071d9cc97beb31ca64028b9f00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx5m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d5caed5e5e9e66911552cd6f1b7482cc842f5fc1b59863a208fe32ea87303d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx5m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-htkbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:18Z is after 2025-08-24T17:21:41Z" Feb 20 
09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.876129 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.876174 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.876188 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.876211 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.876224 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:18Z","lastTransitionTime":"2026-02-20T09:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.885437 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:18Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.904107 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s8xxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a431054f-57c5-41b7-93b2-2d2fbf9949ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15f0d587b054c3050286b2607687a4066cdd413e1623326cf355f9d93b10a2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9fz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s8xxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-20T09:56:18Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.916322 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hxb97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0e53ce-e004-473e-be85-ef4c83e399c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e690a224056186e38fa1d7f83d5380f36a05bfff839ca53a87701a721c0c7c0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c27br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hxb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:18Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.979955 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.980002 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.980015 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.980033 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:18 crc kubenswrapper[4962]: I0220 09:56:18.980043 4962 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:18Z","lastTransitionTime":"2026-02-20T09:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.082419 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.082453 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.082461 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.082475 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.082487 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:19Z","lastTransitionTime":"2026-02-20T09:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.102179 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 00:11:27.504331346 +0000 UTC Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.155835 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:19Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.170497 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef72d73c-d177-4436-b681-83866e1f6d12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfad5fd92783e0af12f28bd81ccc67f1cf757d57723d98f8fea4f02dc0fea8b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"
system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7hj8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:19Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.184751 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.184793 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.184802 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.184820 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.184829 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:19Z","lastTransitionTime":"2026-02-20T09:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.188973 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"751d5e0b-919c-4777-8475-ed7214f7647f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6a54fc4a37387541acda22f8ca81ddfe54e7a9ea682a31abf5e1ba2787f2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dffe20e45069ebd85e9ad49e365ac5180c1e43d02340190615a38900f527e432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m9d46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:19Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.204362 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de460dbc-9080-4eb3-b509-c43e81162de4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5051fa1280c3183976930435fb2b839acf93914df2c1e93cb28ae17755d534ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f4690722fa343ac9bbe83f1591ce49040d2b6cbe966fc31757944ba3befbbce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc5a98df99b3ac300355de8efa6cafb2182a2460152990a89b4125a7aef7b850\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026
-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9264e0c637c9449882f25caf523f00112c372fad6125a2b4b1b541a515a897dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:19Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.221357 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03611a46-7966-4587-950e-1d1f967c48c4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7a7a4444fd319ad977e6ac955aeb09b7fa9bc300586a60ba42b1ddb7c823b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b7038b1d182118313e4f3a5d272559ce949ae0b69f819883a0ce752314855b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633d729c617b18ae2841e008b42a6a039b118aef8d357c2bbfaba8a445a417c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee212904f6af57b0dc28dc3da6bf037fb0dfb92937bd5789b8dcb03ea820f62f\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee212904f6af57b0dc28dc3da6bf037fb0dfb92937bd5789b8dcb03ea820f62f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:19Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.239861 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5bwk2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d590527b-ed56-4fb4-a712-b09781618a76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjn55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjn55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:43Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5bwk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:19Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.255820 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0820106bbb2993145f688614128fa51c5778e9953ee5fc00edba04ff06f3f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:19Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.269621 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:19Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.281536 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7065599d0f9481d5becd2057dc2e557e5fb6f4ef22533c1e431708ee711c5aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:19Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.287284 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.287328 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.287386 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.287408 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.287422 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:19Z","lastTransitionTime":"2026-02-20T09:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.296623 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://650ed0852746d520fbaa4ec277717c5af3916ba72fb82349af01bdd53ed1916b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5aaa8582764286e5a930fbc4754f253b044e997b856840685dde84db1aef3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"nam
e\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:19Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.311769 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wqwgj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1957ac70-30f9-48c2-a82b-72aa3b7a883a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://330fcac483de40973468483bb1e7d1a3978f3e5fb4144bc0efaa58cf02e30e67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4f1f32eda1cc801a5b1f6d84207120c48c1bcca494a88bdfa95fb05bf82f661\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T09:56:17Z\\\",\\\"message\\\":\\\"2026-02-20T09:55:32+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b3f85854-3e0b-4926-bbae-cfacc0ecac44\\\\n2026-02-20T09:55:32+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b3f85854-3e0b-4926-bbae-cfacc0ecac44 to /host/opt/cni/bin/\\\\n2026-02-20T09:55:32Z [verbose] multus-daemon started\\\\n2026-02-20T09:55:32Z [verbose] Readiness Indicator file check\\\\n2026-02-20T09:56:17Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwxzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wqwgj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:19Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.338762 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2abd2b70-bb78-49a0-b930-cd066384e803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-s
ocket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3926877d9025cb33d86d3651716fce6ab1e9aa6467a094f0f6b463db0564bf0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3926877d9025cb33d86d3651716fce6ab1e9aa6467a094f0f6b463db0564bf0a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T09:56:01Z\\\",\\\"message\\\":\\\") from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:56:01.196026 6624 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:56:01.196302 6624 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:56:01.196674 6624 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:56:01.196758 6624 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:56:01.197158 6624 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0220 09:56:01.197199 6624 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0220 09:56:01.197204 6624 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0220 09:56:01.197228 6624 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0220 09:56:01.197225 6624 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0220 09:56:01.197265 6624 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0220 09:56:01.197299 6624 factory.go:656] Stopping watch factory\\\\nI0220 09:56:01.197317 6624 ovnkube.go:599] Stopped ovnkube\\\\nI0220 09:56:01.197319 6624 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0220 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:56:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-99b2s_openshift-ovn-kubernetes(2abd2b70-bb78-49a0-b930-cd066384e803)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-99b2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:19Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.370513 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63edef29-013b-4e23-bd29-1e62958c425a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652942c0482deea28d2e07207dcdf38e3f5638a0eea5bb58b73671e39410b99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89d70ad241c203849bc1b2aa
38409e9d5a16e13831e18a4f06bb2e5012b84910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a24ab72e4a25f2b29927eeb2552b6b091c3384a85ecedea7a4de9df19253ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9768dcc00aee095460ac27438e2ee85e824a9b21eee8c2fa137cd7de5de792ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55affbe6d7d089b636f77381908573fe22fb7a903776e26db373f50523280cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:19Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.387374 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"deab583c-05c7-4b7e-a3f6-c01081b17127\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3256d5e9aadc1c300c0388ad5b9682b874dd957f731bde6642c649b044b16ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"amed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771581313\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771581313\\\\\\\\\\\\\\\" (2026-02-20 08:55:12 +0000 UTC to 2027-02-20 08:55:12 +0000 UTC (now=2026-02-20 09:55:28.067421835 +0000 UTC))\\\\\\\"\\\\nI0220 09:55:28.067484 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0220 09:55:28.067516 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0220 09:55:28.067549 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067613 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067655 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-161790428/tls.crt::/tmp/serving-cert-161790428/tls.key\\\\\\\"\\\\nI0220 09:55:28.067781 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0220 09:55:28.067827 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067825 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067865 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0220 09:55:28.067840 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0220 09:55:28.067995 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0220 09:55:28.068006 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0220 09:55:28.068582 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:19Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.390647 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.390872 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.390966 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.391686 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.391779 4962 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:19Z","lastTransitionTime":"2026-02-20T09:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.403444 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-htkbf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8526746c-450b-4df8-8ea1-f0cbabd13894\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c50de88e0429b7ecb3939db90dc49ca006cd7d071d9cc97beb31ca64028b9f00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx5m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d5caed5e5e9e66911552cd6f1b7482cc842f5fc1b59863a208fe32ea87303d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx5m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"i
p\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-htkbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:19Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.415694 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hxb97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0e53ce-e004-473e-be85-ef4c83e399c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e690a224056186e38fa1d7f83d5380f36a05bfff839ca53a87701a721c0c7c0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c27br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hxb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:19Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.430522 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:19Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.442326 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s8xxr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a431054f-57c5-41b7-93b2-2d2fbf9949ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15f0d587b054c3050286b2607687a4066cdd413e1623326cf355f9d93b10a2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9fz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s8xxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:19Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.494269 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.494539 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.494656 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.494741 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.494803 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:19Z","lastTransitionTime":"2026-02-20T09:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.598138 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.598193 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.598204 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.598223 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.598237 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:19Z","lastTransitionTime":"2026-02-20T09:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.702101 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.702492 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.702713 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.702892 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.703032 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:19Z","lastTransitionTime":"2026-02-20T09:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.805787 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.805838 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.805848 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.805868 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.805880 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:19Z","lastTransitionTime":"2026-02-20T09:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.909159 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.909194 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.909206 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.909226 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:19 crc kubenswrapper[4962]: I0220 09:56:19.909237 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:19Z","lastTransitionTime":"2026-02-20T09:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.011696 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.011743 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.011752 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.011771 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.011785 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:20Z","lastTransitionTime":"2026-02-20T09:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.102300 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 23:22:32.85477297 +0000 UTC Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.114419 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.114501 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.114530 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.114570 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.114636 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:20Z","lastTransitionTime":"2026-02-20T09:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.143212 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.143297 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.143514 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:56:20 crc kubenswrapper[4962]: E0220 09:56:20.143503 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.143678 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:56:20 crc kubenswrapper[4962]: E0220 09:56:20.143756 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76" Feb 20 09:56:20 crc kubenswrapper[4962]: E0220 09:56:20.144052 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:56:20 crc kubenswrapper[4962]: E0220 09:56:20.144315 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.217551 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.217631 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.217646 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.217670 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.217684 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:20Z","lastTransitionTime":"2026-02-20T09:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.320822 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.320874 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.320885 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.320909 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.320924 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:20Z","lastTransitionTime":"2026-02-20T09:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.423472 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.423550 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.423571 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.423631 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.423653 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:20Z","lastTransitionTime":"2026-02-20T09:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.525775 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.525833 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.525846 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.525868 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.525881 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:20Z","lastTransitionTime":"2026-02-20T09:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.627972 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.628020 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.628033 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.628053 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.628067 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:20Z","lastTransitionTime":"2026-02-20T09:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.731506 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.731550 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.731610 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.731634 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.731645 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:20Z","lastTransitionTime":"2026-02-20T09:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.834341 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.834371 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.834382 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.834401 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.834412 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:20Z","lastTransitionTime":"2026-02-20T09:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.937082 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.937153 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.937163 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.937178 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:20 crc kubenswrapper[4962]: I0220 09:56:20.937187 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:20Z","lastTransitionTime":"2026-02-20T09:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.039116 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.039147 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.039156 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.039173 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.039183 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:21Z","lastTransitionTime":"2026-02-20T09:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.102834 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-16 09:48:00.542275573 +0000 UTC Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.142129 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.142164 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.142175 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.142192 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.142202 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:21Z","lastTransitionTime":"2026-02-20T09:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.244832 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.244915 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.244933 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.244965 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.244987 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:21Z","lastTransitionTime":"2026-02-20T09:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.347826 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.347886 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.347896 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.347917 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.347932 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:21Z","lastTransitionTime":"2026-02-20T09:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.450897 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.450972 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.450991 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.451021 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.451040 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:21Z","lastTransitionTime":"2026-02-20T09:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.554024 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.554100 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.554123 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.554153 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.554174 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:21Z","lastTransitionTime":"2026-02-20T09:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.656709 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.656765 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.656778 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.656799 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.656811 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:21Z","lastTransitionTime":"2026-02-20T09:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.759894 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.759955 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.759973 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.759994 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.760007 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:21Z","lastTransitionTime":"2026-02-20T09:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.863415 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.863487 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.863505 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.863536 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.863557 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:21Z","lastTransitionTime":"2026-02-20T09:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.966973 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.967055 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.967072 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.967099 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:21 crc kubenswrapper[4962]: I0220 09:56:21.967118 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:21Z","lastTransitionTime":"2026-02-20T09:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.070720 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.070765 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.070779 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.070802 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.070816 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:22Z","lastTransitionTime":"2026-02-20T09:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.103859 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 21:52:05.407747362 +0000 UTC Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.138493 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:56:22 crc kubenswrapper[4962]: E0220 09:56:22.138704 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.138888 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.138926 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:56:22 crc kubenswrapper[4962]: E0220 09:56:22.138967 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.138971 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:56:22 crc kubenswrapper[4962]: E0220 09:56:22.139013 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76" Feb 20 09:56:22 crc kubenswrapper[4962]: E0220 09:56:22.139087 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.172614 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.172685 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.172730 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.172749 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.172860 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:22Z","lastTransitionTime":"2026-02-20T09:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.275366 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.275398 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.275413 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.275433 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.275445 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:22Z","lastTransitionTime":"2026-02-20T09:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.378523 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.378654 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.378679 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.378713 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.378733 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:22Z","lastTransitionTime":"2026-02-20T09:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.481724 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.481774 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.481787 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.481809 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.481823 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:22Z","lastTransitionTime":"2026-02-20T09:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.584092 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.584181 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.584202 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.584233 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.584260 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:22Z","lastTransitionTime":"2026-02-20T09:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.687078 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.687124 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.687138 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.687157 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.687167 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:22Z","lastTransitionTime":"2026-02-20T09:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.790146 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.790218 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.790239 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.790270 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.790288 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:22Z","lastTransitionTime":"2026-02-20T09:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.894067 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.894142 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.894161 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.894188 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:22 crc kubenswrapper[4962]: I0220 09:56:22.894207 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:22Z","lastTransitionTime":"2026-02-20T09:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.000159 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.000248 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.000269 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.000301 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.000326 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:23Z","lastTransitionTime":"2026-02-20T09:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.103987 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 12:21:56.156205683 +0000 UTC Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.104050 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.104685 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.104749 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.104829 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.104899 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:23Z","lastTransitionTime":"2026-02-20T09:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.153046 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.207686 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.207756 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.207774 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.207801 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.207821 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:23Z","lastTransitionTime":"2026-02-20T09:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.311250 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.311351 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.311378 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.311421 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.311447 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:23Z","lastTransitionTime":"2026-02-20T09:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.414033 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.414095 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.414112 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.414185 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.414204 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:23Z","lastTransitionTime":"2026-02-20T09:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.516708 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.516764 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.516776 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.516796 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.516809 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:23Z","lastTransitionTime":"2026-02-20T09:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.618841 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.618872 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.618880 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.618895 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.618906 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:23Z","lastTransitionTime":"2026-02-20T09:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.721696 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.721740 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.721753 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.721772 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.721784 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:23Z","lastTransitionTime":"2026-02-20T09:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.825372 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.825421 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.825432 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.825451 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.825463 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:23Z","lastTransitionTime":"2026-02-20T09:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.928705 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.928744 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.928753 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.928769 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:23 crc kubenswrapper[4962]: I0220 09:56:23.928781 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:23Z","lastTransitionTime":"2026-02-20T09:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.032341 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.032404 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.032421 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.032444 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.032458 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:24Z","lastTransitionTime":"2026-02-20T09:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.105677 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 11:51:08.115396288 +0000 UTC Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.134968 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.135069 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.135095 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.135133 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.135184 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:24Z","lastTransitionTime":"2026-02-20T09:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.138362 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.138438 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:56:24 crc kubenswrapper[4962]: E0220 09:56:24.138538 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:56:24 crc kubenswrapper[4962]: E0220 09:56:24.138682 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.139074 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:56:24 crc kubenswrapper[4962]: E0220 09:56:24.139273 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76" Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.139337 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:56:24 crc kubenswrapper[4962]: E0220 09:56:24.139426 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.238345 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.238398 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.238409 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.238428 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.238441 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:24Z","lastTransitionTime":"2026-02-20T09:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.342575 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.342673 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.342691 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.342718 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.342739 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:24Z","lastTransitionTime":"2026-02-20T09:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.446242 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.446314 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.446335 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.446364 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.446384 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:24Z","lastTransitionTime":"2026-02-20T09:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.549665 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.549727 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.549749 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.549782 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.549807 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:24Z","lastTransitionTime":"2026-02-20T09:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.652190 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.652256 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.652280 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.652305 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.652324 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:24Z","lastTransitionTime":"2026-02-20T09:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.755291 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.755360 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.755383 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.755414 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.755435 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:24Z","lastTransitionTime":"2026-02-20T09:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.858515 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.858577 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.858630 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.858662 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.858680 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:24Z","lastTransitionTime":"2026-02-20T09:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.966697 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.966781 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.966797 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.966822 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:24 crc kubenswrapper[4962]: I0220 09:56:24.966838 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:24Z","lastTransitionTime":"2026-02-20T09:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:25 crc kubenswrapper[4962]: I0220 09:56:25.069405 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:25 crc kubenswrapper[4962]: I0220 09:56:25.069471 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:25 crc kubenswrapper[4962]: I0220 09:56:25.069490 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:25 crc kubenswrapper[4962]: I0220 09:56:25.069518 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:25 crc kubenswrapper[4962]: I0220 09:56:25.069537 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:25Z","lastTransitionTime":"2026-02-20T09:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:25 crc kubenswrapper[4962]: I0220 09:56:25.105916 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 00:21:49.696128383 +0000 UTC Feb 20 09:56:25 crc kubenswrapper[4962]: I0220 09:56:25.172542 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:25 crc kubenswrapper[4962]: I0220 09:56:25.172642 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:25 crc kubenswrapper[4962]: I0220 09:56:25.172664 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:25 crc kubenswrapper[4962]: I0220 09:56:25.172690 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:25 crc kubenswrapper[4962]: I0220 09:56:25.172710 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:25Z","lastTransitionTime":"2026-02-20T09:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:25 crc kubenswrapper[4962]: I0220 09:56:25.276511 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:25 crc kubenswrapper[4962]: I0220 09:56:25.276583 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:25 crc kubenswrapper[4962]: I0220 09:56:25.276650 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:25 crc kubenswrapper[4962]: I0220 09:56:25.276685 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:25 crc kubenswrapper[4962]: I0220 09:56:25.276708 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:25Z","lastTransitionTime":"2026-02-20T09:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:25 crc kubenswrapper[4962]: I0220 09:56:25.379682 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:25 crc kubenswrapper[4962]: I0220 09:56:25.379746 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:25 crc kubenswrapper[4962]: I0220 09:56:25.379781 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:25 crc kubenswrapper[4962]: I0220 09:56:25.379811 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:25 crc kubenswrapper[4962]: I0220 09:56:25.379833 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:25Z","lastTransitionTime":"2026-02-20T09:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:25 crc kubenswrapper[4962]: I0220 09:56:25.483428 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:25 crc kubenswrapper[4962]: I0220 09:56:25.483485 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:25 crc kubenswrapper[4962]: I0220 09:56:25.483497 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:25 crc kubenswrapper[4962]: I0220 09:56:25.483522 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:25 crc kubenswrapper[4962]: I0220 09:56:25.483537 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:25Z","lastTransitionTime":"2026-02-20T09:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:25 crc kubenswrapper[4962]: I0220 09:56:25.586850 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:25 crc kubenswrapper[4962]: I0220 09:56:25.586915 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:25 crc kubenswrapper[4962]: I0220 09:56:25.586931 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:25 crc kubenswrapper[4962]: I0220 09:56:25.586955 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:25 crc kubenswrapper[4962]: I0220 09:56:25.586972 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:25Z","lastTransitionTime":"2026-02-20T09:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:25 crc kubenswrapper[4962]: I0220 09:56:25.690236 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:25 crc kubenswrapper[4962]: I0220 09:56:25.690307 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:25 crc kubenswrapper[4962]: I0220 09:56:25.690325 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:25 crc kubenswrapper[4962]: I0220 09:56:25.690351 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:25 crc kubenswrapper[4962]: I0220 09:56:25.690369 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:25Z","lastTransitionTime":"2026-02-20T09:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:25 crc kubenswrapper[4962]: I0220 09:56:25.793455 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:25 crc kubenswrapper[4962]: I0220 09:56:25.793535 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:25 crc kubenswrapper[4962]: I0220 09:56:25.793554 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:25 crc kubenswrapper[4962]: I0220 09:56:25.793587 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:25 crc kubenswrapper[4962]: I0220 09:56:25.793695 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:25Z","lastTransitionTime":"2026-02-20T09:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:25 crc kubenswrapper[4962]: I0220 09:56:25.896955 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:25 crc kubenswrapper[4962]: I0220 09:56:25.897014 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:25 crc kubenswrapper[4962]: I0220 09:56:25.897033 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:25 crc kubenswrapper[4962]: I0220 09:56:25.897059 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:25 crc kubenswrapper[4962]: I0220 09:56:25.897081 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:25Z","lastTransitionTime":"2026-02-20T09:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.000372 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.000426 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.000444 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.000471 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.000490 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:26Z","lastTransitionTime":"2026-02-20T09:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.071316 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.071385 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.071405 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.071436 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.071456 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:26Z","lastTransitionTime":"2026-02-20T09:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:26 crc kubenswrapper[4962]: E0220 09:56:26.100702 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3742e1d0-bedb-4f62-b44c-22d7df4a090f\\\",\\\"systemUUID\\\":\\\"0de0f937-0896-4e78-90b7-d2d7a1bed2a9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:26Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.106190 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-08 18:10:03.095582296 +0000 UTC Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.106473 4962 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.106509 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.106522 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.106541 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.106556 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:26Z","lastTransitionTime":"2026-02-20T09:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:26 crc kubenswrapper[4962]: E0220 09:56:26.123101 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3742e1d0-bedb-4f62-b44c-22d7df4a090f\\\",\\\"systemUUID\\\":\\\"0de0f937-0896-4e78-90b7-d2d7a1bed2a9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:26Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.127464 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.127532 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.127560 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.127628 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.127660 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:26Z","lastTransitionTime":"2026-02-20T09:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.138511 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.138582 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:56:26 crc kubenswrapper[4962]: E0220 09:56:26.138715 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.138747 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.138909 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:56:26 crc kubenswrapper[4962]: E0220 09:56:26.138931 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:56:26 crc kubenswrapper[4962]: E0220 09:56:26.139621 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76" Feb 20 09:56:26 crc kubenswrapper[4962]: E0220 09:56:26.139797 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.140516 4962 scope.go:117] "RemoveContainer" containerID="3926877d9025cb33d86d3651716fce6ab1e9aa6467a094f0f6b463db0564bf0a" Feb 20 09:56:26 crc kubenswrapper[4962]: E0220 09:56:26.149644 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3742e1d0-bedb-4f62-b44c-22d7df4a090f\\\",\\\"systemUUID\\\":\\\"0de0f937-0896-4e78-90b7-d2d7a1bed2a9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:26Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.155006 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.155059 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.155080 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.155109 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.155129 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:26Z","lastTransitionTime":"2026-02-20T09:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:26 crc kubenswrapper[4962]: E0220 09:56:26.175773 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3742e1d0-bedb-4f62-b44c-22d7df4a090f\\\",\\\"systemUUID\\\":\\\"0de0f937-0896-4e78-90b7-d2d7a1bed2a9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:26Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.183204 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.183286 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.183312 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.183351 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.183373 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:26Z","lastTransitionTime":"2026-02-20T09:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:26 crc kubenswrapper[4962]: E0220 09:56:26.207478 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3742e1d0-bedb-4f62-b44c-22d7df4a090f\\\",\\\"systemUUID\\\":\\\"0de0f937-0896-4e78-90b7-d2d7a1bed2a9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:26Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:26 crc kubenswrapper[4962]: E0220 09:56:26.207652 4962 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.210283 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.210320 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.210334 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.210355 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.210370 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:26Z","lastTransitionTime":"2026-02-20T09:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.314558 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.314669 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.314695 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.314725 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.314751 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:26Z","lastTransitionTime":"2026-02-20T09:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.417757 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.417794 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.417806 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.417826 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.417838 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:26Z","lastTransitionTime":"2026-02-20T09:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.521560 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.521636 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.521650 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.521672 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.521684 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:26Z","lastTransitionTime":"2026-02-20T09:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.624972 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.625049 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.625072 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.625096 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.625114 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:26Z","lastTransitionTime":"2026-02-20T09:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.661020 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-99b2s_2abd2b70-bb78-49a0-b930-cd066384e803/ovnkube-controller/2.log" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.664894 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" event={"ID":"2abd2b70-bb78-49a0-b930-cd066384e803","Type":"ContainerStarted","Data":"0ad31dcb6721b49faaa0b2df2afa2f3d990f410ad4c80b3b5ebb18c428603add"} Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.665979 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.692522 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de460dbc-9080-4eb3-b509-c43e81162de4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5051fa1280c3183976930435fb2b839acf93914df2c1e93cb28ae17755d534ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f4690722fa343ac9bbe83f1591ce49040d2b6cbe966fc31757944ba3befbbce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc5a98df99b3ac300355
de8efa6cafb2182a2460152990a89b4125a7aef7b850\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9264e0c637c9449882f25caf523f00112c372fad6125a2b4b1b541a515a897dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:26Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.710808 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"096c4ebd-ac7b-45f6-abfa-5d54e4bce009\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faa1dbe5648fbe736a165150e168243abc4486420bf78e560c86ec9cc6a608c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4192cbbb73f2b6d9657f2c58899df4c139b42d17fc4042cef80a7b9b05c5ef26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4192cbbb73f2b6d9657f2c58899df4c139b42d17fc4042cef80a7b9b05c5ef26\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:26Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.728517 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.728556 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.728567 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.728586 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.728618 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:26Z","lastTransitionTime":"2026-02-20T09:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.729758 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:26Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.747475 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef72d73c-d177-4436-b681-83866e1f6d12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfad5fd92783e0af12f28bd81ccc67f1cf757d57723d98f8fea4f02dc0fea8b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"
system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7hj8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:26Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.764815 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"751d5e0b-919c-4777-8475-ed7214f7647f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6a54fc4a37387541acda22f8ca81ddfe54e7a9ea682a31abf5e1ba2787f2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dffe20e45069ebd85e9ad49e365ac5180c1e43d02340190615a38900f527e432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootf
s\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m9d46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:26Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.780545 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03611a46-7966-4587-950e-1d1f967c48c4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7a7a4444fd319ad977e6ac955aeb09b7fa9bc300586a60ba42b1ddb7c823b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b7038b1d182118313e4f3a5d272559ce949ae0b69f819883a0ce752314855b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\
\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633d729c617b18ae2841e008b42a6a039b118aef8d357c2bbfaba8a445a417c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee212904f6af57b0dc28dc3da6bf037fb0dfb92937bd5789b8dcb03ea820f62f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee212904f6af57b0dc28dc3da6bf037fb0dfb92937bd5789b8dcb03ea820f62f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:26Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.794800 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5bwk2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d590527b-ed56-4fb4-a712-b09781618a76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjn55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjn55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:43Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5bwk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:26Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.814907 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2abd2b70-bb78-49a0-b930-cd066384e803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ad31dcb6721b49faaa0b2df2afa2f3d990f410a
d4c80b3b5ebb18c428603add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3926877d9025cb33d86d3651716fce6ab1e9aa6467a094f0f6b463db0564bf0a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T09:56:01Z\\\",\\\"message\\\":\\\") from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:56:01.196026 6624 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:56:01.196302 6624 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:56:01.196674 6624 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:56:01.196758 6624 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:56:01.197158 6624 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0220 09:56:01.197199 6624 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0220 09:56:01.197204 6624 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0220 09:56:01.197228 6624 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0220 09:56:01.197225 6624 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0220 09:56:01.197265 6624 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0220 09:56:01.197299 6624 factory.go:656] Stopping watch factory\\\\nI0220 09:56:01.197317 6624 ovnkube.go:599] Stopped ovnkube\\\\nI0220 09:56:01.197319 6624 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0220 
0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:56:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"cont
ainerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-99b2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:26Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.830935 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.830979 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.830989 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.831006 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.831016 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:26Z","lastTransitionTime":"2026-02-20T09:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.838891 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63edef29-013b-4e23-bd29-1e62958c425a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652942c0482deea28d2e07207dcdf38e3f5638a0eea5bb58b73671e39410b99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89d70ad241c203849bc1b2aa38409e9d5a16e13831e18a4f06bb2e5012b84910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a24ab72e4a25f2b29927eeb2552b6b091c3384a85ecedea7a4de9df19253ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9768dcc00aee095460ac27438e2ee85e824a9b21eee8c2fa137cd7de5de792ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55affbe6d7d089b636f77381908573fe22fb7a903776e26db373f50523280cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:26Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.853333 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"deab583c-05c7-4b7e-a3f6-c01081b17127\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3256d5e9aadc1c300c0388ad5b9682b874dd957f731bde6642c649b044b16ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"amed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771581313\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771581313\\\\\\\\\\\\\\\" (2026-02-20 08:55:12 +0000 UTC to 2027-02-20 08:55:12 +0000 UTC (now=2026-02-20 09:55:28.067421835 +0000 UTC))\\\\\\\"\\\\nI0220 09:55:28.067484 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0220 09:55:28.067516 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0220 09:55:28.067549 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067613 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067655 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-161790428/tls.crt::/tmp/serving-cert-161790428/tls.key\\\\\\\"\\\\nI0220 09:55:28.067781 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0220 09:55:28.067827 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067825 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067865 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0220 09:55:28.067840 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0220 09:55:28.067995 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0220 09:55:28.068006 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0220 09:55:28.068582 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:26Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.868007 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0820106bbb2993145f688614128fa51c5778e9953ee5fc00edba04ff06f3f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:26Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.886490 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:26Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.899358 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7065599d0f9481d5becd2057dc2e557e5fb6f4ef22533c1e431708ee711c5aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:26Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.919239 4962 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://650ed0852746d520fbaa4ec277717c5af3916ba72fb82349af01bdd53ed1916b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5aaa8582764286e5a930fbc4754f253b044e997b856840685dde84db1aef3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:26Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.933951 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.934018 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:26 crc kubenswrapper[4962]: 
I0220 09:56:26.934034 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.934058 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.934074 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:26Z","lastTransitionTime":"2026-02-20T09:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.938693 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wqwgj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1957ac70-30f9-48c2-a82b-72aa3b7a883a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://330fcac483de40973468483bb1e7d1a3978f3e5fb4144bc0efaa58cf02e30e67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4f1f32eda1cc801a5b1f6d84207120c48c1bcca494a88bdfa95fb05bf82f661\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T09:56:17Z\\\",\\\"message\\\":\\\"2026-02-20T09:55:32+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b3f85854-3e0b-4926-bbae-cfacc0ecac44\\\\n2026-02-20T09:55:32+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b3f85854-3e0b-4926-bbae-cfacc0ecac44 to /host/opt/cni/bin/\\\\n2026-02-20T09:55:32Z [verbose] multus-daemon started\\\\n2026-02-20T09:55:32Z [verbose] Readiness Indicator file check\\\\n2026-02-20T09:56:17Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwxzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wqwgj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:26Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.952935 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-htkbf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8526746c-450b-4df8-8ea1-f0cbabd13894\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c50de88e0429b7ecb3939db90dc49ca006cd7d071d9cc97beb31ca64028b9f00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx5m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d5caed5e5e9e66911552cd6f1b7482cc842f5fc1b59863a208fe32ea87303d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx5m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-htkbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:26Z is after 2025-08-24T17:21:41Z" Feb 20 
09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.964104 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:26Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.974105 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s8xxr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a431054f-57c5-41b7-93b2-2d2fbf9949ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15f0d587b054c3050286b2607687a4066cdd413e1623326cf355f9d93b10a2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9fz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s8xxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:26Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:26 crc kubenswrapper[4962]: I0220 09:56:26.983772 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hxb97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0e53ce-e004-473e-be85-ef4c83e399c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e690a224056186e38fa1d7f83d5380f36a05bfff839ca53a87701a721c0c7c0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c27br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hxb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:26Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.037180 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.037247 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.037259 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.037280 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.037292 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:27Z","lastTransitionTime":"2026-02-20T09:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.107051 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 04:21:47.202288435 +0000 UTC Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.140008 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.140067 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.140085 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.140109 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.140127 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:27Z","lastTransitionTime":"2026-02-20T09:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.243508 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.243581 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.243641 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.243675 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.243694 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:27Z","lastTransitionTime":"2026-02-20T09:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.347304 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.347374 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.347425 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.347456 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.347478 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:27Z","lastTransitionTime":"2026-02-20T09:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.451285 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.451369 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.451432 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.451466 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.451487 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:27Z","lastTransitionTime":"2026-02-20T09:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.555086 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.555153 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.555178 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.555209 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.555232 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:27Z","lastTransitionTime":"2026-02-20T09:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.658778 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.658842 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.658858 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.658882 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.658902 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:27Z","lastTransitionTime":"2026-02-20T09:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.671855 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-99b2s_2abd2b70-bb78-49a0-b930-cd066384e803/ovnkube-controller/3.log" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.672921 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-99b2s_2abd2b70-bb78-49a0-b930-cd066384e803/ovnkube-controller/2.log" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.676948 4962 generic.go:334] "Generic (PLEG): container finished" podID="2abd2b70-bb78-49a0-b930-cd066384e803" containerID="0ad31dcb6721b49faaa0b2df2afa2f3d990f410ad4c80b3b5ebb18c428603add" exitCode=1 Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.677017 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" event={"ID":"2abd2b70-bb78-49a0-b930-cd066384e803","Type":"ContainerDied","Data":"0ad31dcb6721b49faaa0b2df2afa2f3d990f410ad4c80b3b5ebb18c428603add"} Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.677089 4962 scope.go:117] "RemoveContainer" containerID="3926877d9025cb33d86d3651716fce6ab1e9aa6467a094f0f6b463db0564bf0a" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.678106 4962 scope.go:117] "RemoveContainer" containerID="0ad31dcb6721b49faaa0b2df2afa2f3d990f410ad4c80b3b5ebb18c428603add" Feb 20 09:56:27 crc kubenswrapper[4962]: E0220 09:56:27.678381 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-99b2s_openshift-ovn-kubernetes(2abd2b70-bb78-49a0-b930-cd066384e803)\"" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.699336 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7065599d0f9481d5becd2057dc2e557e5fb6f4ef22533c1e431708ee711c5aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:27Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.720587 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://650ed0852746d520fbaa4ec277717c5af3916ba72fb82349af01bdd53ed1916b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5aaa8582764286e5a930fbc4754f253b044e997b856840685dde84db1aef3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:27Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.746963 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wqwgj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1957ac70-30f9-48c2-a82b-72aa3b7a883a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://330fcac483de40973468483bb1e7d1a3978f3e5fb4144bc0efaa58cf02e30e67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4f1f32eda1cc801a5b1f6d84207120c48c1bcca494a88bdfa95fb05bf82f661\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T09:56:17Z\\\",\\\"message\\\":\\\"2026-02-20T09:55:32+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b3f85854-3e0b-4926-bbae-cfacc0ecac44\\\\n2026-02-20T09:55:32+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b3f85854-3e0b-4926-bbae-cfacc0ecac44 to /host/opt/cni/bin/\\\\n2026-02-20T09:55:32Z [verbose] multus-daemon started\\\\n2026-02-20T09:55:32Z [verbose] Readiness Indicator file check\\\\n2026-02-20T09:56:17Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwxzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wqwgj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:27Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.762684 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.762750 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.762769 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.762797 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.762820 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:27Z","lastTransitionTime":"2026-02-20T09:56:27Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.780569 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2abd2b70-bb78-49a0-b930-cd066384e803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/s
ecrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117\\
\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ad31dcb6721b49faaa0b2df2afa2f3d990f410ad4c80b3b5ebb18c428603add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3926877d9025cb33d86d3651716fce6ab1e9aa6467a094f0f6b463db0564bf0a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T09:56:01Z\\\",\\\"message\\\":\\\") from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:56:01.196026 6624 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:56:01.196302 6624 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:56:01.196674 6624 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:56:01.196758 6624 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0220 09:56:01.197158 6624 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0220 09:56:01.197199 6624 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0220 09:56:01.197204 6624 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0220 09:56:01.197228 6624 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0220 09:56:01.197225 6624 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0220 09:56:01.197265 6624 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0220 09:56:01.197299 6624 factory.go:656] Stopping watch factory\\\\nI0220 09:56:01.197317 6624 ovnkube.go:599] Stopped ovnkube\\\\nI0220 09:56:01.197319 6624 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0220 
0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:56:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ad31dcb6721b49faaa0b2df2afa2f3d990f410ad4c80b3b5ebb18c428603add\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T09:56:27Z\\\",\\\"message\\\":\\\"ce-ca-operator] map[include.release.openshift.io/hypershift:true include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:serving-cert service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc00716257f \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 8443 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app: service-ca-operator,},ClusterIP:10.217.4.40,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.40],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nF0220 09:56:27.136040 7017 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd4
7ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-99b2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:27Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.815870 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63edef29-013b-4e23-bd29-1e62958c425a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652942c0482deea28d2e07207dcdf38e3f5638a0eea5bb58b73671e39410b99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89d70ad241c203849bc1b2aa38409e9d5a16e13831e18a4f06bb2e5012b849
10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a24ab72e4a25f2b29927eeb2552b6b091c3384a85ecedea7a4de9df19253ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9768dcc00aee095460ac27438e2ee85e824a9b21eee8c2fa137cd7de5de792ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55affbe6d7d089b636f77381908573fe22fb7a903776e26db373f50523280cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:27Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.839801 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"deab583c-05c7-4b7e-a3f6-c01081b17127\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3256d5e9aadc1c300c0388ad5b9682b874dd957f731bde6642c649b044b16ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"amed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771581313\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771581313\\\\\\\\\\\\\\\" (2026-02-20 08:55:12 +0000 UTC to 2027-02-20 08:55:12 +0000 UTC (now=2026-02-20 09:55:28.067421835 +0000 UTC))\\\\\\\"\\\\nI0220 09:55:28.067484 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0220 09:55:28.067516 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0220 09:55:28.067549 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067613 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067655 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-161790428/tls.crt::/tmp/serving-cert-161790428/tls.key\\\\\\\"\\\\nI0220 09:55:28.067781 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0220 09:55:28.067827 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067825 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067865 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0220 09:55:28.067840 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0220 09:55:28.067995 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0220 09:55:28.068006 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0220 09:55:28.068582 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:27Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.857724 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0820106bbb2993145f688614128fa51c5778e9953ee5fc00edba04ff06f3f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:27Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.866449 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.866520 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.866546 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.866577 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.866633 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:27Z","lastTransitionTime":"2026-02-20T09:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.879423 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:27Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.899214 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-htkbf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8526746c-450b-4df8-8ea1-f0cbabd13894\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c50de88e0429b7ecb3939db90dc49ca006cd7d071d9cc97beb31ca64028b9f00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx5m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d5caed5e5e9e66911552cd6f1b7482cc842f5fc1b59863a208fe32ea87303d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx5m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-htkbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:27Z is after 2025-08-24T17:21:41Z" Feb 20 
09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.921896 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:27Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.939586 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s8xxr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a431054f-57c5-41b7-93b2-2d2fbf9949ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15f0d587b054c3050286b2607687a4066cdd413e1623326cf355f9d93b10a2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9fz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s8xxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:27Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.954902 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hxb97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0e53ce-e004-473e-be85-ef4c83e399c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e690a224056186e38fa1d7f83d5380f36a05bfff839ca53a87701a721c0c7c0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c27br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hxb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:27Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.968499 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"751d5e0b-919c-4777-8475-ed7214f7647f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6a54fc4a37387541acda22f8ca81ddfe54e7a9ea682a31abf5e1ba2787f2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dffe20e45069ebd85e9ad49e365ac5180c1e43d02340190615a38900f527e432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m9d46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:27Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.970024 4962 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.970056 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.970068 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.970087 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.970100 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:27Z","lastTransitionTime":"2026-02-20T09:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:27 crc kubenswrapper[4962]: I0220 09:56:27.985473 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de460dbc-9080-4eb3-b509-c43e81162de4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5051fa1280c3183976930435fb2b839acf93914df2c1e93cb28ae17755d534ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f4690722fa343ac9bbe83f1591ce49040d2b6cbe966fc31757944ba3befbbce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\
\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc5a98df99b3ac300355de8efa6cafb2182a2460152990a89b4125a7aef7b850\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9264e0c637c9449882f25caf523f00112c372fad6125a2b4b1b541a515a897dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:27Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.000054 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"096c4ebd-ac7b-45f6-abfa-5d54e4bce009\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faa1dbe5648fbe736a165150e168243abc4486420bf78e560c86ec9cc6a608c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4192cbbb73f2b6d9657f2c58899df4c139b42d17fc4042cef80a7b9b05c5ef26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4192cbbb73f2b6d9657f2c58899df4c139b42d17fc4042cef80a7b9b05c5ef26\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:27Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.014012 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:28Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.037447 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef72d73c-d177-4436-b681-83866e1f6d12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfad5fd92783e0af12f28bd81ccc67f1cf757d57723d98f8fea4f02dc0fea8b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7hj8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:28Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.051710 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03611a46-7966-4587-950e-1d1f967c48c4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7a7a4444fd319ad977e6ac955aeb09b7fa9bc300586a60ba42b1ddb7c823b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b7038b1d182118313e4f3a5d272559ce949ae0b69f819883a0ce752314855b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633d729c617b18ae2841e008b42a6a039b118aef8d357c2bbfaba8a445a417c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee212904f6af57b0dc28dc3da6bf037fb0dfb92937bd5789b8dcb03ea820f62f\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee212904f6af57b0dc28dc3da6bf037fb0dfb92937bd5789b8dcb03ea820f62f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:28Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.066415 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5bwk2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d590527b-ed56-4fb4-a712-b09781618a76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjn55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjn55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:43Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5bwk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:28Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.073040 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.073084 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.073098 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.073120 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.073133 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:28Z","lastTransitionTime":"2026-02-20T09:56:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.107972 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 18:45:50.811630576 +0000 UTC Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.138649 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.138798 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.138813 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.138732 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:56:28 crc kubenswrapper[4962]: E0220 09:56:28.139059 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:56:28 crc kubenswrapper[4962]: E0220 09:56:28.139258 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:56:28 crc kubenswrapper[4962]: E0220 09:56:28.139452 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:56:28 crc kubenswrapper[4962]: E0220 09:56:28.139738 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.175378 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.175461 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.175480 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.175535 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.175556 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:28Z","lastTransitionTime":"2026-02-20T09:56:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.278809 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.278860 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.278882 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.278908 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.278925 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:28Z","lastTransitionTime":"2026-02-20T09:56:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.381973 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.382026 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.382041 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.382064 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.382079 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:28Z","lastTransitionTime":"2026-02-20T09:56:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.484876 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.484943 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.484967 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.485010 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.485030 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:28Z","lastTransitionTime":"2026-02-20T09:56:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.587723 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.587801 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.587826 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.587855 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.587877 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:28Z","lastTransitionTime":"2026-02-20T09:56:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.682522 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-99b2s_2abd2b70-bb78-49a0-b930-cd066384e803/ovnkube-controller/3.log" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.686691 4962 scope.go:117] "RemoveContainer" containerID="0ad31dcb6721b49faaa0b2df2afa2f3d990f410ad4c80b3b5ebb18c428603add" Feb 20 09:56:28 crc kubenswrapper[4962]: E0220 09:56:28.686944 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-99b2s_openshift-ovn-kubernetes(2abd2b70-bb78-49a0-b930-cd066384e803)\"" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.690055 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.690291 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.690435 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.690725 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.691113 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:28Z","lastTransitionTime":"2026-02-20T09:56:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.712661 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:28Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.737580 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef72d73c-d177-4436-b681-83866e1f6d12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfad5fd92783e0af12f28bd81ccc67f1cf757d57723d98f8fea4f02dc0fea8b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"
system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7hj8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:28Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.753557 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"751d5e0b-919c-4777-8475-ed7214f7647f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6a54fc4a37387541acda22f8ca81ddfe54e7a9ea682a31abf5e1ba2787f2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dffe20e45069ebd85e9ad49e365ac5180c1e43d02340190615a38900f527e432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootf
s\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m9d46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:28Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.773764 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de460dbc-9080-4eb3-b509-c43e81162de4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5051fa1280c3183976930435fb2b839acf93914df2c1e93cb28ae17755d534ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f4690722fa343ac9bbe83f1591ce49040d2b6cbe966fc31757944ba3befbbce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/st
atic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc5a98df99b3ac300355de8efa6cafb2182a2460152990a89b4125a7aef7b850\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9264e0c637c9449882f25caf523f00112c372fad6125a2b4b1b541a515a897dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:28Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.788351 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"096c4ebd-ac7b-45f6-abfa-5d54e4bce009\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faa1dbe5648fbe736a165150e168243abc4486420bf78e560c86ec9cc6a608c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4192cbbb73f2b6d9657f2c58899df4c139b42d17fc4042cef80a7b9b05c5ef26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4192cbbb73f2b6d9657f2c58899df4c139b42d17fc4042cef80a7b9b05c5ef26\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:28Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.794280 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.794519 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.794951 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.795166 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.795331 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:28Z","lastTransitionTime":"2026-02-20T09:56:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.804912 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03611a46-7966-4587-950e-1d1f967c48c4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7a7a4444fd319ad977e6ac955aeb09b7fa9bc300586a60ba42b1ddb7c823b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b7038b1d182118313e4f3a5d272559ce949ae0b69f819883a0ce752314855b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"nam
e\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633d729c617b18ae2841e008b42a6a039b118aef8d357c2bbfaba8a445a417c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee212904f6af57b0dc28dc3da6bf037fb0dfb92937bd5789b8dcb03ea820f62f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee212904f6af57b0dc28dc3da6bf037fb0dfb92937bd5789b8dcb03ea820f62f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:28Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.819780 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5bwk2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d590527b-ed56-4fb4-a712-b09781618a76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjn55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjn55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:43Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5bwk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:28Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.839185 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0820106bbb2993145f688614128fa51c5778e9953ee5fc00edba04ff06f3f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:28Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.855544 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:28Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.873752 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7065599d0f9481d5becd2057dc2e557e5fb6f4ef22533c1e431708ee711c5aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:28Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.889811 4962 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://650ed0852746d520fbaa4ec277717c5af3916ba72fb82349af01bdd53ed1916b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5aaa8582764286e5a930fbc4754f253b044e997b856840685dde84db1aef3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:28Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.898699 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.898738 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:28 crc kubenswrapper[4962]: 
I0220 09:56:28.898753 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.898774 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.898787 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:28Z","lastTransitionTime":"2026-02-20T09:56:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.908006 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wqwgj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1957ac70-30f9-48c2-a82b-72aa3b7a883a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://330fcac483de40973468483bb1e7d1a3978f3e5fb4144bc0efaa58cf02e30e67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4f1f32eda1cc801a5b1f6d84207120c48c1bcca494a88bdfa95fb05bf82f661\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T09:56:17Z\\\",\\\"message\\\":\\\"2026-02-20T09:55:32+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b3f85854-3e0b-4926-bbae-cfacc0ecac44\\\\n2026-02-20T09:55:32+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b3f85854-3e0b-4926-bbae-cfacc0ecac44 to /host/opt/cni/bin/\\\\n2026-02-20T09:55:32Z [verbose] multus-daemon started\\\\n2026-02-20T09:55:32Z [verbose] Readiness Indicator file check\\\\n2026-02-20T09:56:17Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwxzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wqwgj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:28Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.930934 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2abd2b70-bb78-49a0-b930-cd066384e803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-s
ocket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ad31dcb6721b49faaa0b2df2afa2f3d990f410ad4c80b3b5ebb18c428603add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ad31dcb6721b49faaa0b2df2afa2f3d990f410ad4c80b3b5ebb18c428603add\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T09:56:27Z\\\",\\\"message\\\":\\\"ce-ca-operator] map[include.release.openshift.io/hypershift:true include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:serving-cert service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc00716257f \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 8443 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app: service-ca-operator,},ClusterIP:10.217.4.40,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.40],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nF0220 09:56:27.136040 7017 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:56:26Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-99b2s_openshift-ovn-kubernetes(2abd2b70-bb78-49a0-b930-cd066384e803)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-99b2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:28Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.958686 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63edef29-013b-4e23-bd29-1e62958c425a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652942c0482deea28d2e07207dcdf38e3f5638a0eea5bb58b73671e39410b99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89d70ad241c203849bc1b2aa
38409e9d5a16e13831e18a4f06bb2e5012b84910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a24ab72e4a25f2b29927eeb2552b6b091c3384a85ecedea7a4de9df19253ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9768dcc00aee095460ac27438e2ee85e824a9b21eee8c2fa137cd7de5de792ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55affbe6d7d089b636f77381908573fe22fb7a903776e26db373f50523280cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:28Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.974056 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"deab583c-05c7-4b7e-a3f6-c01081b17127\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3256d5e9aadc1c300c0388ad5b9682b874dd957f731bde6642c649b044b16ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"amed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771581313\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771581313\\\\\\\\\\\\\\\" (2026-02-20 08:55:12 +0000 UTC to 2027-02-20 08:55:12 +0000 UTC (now=2026-02-20 09:55:28.067421835 +0000 UTC))\\\\\\\"\\\\nI0220 09:55:28.067484 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0220 09:55:28.067516 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0220 09:55:28.067549 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067613 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067655 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-161790428/tls.crt::/tmp/serving-cert-161790428/tls.key\\\\\\\"\\\\nI0220 09:55:28.067781 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0220 09:55:28.067827 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067825 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067865 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0220 09:55:28.067840 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0220 09:55:28.067995 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0220 09:55:28.068006 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0220 09:55:28.068582 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:28Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.985090 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-htkbf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8526746c-450b-4df8-8ea1-f0cbabd13894\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c50de88e0429b7ecb3939db90dc49ca006cd7d071d9cc97beb31ca64028b9f00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx5m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d5caed5e5e9e66911552cd6f1b7482cc842f5fc1b59863a208fe32ea87303d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx5m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-htkbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:28Z is after 2025-08-24T17:21:41Z" Feb 20 
09:56:28 crc kubenswrapper[4962]: I0220 09:56:28.998610 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hxb97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0e53ce-e004-473e-be85-ef4c83e399c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e690a224056186e38fa1d7f83d5380f36a05bfff839ca53a87701a721c0c7c0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c27br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hxb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:28Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.002423 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.002498 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.002512 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.002533 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.002547 4962 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:29Z","lastTransitionTime":"2026-02-20T09:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.013192 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:29Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.026763 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s8xxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a431054f-57c5-41b7-93b2-2d2fbf9949ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15f0d587b054c3050286b2607687a4066cdd413e1623326cf355f9d93b10a2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9fz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s8xxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-20T09:56:29Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.105366 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.105457 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.105477 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.105510 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.105529 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:29Z","lastTransitionTime":"2026-02-20T09:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.108817 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 00:17:01.957953713 +0000 UTC Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.156153 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deab583c-05c7-4b7e-a3f6-c01081b17127\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01\\\"
,\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3256d5e9aadc1c300c0388ad5b9682b874dd957f731bde6642c649b044b16ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"amed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771581313\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771581313\\\\\\\\\\\\\\\" (2026-02-20 08:55:12 +0000 UTC to 2027-02-20 08:55:12 +0000 UTC (now=2026-02-20 09:55:28.067421835 +0000 UTC))\\\\\\\"\\\\nI0220 09:55:28.067484 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0220 09:55:28.067516 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0220 09:55:28.067549 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067613 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067655 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-161790428/tls.crt::/tmp/serving-cert-161790428/tls.key\\\\\\\"\\\\nI0220 09:55:28.067781 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0220 09:55:28.067827 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0220 
09:55:28.067825 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067865 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0220 09:55:28.067840 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0220 09:55:28.067995 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0220 09:55:28.068006 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0220 09:55:28.068582 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:29Z is 
after 2025-08-24T17:21:41Z" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.176475 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0820106bbb2993145f688614128fa51c5778e9953ee5fc00edba04ff06f3f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:29Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.193573 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:29Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.209093 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.209341 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.209468 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.209662 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.209805 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:29Z","lastTransitionTime":"2026-02-20T09:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.213338 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7065599d0f9481d5becd2057dc2e557e5fb6f4ef22533c1e431708ee711c5aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:29Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.235169 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://650ed0852746d520fbaa4ec277717c5af3916ba72fb82349af01bdd53ed1916b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5aaa8582764286e5a930fbc4754f253b044e997b856840685dde84db1aef3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:29Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.254838 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wqwgj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1957ac70-30f9-48c2-a82b-72aa3b7a883a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://330fcac483de40973468483bb1e7d1a3978f3e5fb4144bc0efaa58cf02e30e67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4f1f32eda1cc801a5b1f6d84207120c48c1bcca494a88bdfa95fb05bf82f661\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T09:56:17Z\\\",\\\"message\\\":\\\"2026-02-20T09:55:32+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b3f85854-3e0b-4926-bbae-cfacc0ecac44\\\\n2026-02-20T09:55:32+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b3f85854-3e0b-4926-bbae-cfacc0ecac44 to /host/opt/cni/bin/\\\\n2026-02-20T09:55:32Z [verbose] multus-daemon started\\\\n2026-02-20T09:55:32Z [verbose] Readiness Indicator file check\\\\n2026-02-20T09:56:17Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwxzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wqwgj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:29Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.287258 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2abd2b70-bb78-49a0-b930-cd066384e803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-s
ocket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ad31dcb6721b49faaa0b2df2afa2f3d990f410ad4c80b3b5ebb18c428603add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ad31dcb6721b49faaa0b2df2afa2f3d990f410ad4c80b3b5ebb18c428603add\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T09:56:27Z\\\",\\\"message\\\":\\\"ce-ca-operator] map[include.release.openshift.io/hypershift:true include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:serving-cert service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc00716257f \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 8443 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app: service-ca-operator,},ClusterIP:10.217.4.40,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.40],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nF0220 09:56:27.136040 7017 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:56:26Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-99b2s_openshift-ovn-kubernetes(2abd2b70-bb78-49a0-b930-cd066384e803)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-99b2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:29Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.308018 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63edef29-013b-4e23-bd29-1e62958c425a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652942c0482deea28d2e07207dcdf38e3f5638a0eea5bb58b73671e39410b99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89d70ad241c203849bc1b2aa
38409e9d5a16e13831e18a4f06bb2e5012b84910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a24ab72e4a25f2b29927eeb2552b6b091c3384a85ecedea7a4de9df19253ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9768dcc00aee095460ac27438e2ee85e824a9b21eee8c2fa137cd7de5de792ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55affbe6d7d089b636f77381908573fe22fb7a903776e26db373f50523280cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:29Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.312490 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.312539 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:29 crc 
kubenswrapper[4962]: I0220 09:56:29.312565 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.312620 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.312657 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:29Z","lastTransitionTime":"2026-02-20T09:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.321343 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-htkbf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8526746c-450b-4df8-8ea1-f0cbabd13894\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c50de88e0429b7ecb3939db90dc49ca006cd7d071d9cc97beb31ca64028b9f00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx5m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d5caed5e5e9e66911552cd6f1b7482cc842f5fc1b59863a208fe32ea87303d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:5
5:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx5m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-htkbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:29Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.332893 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s8xxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a431054f-57c5-41b7-93b2-2d2fbf9949ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15f0d587b054c3050286b2607687a4066cdd413e1623326cf355f9d93b10a2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9fz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s8xxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:29Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.345954 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hxb97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0e53ce-e004-473e-be85-ef4c83e399c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e690a224056186e38fa1d7f83d5380f36a05bfff839ca53a87701a721c0c7c0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c27br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hxb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:29Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.363367 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:29Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.374003 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"096c4ebd-ac7b-45f6-abfa-5d54e4bce009\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faa1dbe5648fbe736a165150e168243abc4486420bf78e560c86ec9cc6a608c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4192cbbb73f2b6d9657f2c58899df4c139b42d17fc4042cef80a7b9b05c5ef26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4192cbbb73f2b6d9657f2c58899df4c139b42d17fc4042cef80a7b9b05c5ef26\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:29Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.392928 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:29Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.414988 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef72d73c-d177-4436-b681-83866e1f6d12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfad5fd92783e0af12f28bd81ccc67f1cf757d57723d98f8fea4f02dc0fea8b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7hj8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:29Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.415627 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.415710 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:29 crc 
kubenswrapper[4962]: I0220 09:56:29.415730 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.415765 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.415782 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:29Z","lastTransitionTime":"2026-02-20T09:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.427556 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"751d5e0b-919c-4777-8475-ed7214f7647f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6a54fc4a37387541acda22f8ca81ddfe54e7a9ea682a31abf5e1ba2787f2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dffe20e45069ebd85e9ad49e365ac5180c1e43d02340190615a38900f527e432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\
\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m9d46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:29Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.443888 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de460dbc-9080-4eb3-b509-c43e81162de4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5051fa1280c3183976930435fb2b839acf93914df2c1e93cb28ae17755d534ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f4690722fa343ac9bbe83f1591ce49040d2b6cbe966fc31757944ba3befbbce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"res
ource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc5a98df99b3ac300355de8efa6cafb2182a2460152990a89b4125a7aef7b850\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9264e0c637c9449882f25caf523f00112c372fad6125a2b4b1b541a515a897dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:29Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.457791 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5bwk2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d590527b-ed56-4fb4-a712-b09781618a76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjn55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjn55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:43Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5bwk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:29Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.471535 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03611a46-7966-4587-950e-1d1f967c48c4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7a7a4444fd319ad977e6ac955aeb09b7fa9bc300586a60ba42b1ddb7c823b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b7038b1d182118313e4f3a5d272559ce949ae0b69f819883a0ce752314855b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633d729c617b18ae2841e008b42a6a039b118aef8d357c2bbfaba8a445a417c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee212904f6af57b0dc28dc3da6bf037fb0dfb92937bd5789b8dcb03ea820f62f\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee212904f6af57b0dc28dc3da6bf037fb0dfb92937bd5789b8dcb03ea820f62f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:29Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.519287 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.519331 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.519342 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.519362 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.519374 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:29Z","lastTransitionTime":"2026-02-20T09:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.621446 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.621506 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.621731 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.621759 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.621779 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:29Z","lastTransitionTime":"2026-02-20T09:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.724364 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.724412 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.724423 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.724442 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.724453 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:29Z","lastTransitionTime":"2026-02-20T09:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.827430 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.827495 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.827914 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.827982 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.828265 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:29Z","lastTransitionTime":"2026-02-20T09:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.931101 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.931143 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.931152 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.931180 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:29 crc kubenswrapper[4962]: I0220 09:56:29.931190 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:29Z","lastTransitionTime":"2026-02-20T09:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.033419 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.033507 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.033527 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.033559 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.033618 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:30Z","lastTransitionTime":"2026-02-20T09:56:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.109430 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 21:23:22.05692129 +0000 UTC Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.136271 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.136312 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.136322 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.136341 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.136353 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:30Z","lastTransitionTime":"2026-02-20T09:56:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.138688 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.138720 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.138739 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.138720 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:56:30 crc kubenswrapper[4962]: E0220 09:56:30.138794 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:56:30 crc kubenswrapper[4962]: E0220 09:56:30.138984 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76" Feb 20 09:56:30 crc kubenswrapper[4962]: E0220 09:56:30.139146 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:56:30 crc kubenswrapper[4962]: E0220 09:56:30.139187 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.239186 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.239267 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.239316 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.239341 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.239358 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:30Z","lastTransitionTime":"2026-02-20T09:56:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.342469 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.342538 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.342557 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.342589 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.342702 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:30Z","lastTransitionTime":"2026-02-20T09:56:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.445923 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.445985 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.446007 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.446037 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.446059 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:30Z","lastTransitionTime":"2026-02-20T09:56:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.549726 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.549795 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.549818 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.549850 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.549873 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:30Z","lastTransitionTime":"2026-02-20T09:56:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.653278 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.653583 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.653705 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.653786 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.653852 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:30Z","lastTransitionTime":"2026-02-20T09:56:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.757112 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.757273 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.757306 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.757337 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.757360 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:30Z","lastTransitionTime":"2026-02-20T09:56:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.860354 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.860416 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.860433 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.860506 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.860538 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:30Z","lastTransitionTime":"2026-02-20T09:56:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.965129 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.965190 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.965204 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.965229 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:30 crc kubenswrapper[4962]: I0220 09:56:30.965245 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:30Z","lastTransitionTime":"2026-02-20T09:56:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.068994 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.069062 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.069080 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.069106 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.069127 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:31Z","lastTransitionTime":"2026-02-20T09:56:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.110094 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 21:24:32.788895851 +0000 UTC Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.171182 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.171213 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.171221 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.171233 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.171242 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:31Z","lastTransitionTime":"2026-02-20T09:56:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.274519 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.274579 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.274621 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.274645 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.274666 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:31Z","lastTransitionTime":"2026-02-20T09:56:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.377250 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.377330 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.377355 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.377406 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.377432 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:31Z","lastTransitionTime":"2026-02-20T09:56:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.481607 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.481660 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.481674 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.481695 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.481710 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:31Z","lastTransitionTime":"2026-02-20T09:56:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.585709 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.585773 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.585792 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.585818 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.585838 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:31Z","lastTransitionTime":"2026-02-20T09:56:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.688340 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.688376 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.688390 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.688408 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.688421 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:31Z","lastTransitionTime":"2026-02-20T09:56:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.791267 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.791327 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.791352 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.791395 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.791421 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:31Z","lastTransitionTime":"2026-02-20T09:56:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.894661 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.894721 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.894739 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.894759 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.894772 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:31Z","lastTransitionTime":"2026-02-20T09:56:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.998320 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.998368 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.998381 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.998400 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:31 crc kubenswrapper[4962]: I0220 09:56:31.998413 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:31Z","lastTransitionTime":"2026-02-20T09:56:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.045694 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:56:32 crc kubenswrapper[4962]: E0220 09:56:32.045837 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:57:36.045811087 +0000 UTC m=+147.628282943 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.046343 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.046537 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.046818 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:56:32 crc kubenswrapper[4962]: E0220 09:56:32.046560 4962 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 20 09:56:32 crc kubenswrapper[4962]: E0220 09:56:32.047280 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-20 09:57:36.047257391 +0000 UTC m=+147.629729237 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 20 09:56:32 crc kubenswrapper[4962]: E0220 09:56:32.046685 4962 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 20 09:56:32 crc kubenswrapper[4962]: E0220 09:56:32.047490 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-20 09:57:36.047478738 +0000 UTC m=+147.629950584 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 20 09:56:32 crc kubenswrapper[4962]: E0220 09:56:32.046914 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 20 09:56:32 crc kubenswrapper[4962]: E0220 09:56:32.047627 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 20 09:56:32 crc kubenswrapper[4962]: E0220 09:56:32.047641 4962 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 09:56:32 crc kubenswrapper[4962]: E0220 09:56:32.047668 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-20 09:57:36.047660274 +0000 UTC m=+147.630132120 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.101545 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.101655 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.101678 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.101715 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.101741 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:32Z","lastTransitionTime":"2026-02-20T09:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.110994 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-18 08:58:52.429492638 +0000 UTC Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.138426 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.138450 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:56:32 crc kubenswrapper[4962]: E0220 09:56:32.138950 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.138533 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:56:32 crc kubenswrapper[4962]: E0220 09:56:32.139532 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.138484 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:56:32 crc kubenswrapper[4962]: E0220 09:56:32.139824 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76" Feb 20 09:56:32 crc kubenswrapper[4962]: E0220 09:56:32.138942 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.148298 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:56:32 crc kubenswrapper[4962]: E0220 09:56:32.148557 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 20 09:56:32 crc kubenswrapper[4962]: E0220 09:56:32.148629 4962 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 20 09:56:32 crc kubenswrapper[4962]: E0220 09:56:32.148648 4962 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 09:56:32 crc kubenswrapper[4962]: E0220 09:56:32.148726 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-20 09:57:36.148704585 +0000 UTC m=+147.731176451 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.204950 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.205032 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.205051 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.205075 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.205092 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:32Z","lastTransitionTime":"2026-02-20T09:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.307520 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.307622 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.307649 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.307678 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.307697 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:32Z","lastTransitionTime":"2026-02-20T09:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.411736 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.411823 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.411850 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.411884 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.411908 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:32Z","lastTransitionTime":"2026-02-20T09:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.515749 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.515822 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.515839 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.515867 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.515891 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:32Z","lastTransitionTime":"2026-02-20T09:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.620667 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.620730 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.620745 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.620766 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.620777 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:32Z","lastTransitionTime":"2026-02-20T09:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.723894 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.723957 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.723976 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.723997 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.724013 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:32Z","lastTransitionTime":"2026-02-20T09:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.826908 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.826965 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.826978 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.826997 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.827008 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:32Z","lastTransitionTime":"2026-02-20T09:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.930019 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.930082 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.930101 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.930133 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:32 crc kubenswrapper[4962]: I0220 09:56:32.930157 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:32Z","lastTransitionTime":"2026-02-20T09:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.034081 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.034156 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.034174 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.034199 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.034217 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:33Z","lastTransitionTime":"2026-02-20T09:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.111144 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-25 02:44:01.473678965 +0000 UTC Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.137924 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.138014 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.138037 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.138091 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.138108 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:33Z","lastTransitionTime":"2026-02-20T09:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.243183 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.243305 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.243326 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.243353 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.243375 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:33Z","lastTransitionTime":"2026-02-20T09:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.347691 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.347834 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.347857 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.347919 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.347942 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:33Z","lastTransitionTime":"2026-02-20T09:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.452090 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.452167 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.452229 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.452260 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.452280 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:33Z","lastTransitionTime":"2026-02-20T09:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.555949 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.556014 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.556025 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.556046 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.556060 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:33Z","lastTransitionTime":"2026-02-20T09:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.659144 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.659175 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.659185 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.659201 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.659211 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:33Z","lastTransitionTime":"2026-02-20T09:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.761864 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.761891 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.761900 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.761914 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.761923 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:33Z","lastTransitionTime":"2026-02-20T09:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.864372 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.864405 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.864413 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.864428 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.864438 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:33Z","lastTransitionTime":"2026-02-20T09:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.967165 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.967194 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.967204 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.967218 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:33 crc kubenswrapper[4962]: I0220 09:56:33.967228 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:33Z","lastTransitionTime":"2026-02-20T09:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.070255 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.070319 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.070341 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.070374 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.070398 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:34Z","lastTransitionTime":"2026-02-20T09:56:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.111872 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 09:48:56.618395552 +0000 UTC Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.138498 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.138652 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.138661 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.138666 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:56:34 crc kubenswrapper[4962]: E0220 09:56:34.138854 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:56:34 crc kubenswrapper[4962]: E0220 09:56:34.139076 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76" Feb 20 09:56:34 crc kubenswrapper[4962]: E0220 09:56:34.139171 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:56:34 crc kubenswrapper[4962]: E0220 09:56:34.139482 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.172911 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.172958 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.172976 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.173015 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.173041 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:34Z","lastTransitionTime":"2026-02-20T09:56:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.276789 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.276856 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.276874 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.276902 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.276922 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:34Z","lastTransitionTime":"2026-02-20T09:56:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.380747 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.380812 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.380826 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.380849 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.380862 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:34Z","lastTransitionTime":"2026-02-20T09:56:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.483612 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.483675 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.483694 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.483722 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.483742 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:34Z","lastTransitionTime":"2026-02-20T09:56:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.586580 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.586740 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.586769 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.586816 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.586837 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:34Z","lastTransitionTime":"2026-02-20T09:56:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.689754 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.689824 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.689843 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.689872 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.689891 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:34Z","lastTransitionTime":"2026-02-20T09:56:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.792728 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.792791 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.792802 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.792836 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.792848 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:34Z","lastTransitionTime":"2026-02-20T09:56:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.895302 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.895364 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.895399 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.895421 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.895440 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:34Z","lastTransitionTime":"2026-02-20T09:56:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.998825 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.998908 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.998928 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.998954 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:34 crc kubenswrapper[4962]: I0220 09:56:34.998975 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:34Z","lastTransitionTime":"2026-02-20T09:56:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:35 crc kubenswrapper[4962]: I0220 09:56:35.102376 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:35 crc kubenswrapper[4962]: I0220 09:56:35.102441 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:35 crc kubenswrapper[4962]: I0220 09:56:35.102455 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:35 crc kubenswrapper[4962]: I0220 09:56:35.102479 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:35 crc kubenswrapper[4962]: I0220 09:56:35.102495 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:35Z","lastTransitionTime":"2026-02-20T09:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:35 crc kubenswrapper[4962]: I0220 09:56:35.112772 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 06:17:49.663363892 +0000 UTC Feb 20 09:56:35 crc kubenswrapper[4962]: I0220 09:56:35.205119 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:35 crc kubenswrapper[4962]: I0220 09:56:35.205166 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:35 crc kubenswrapper[4962]: I0220 09:56:35.205181 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:35 crc kubenswrapper[4962]: I0220 09:56:35.205205 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:35 crc kubenswrapper[4962]: I0220 09:56:35.205222 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:35Z","lastTransitionTime":"2026-02-20T09:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:35 crc kubenswrapper[4962]: I0220 09:56:35.308573 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:35 crc kubenswrapper[4962]: I0220 09:56:35.309177 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:35 crc kubenswrapper[4962]: I0220 09:56:35.309199 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:35 crc kubenswrapper[4962]: I0220 09:56:35.309226 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:35 crc kubenswrapper[4962]: I0220 09:56:35.309248 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:35Z","lastTransitionTime":"2026-02-20T09:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:35 crc kubenswrapper[4962]: I0220 09:56:35.412553 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:35 crc kubenswrapper[4962]: I0220 09:56:35.412697 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:35 crc kubenswrapper[4962]: I0220 09:56:35.412724 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:35 crc kubenswrapper[4962]: I0220 09:56:35.412758 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:35 crc kubenswrapper[4962]: I0220 09:56:35.412785 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:35Z","lastTransitionTime":"2026-02-20T09:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:35 crc kubenswrapper[4962]: I0220 09:56:35.519112 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:35 crc kubenswrapper[4962]: I0220 09:56:35.519227 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:35 crc kubenswrapper[4962]: I0220 09:56:35.519259 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:35 crc kubenswrapper[4962]: I0220 09:56:35.519292 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:35 crc kubenswrapper[4962]: I0220 09:56:35.519310 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:35Z","lastTransitionTime":"2026-02-20T09:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:35 crc kubenswrapper[4962]: I0220 09:56:35.622555 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:35 crc kubenswrapper[4962]: I0220 09:56:35.622807 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:35 crc kubenswrapper[4962]: I0220 09:56:35.622837 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:35 crc kubenswrapper[4962]: I0220 09:56:35.622875 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:35 crc kubenswrapper[4962]: I0220 09:56:35.622902 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:35Z","lastTransitionTime":"2026-02-20T09:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:35 crc kubenswrapper[4962]: I0220 09:56:35.725390 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:35 crc kubenswrapper[4962]: I0220 09:56:35.725459 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:35 crc kubenswrapper[4962]: I0220 09:56:35.725477 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:35 crc kubenswrapper[4962]: I0220 09:56:35.725504 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:35 crc kubenswrapper[4962]: I0220 09:56:35.725522 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:35Z","lastTransitionTime":"2026-02-20T09:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:35 crc kubenswrapper[4962]: I0220 09:56:35.828501 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:35 crc kubenswrapper[4962]: I0220 09:56:35.828560 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:35 crc kubenswrapper[4962]: I0220 09:56:35.828581 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:35 crc kubenswrapper[4962]: I0220 09:56:35.828643 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:35 crc kubenswrapper[4962]: I0220 09:56:35.828662 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:35Z","lastTransitionTime":"2026-02-20T09:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:35 crc kubenswrapper[4962]: I0220 09:56:35.932617 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:35 crc kubenswrapper[4962]: I0220 09:56:35.932677 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:35 crc kubenswrapper[4962]: I0220 09:56:35.932689 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:35 crc kubenswrapper[4962]: I0220 09:56:35.932713 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:35 crc kubenswrapper[4962]: I0220 09:56:35.932726 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:35Z","lastTransitionTime":"2026-02-20T09:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.036081 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.036151 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.036176 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.036209 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.036236 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:36Z","lastTransitionTime":"2026-02-20T09:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.113521 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 03:04:34.405194703 +0000 UTC Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.137934 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.138024 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.138121 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:56:36 crc kubenswrapper[4962]: E0220 09:56:36.138113 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.138358 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:56:36 crc kubenswrapper[4962]: E0220 09:56:36.138343 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76" Feb 20 09:56:36 crc kubenswrapper[4962]: E0220 09:56:36.138434 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:56:36 crc kubenswrapper[4962]: E0220 09:56:36.138689 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.139704 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.139780 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.139802 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.139834 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.139858 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:36Z","lastTransitionTime":"2026-02-20T09:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.243508 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.243577 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.243625 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.243654 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.243680 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:36Z","lastTransitionTime":"2026-02-20T09:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.326636 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.326690 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.326703 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.326725 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.326735 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:36Z","lastTransitionTime":"2026-02-20T09:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:36 crc kubenswrapper[4962]: E0220 09:56:36.341029 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3742e1d0-bedb-4f62-b44c-22d7df4a090f\\\",\\\"systemUUID\\\":\\\"0de0f937-0896-4e78-90b7-d2d7a1bed2a9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:36Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.346714 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.346758 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.346767 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.346787 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.346802 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:36Z","lastTransitionTime":"2026-02-20T09:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:36 crc kubenswrapper[4962]: E0220 09:56:36.365777 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3742e1d0-bedb-4f62-b44c-22d7df4a090f\\\",\\\"systemUUID\\\":\\\"0de0f937-0896-4e78-90b7-d2d7a1bed2a9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:36Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.370770 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.370808 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.370824 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.370845 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.370860 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:36Z","lastTransitionTime":"2026-02-20T09:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:36 crc kubenswrapper[4962]: E0220 09:56:36.390327 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3742e1d0-bedb-4f62-b44c-22d7df4a090f\\\",\\\"systemUUID\\\":\\\"0de0f937-0896-4e78-90b7-d2d7a1bed2a9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:36Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.395578 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.395685 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.395709 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.395741 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.395759 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:36Z","lastTransitionTime":"2026-02-20T09:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:36 crc kubenswrapper[4962]: E0220 09:56:36.412707 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3742e1d0-bedb-4f62-b44c-22d7df4a090f\\\",\\\"systemUUID\\\":\\\"0de0f937-0896-4e78-90b7-d2d7a1bed2a9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:36Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.417581 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.417634 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.417643 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.417661 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.417671 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:36Z","lastTransitionTime":"2026-02-20T09:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:36 crc kubenswrapper[4962]: E0220 09:56:36.432325 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3742e1d0-bedb-4f62-b44c-22d7df4a090f\\\",\\\"systemUUID\\\":\\\"0de0f937-0896-4e78-90b7-d2d7a1bed2a9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:36Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:36 crc kubenswrapper[4962]: E0220 09:56:36.432445 4962 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.434984 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.435028 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.435041 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.435066 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.435079 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:36Z","lastTransitionTime":"2026-02-20T09:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.537943 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.538017 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.538032 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.538052 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.538070 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:36Z","lastTransitionTime":"2026-02-20T09:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.641721 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.641819 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.641848 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.642072 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.642096 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:36Z","lastTransitionTime":"2026-02-20T09:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.745130 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.745195 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.745213 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.745242 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.745262 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:36Z","lastTransitionTime":"2026-02-20T09:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.847961 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.848036 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.848053 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.848080 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.848098 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:36Z","lastTransitionTime":"2026-02-20T09:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.951055 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.951091 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.951099 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.951114 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:36 crc kubenswrapper[4962]: I0220 09:56:36.951125 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:36Z","lastTransitionTime":"2026-02-20T09:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.054319 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.054381 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.054399 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.054424 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.054443 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:37Z","lastTransitionTime":"2026-02-20T09:56:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.114258 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 00:13:16.314443481 +0000 UTC Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.157083 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.157142 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.157159 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.157183 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.157200 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:37Z","lastTransitionTime":"2026-02-20T09:56:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.260720 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.260752 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.260763 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.260782 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.260964 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:37Z","lastTransitionTime":"2026-02-20T09:56:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.364693 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.364771 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.364793 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.364825 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.364847 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:37Z","lastTransitionTime":"2026-02-20T09:56:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.467706 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.467748 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.467759 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.467776 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.467787 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:37Z","lastTransitionTime":"2026-02-20T09:56:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.571385 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.571457 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.571478 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.571506 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.571527 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:37Z","lastTransitionTime":"2026-02-20T09:56:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.675206 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.675326 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.675347 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.675418 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.675441 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:37Z","lastTransitionTime":"2026-02-20T09:56:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.779580 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.779685 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.779704 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.779734 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.779758 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:37Z","lastTransitionTime":"2026-02-20T09:56:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.882890 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.882982 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.883016 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.883049 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.883072 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:37Z","lastTransitionTime":"2026-02-20T09:56:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.985931 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.985985 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.985995 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.986013 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:37 crc kubenswrapper[4962]: I0220 09:56:37.986025 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:37Z","lastTransitionTime":"2026-02-20T09:56:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.089139 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.089204 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.089223 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.089250 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.089270 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:38Z","lastTransitionTime":"2026-02-20T09:56:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.114871 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 07:02:21.068437752 +0000 UTC Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.138155 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.138279 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.138183 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:56:38 crc kubenswrapper[4962]: E0220 09:56:38.138430 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.138306 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:56:38 crc kubenswrapper[4962]: E0220 09:56:38.138512 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76" Feb 20 09:56:38 crc kubenswrapper[4962]: E0220 09:56:38.138641 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:56:38 crc kubenswrapper[4962]: E0220 09:56:38.138784 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.192507 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.192584 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.192667 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.192698 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.192722 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:38Z","lastTransitionTime":"2026-02-20T09:56:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.296063 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.296120 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.296140 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.296170 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.296189 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:38Z","lastTransitionTime":"2026-02-20T09:56:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.399863 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.399929 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.399946 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.399973 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.399996 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:38Z","lastTransitionTime":"2026-02-20T09:56:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.502854 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.502931 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.502951 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.502985 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.503011 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:38Z","lastTransitionTime":"2026-02-20T09:56:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.607170 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.607235 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.607252 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.607273 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.607286 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:38Z","lastTransitionTime":"2026-02-20T09:56:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.710631 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.710732 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.710751 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.710776 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.710797 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:38Z","lastTransitionTime":"2026-02-20T09:56:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.814345 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.814426 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.814444 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.814474 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.814503 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:38Z","lastTransitionTime":"2026-02-20T09:56:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.918953 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.919029 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.919047 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.919077 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:38 crc kubenswrapper[4962]: I0220 09:56:38.919097 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:38Z","lastTransitionTime":"2026-02-20T09:56:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.022957 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.023038 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.023067 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.023103 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.023129 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:39Z","lastTransitionTime":"2026-02-20T09:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.115086 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 16:15:16.386862125 +0000 UTC Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.127112 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.127175 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.127194 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.127224 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.127246 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:39Z","lastTransitionTime":"2026-02-20T09:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.166031 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03611a46-7966-4587-950e-1d1f967c48c4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7a7a4444fd319ad977e6ac955aeb09b7fa9bc300586a60ba42b1ddb7c823b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b7038b1d182118313e4f3a5d272559ce949ae0b69f819883a0ce752314855b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://633d729c617b18ae2841e008b42a6a039b118aef8d357c2bbfaba8a445a417c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"
cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee212904f6af57b0dc28dc3da6bf037fb0dfb92937bd5789b8dcb03ea820f62f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee212904f6af57b0dc28dc3da6bf037fb0dfb92937bd5789b8dcb03ea820f62f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:39Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.188145 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5bwk2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d590527b-ed56-4fb4-a712-b09781618a76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjn55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjn55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:43Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5bwk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:39Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.210473 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7065599d0f9481d5becd2057dc2e557e5fb6f4ef22533c1e431708ee711c5aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:39Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.230844 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.230912 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.230930 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.230964 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.230989 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:39Z","lastTransitionTime":"2026-02-20T09:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.232453 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://650ed0852746d520fbaa4ec277717c5af3916ba72fb82349af01bdd53ed1916b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5aaa8582764286e5a930fbc4754f253b044e997b856840685dde84db1aef3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:39Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.257948 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wqwgj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1957ac70-30f9-48c2-a82b-72aa3b7a883a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://330fcac483de40973468483bb1e7d1a3978f3e5fb4144bc0efaa58cf02e30e67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4f1f32eda1cc801a5b1f6d84207120c48c1bcca494a88bdfa95fb05bf82f661\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T09:56:17Z\\\",\\\"message\\\":\\\"2026-02-20T09:55:32+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b3f85854-3e0b-4926-bbae-cfacc0ecac44\\\\n2026-02-20T09:55:32+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b3f85854-3e0b-4926-bbae-cfacc0ecac44 to /host/opt/cni/bin/\\\\n2026-02-20T09:55:32Z [verbose] multus-daemon started\\\\n2026-02-20T09:55:32Z [verbose] Readiness Indicator file check\\\\n2026-02-20T09:56:17Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwxzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wqwgj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:39Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.288102 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2abd2b70-bb78-49a0-b930-cd066384e803\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-s
ocket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ad31dcb6721b49faaa0b2df2afa2f3d990f410ad4c80b3b5ebb18c428603add\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ad31dcb6721b49faaa0b2df2afa2f3d990f410ad4c80b3b5ebb18c428603add\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T09:56:27Z\\\",\\\"message\\\":\\\"ce-ca-operator] map[include.release.openshift.io/hypershift:true include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:serving-cert service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc00716257f \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 8443 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app: service-ca-operator,},ClusterIP:10.217.4.40,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.40],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nF0220 09:56:27.136040 7017 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:56:26Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-99b2s_openshift-ovn-kubernetes(2abd2b70-bb78-49a0-b930-cd066384e803)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85mbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-99b2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:39Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.325863 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63edef29-013b-4e23-bd29-1e62958c425a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652942c0482deea28d2e07207dcdf38e3f5638a0eea5bb58b73671e39410b99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89d70ad241c203849bc1b2aa
38409e9d5a16e13831e18a4f06bb2e5012b84910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a24ab72e4a25f2b29927eeb2552b6b091c3384a85ecedea7a4de9df19253ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9768dcc00aee095460ac27438e2ee85e824a9b21eee8c2fa137cd7de5de792ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55affbe6d7d089b636f77381908573fe22fb7a903776e26db373f50523280cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c59f145431d76c989efcafe88870326c97da4995fd4d4b3b380148affcc631\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bca4ae060ab8c063793a9d36e889453bbd85297566a767ccb7527a5579120f59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d4a397c3acf5d735ca38a20bd68acd53746b7a8158e56e10c91e88726271248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:39Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.334734 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.334799 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:39 crc 
kubenswrapper[4962]: I0220 09:56:39.334819 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.334848 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.334867 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:39Z","lastTransitionTime":"2026-02-20T09:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.351231 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deab583c-05c7-4b7e-a3f6-c01081b17127\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3256d5e9aadc1c300c0388ad5b9682b874dd957f731bde6642c649b044b16ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"amed_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771581313\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771581313\\\\\\\\\\\\\\\" (2026-02-20 08:55:12 +0000 UTC to 2027-02-20 08:55:12 +0000 UTC (now=2026-02-20 09:55:28.067421835 +0000 UTC))\\\\\\\"\\\\nI0220 09:55:28.067484 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0220 09:55:28.067516 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0220 09:55:28.067549 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067613 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0220 09:55:28.067655 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-161790428/tls.crt::/tmp/serving-cert-161790428/tls.key\\\\\\\"\\\\nI0220 09:55:28.067781 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0220 09:55:28.067827 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067825 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0220 09:55:28.067865 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0220 09:55:28.067840 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0220 09:55:28.067995 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0220 
09:55:28.068006 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF0220 09:55:28.068582 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:39Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.375193 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0820106bbb2993145f688614128fa51c5778e9953ee5fc00edba04ff06f3f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:39Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.397089 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:39Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.418262 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-htkbf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8526746c-450b-4df8-8ea1-f0cbabd13894\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c50de88e0429b7ecb3939db90dc49ca006cd7d071d9cc97beb31ca64028b9f00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx5m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d5caed5e5e9e66911552cd6f1b7482cc842f5fc1b59863a208fe32ea87303d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tx5m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-htkbf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:39Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.438279 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.438448 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.438475 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.438564 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.438660 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:39Z","lastTransitionTime":"2026-02-20T09:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.442896 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:39Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.463810 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s8xxr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a431054f-57c5-41b7-93b2-2d2fbf9949ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15f0d587b054c3050286b2607687a4066cdd413e1623326cf355f9d93b10a2cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9fz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s8xxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:39Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.482458 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hxb97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f0e53ce-e004-473e-be85-ef4c83e399c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e690a224056186e38fa1d7f83d5380f36a05bfff839ca53a87701a721c0c7c0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c27br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hxb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:39Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.503170 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"751d5e0b-919c-4777-8475-ed7214f7647f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6a54fc4a37387541acda22f8ca81ddfe54e7a9ea682a31abf5e1ba2787f2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dffe20e45069ebd85e9ad49e365ac5180c1e43d02340190615a38900f527e432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m9d46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:39Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.526416 4962 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de460dbc-9080-4eb3-b509-c43e81162de4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5051fa1280c3183976930435fb2b839acf93914df2c1e93cb28ae17755d534ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f4690722fa343ac9bbe83f1591ce49040d2b6cbe966fc31757944ba3befbbce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc5a98df99b3ac300355de8efa6cafb2182a2460152990a89b4125a7aef7b850\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9264e0c637c9449882f25caf523
f00112c372fad6125a2b4b1b541a515a897dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:39Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.543650 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.544171 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.544366 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.544632 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.544804 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:39Z","lastTransitionTime":"2026-02-20T09:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.546043 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"096c4ebd-ac7b-45f6-abfa-5d54e4bce009\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faa1dbe5648fbe736a165150e168243abc4486420bf78e560c86ec9cc6a608c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4192cbbb73f2b6d9657f2c58899df4c139b42d17fc4042cef80a7b9b05c5ef26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4192cbbb73f2b6d9657f2c58899df4c139b42d17fc4042cef80a7b9b05c5ef26\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:39Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.570897 4962 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:39Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.626526 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef72d73c-d177-4436-b681-83866e1f6d12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfad5fd92783e0af12f28bd81ccc67f1cf757d57723d98f8fea4f02dc0fea8b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7hj8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:39Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.647764 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.648248 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:39 crc 
kubenswrapper[4962]: I0220 09:56:39.648363 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.648503 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.648646 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:39Z","lastTransitionTime":"2026-02-20T09:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.751494 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.751570 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.751590 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.751645 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.751666 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:39Z","lastTransitionTime":"2026-02-20T09:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.855629 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.855692 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.855711 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.855738 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.855758 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:39Z","lastTransitionTime":"2026-02-20T09:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.959022 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.959077 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.959098 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.959128 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:39 crc kubenswrapper[4962]: I0220 09:56:39.959151 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:39Z","lastTransitionTime":"2026-02-20T09:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.062776 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.062872 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.062916 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.062950 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.062976 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:40Z","lastTransitionTime":"2026-02-20T09:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.115811 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 00:54:23.86024088 +0000 UTC Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.138202 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.138263 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:56:40 crc kubenswrapper[4962]: E0220 09:56:40.138387 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.138426 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:56:40 crc kubenswrapper[4962]: E0220 09:56:40.138664 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.138926 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:56:40 crc kubenswrapper[4962]: E0220 09:56:40.139175 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76" Feb 20 09:56:40 crc kubenswrapper[4962]: E0220 09:56:40.139489 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.166021 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.166089 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.166112 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.166138 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.166158 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:40Z","lastTransitionTime":"2026-02-20T09:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.273105 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.273201 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.273229 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.273269 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.273307 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:40Z","lastTransitionTime":"2026-02-20T09:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.378933 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.379026 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.379053 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.379092 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.379119 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:40Z","lastTransitionTime":"2026-02-20T09:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.483152 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.483242 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.483263 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.483293 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.483316 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:40Z","lastTransitionTime":"2026-02-20T09:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.586701 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.586760 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.586780 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.586807 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.586828 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:40Z","lastTransitionTime":"2026-02-20T09:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.690361 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.690434 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.690457 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.690490 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.690513 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:40Z","lastTransitionTime":"2026-02-20T09:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.794149 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.794228 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.794240 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.794277 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.794291 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:40Z","lastTransitionTime":"2026-02-20T09:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.897793 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.897856 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.897874 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.897901 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:40 crc kubenswrapper[4962]: I0220 09:56:40.897921 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:40Z","lastTransitionTime":"2026-02-20T09:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.001155 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.001264 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.001292 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.001327 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.001356 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:41Z","lastTransitionTime":"2026-02-20T09:56:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.104908 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.104983 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.105003 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.105032 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.105052 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:41Z","lastTransitionTime":"2026-02-20T09:56:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.116366 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 01:30:47.69192122 +0000 UTC Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.208467 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.208544 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.208558 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.208584 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.208621 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:41Z","lastTransitionTime":"2026-02-20T09:56:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.312320 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.312390 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.312405 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.312429 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.312444 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:41Z","lastTransitionTime":"2026-02-20T09:56:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.415656 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.415760 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.415788 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.415822 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.415847 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:41Z","lastTransitionTime":"2026-02-20T09:56:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.519370 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.519476 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.519490 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.519509 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.519526 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:41Z","lastTransitionTime":"2026-02-20T09:56:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.622272 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.622325 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.622343 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.622369 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.622389 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:41Z","lastTransitionTime":"2026-02-20T09:56:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.725165 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.725243 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.725264 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.725296 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.725315 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:41Z","lastTransitionTime":"2026-02-20T09:56:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.829341 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.829423 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.829449 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.829485 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.829509 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:41Z","lastTransitionTime":"2026-02-20T09:56:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.933665 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.933721 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.933740 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.933768 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:41 crc kubenswrapper[4962]: I0220 09:56:41.933793 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:41Z","lastTransitionTime":"2026-02-20T09:56:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.036701 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.036768 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.036803 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.036842 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.036866 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:42Z","lastTransitionTime":"2026-02-20T09:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.116868 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 16:05:39.426782801 +0000 UTC Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.138245 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.138271 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.138245 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:56:42 crc kubenswrapper[4962]: E0220 09:56:42.138427 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.138761 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:56:42 crc kubenswrapper[4962]: E0220 09:56:42.138799 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:56:42 crc kubenswrapper[4962]: E0220 09:56:42.138877 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76" Feb 20 09:56:42 crc kubenswrapper[4962]: E0220 09:56:42.139014 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.140284 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.140315 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.140330 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.140347 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.140362 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:42Z","lastTransitionTime":"2026-02-20T09:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.243135 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.243225 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.243260 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.243290 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.243317 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:42Z","lastTransitionTime":"2026-02-20T09:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.346509 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.346570 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.346587 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.346676 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.346702 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:42Z","lastTransitionTime":"2026-02-20T09:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.450740 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.450813 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.450837 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.450867 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.450891 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:42Z","lastTransitionTime":"2026-02-20T09:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.558906 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.559044 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.559122 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.559216 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.559250 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:42Z","lastTransitionTime":"2026-02-20T09:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.662766 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.662833 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.662850 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.662874 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.662892 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:42Z","lastTransitionTime":"2026-02-20T09:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.766251 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.766339 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.766359 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.766405 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.766424 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:42Z","lastTransitionTime":"2026-02-20T09:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.870191 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.870249 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.870273 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.870299 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.870320 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:42Z","lastTransitionTime":"2026-02-20T09:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.980736 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.980782 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.980793 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.980816 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:42 crc kubenswrapper[4962]: I0220 09:56:42.980826 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:42Z","lastTransitionTime":"2026-02-20T09:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:43 crc kubenswrapper[4962]: I0220 09:56:43.085499 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:43 crc kubenswrapper[4962]: I0220 09:56:43.085551 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:43 crc kubenswrapper[4962]: I0220 09:56:43.085562 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:43 crc kubenswrapper[4962]: I0220 09:56:43.085583 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:43 crc kubenswrapper[4962]: I0220 09:56:43.085619 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:43Z","lastTransitionTime":"2026-02-20T09:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:43 crc kubenswrapper[4962]: I0220 09:56:43.117872 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 19:22:36.739328911 +0000 UTC Feb 20 09:56:43 crc kubenswrapper[4962]: I0220 09:56:43.140557 4962 scope.go:117] "RemoveContainer" containerID="0ad31dcb6721b49faaa0b2df2afa2f3d990f410ad4c80b3b5ebb18c428603add" Feb 20 09:56:43 crc kubenswrapper[4962]: E0220 09:56:43.140756 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-99b2s_openshift-ovn-kubernetes(2abd2b70-bb78-49a0-b930-cd066384e803)\"" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" Feb 20 09:56:43 crc kubenswrapper[4962]: I0220 09:56:43.188629 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:43 crc kubenswrapper[4962]: I0220 09:56:43.188793 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:43 crc kubenswrapper[4962]: I0220 09:56:43.188861 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:43 crc kubenswrapper[4962]: I0220 09:56:43.188890 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:43 crc kubenswrapper[4962]: I0220 09:56:43.188909 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:43Z","lastTransitionTime":"2026-02-20T09:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:43 crc kubenswrapper[4962]: I0220 09:56:43.292802 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:43 crc kubenswrapper[4962]: I0220 09:56:43.292874 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:43 crc kubenswrapper[4962]: I0220 09:56:43.292894 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:43 crc kubenswrapper[4962]: I0220 09:56:43.292927 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:43 crc kubenswrapper[4962]: I0220 09:56:43.292952 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:43Z","lastTransitionTime":"2026-02-20T09:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:43 crc kubenswrapper[4962]: I0220 09:56:43.396655 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:43 crc kubenswrapper[4962]: I0220 09:56:43.396711 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:43 crc kubenswrapper[4962]: I0220 09:56:43.396728 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:43 crc kubenswrapper[4962]: I0220 09:56:43.396753 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:43 crc kubenswrapper[4962]: I0220 09:56:43.396772 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:43Z","lastTransitionTime":"2026-02-20T09:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:43 crc kubenswrapper[4962]: I0220 09:56:43.499973 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:43 crc kubenswrapper[4962]: I0220 09:56:43.500031 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:43 crc kubenswrapper[4962]: I0220 09:56:43.500048 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:43 crc kubenswrapper[4962]: I0220 09:56:43.500069 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:43 crc kubenswrapper[4962]: I0220 09:56:43.500086 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:43Z","lastTransitionTime":"2026-02-20T09:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:43 crc kubenswrapper[4962]: I0220 09:56:43.603721 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:43 crc kubenswrapper[4962]: I0220 09:56:43.604348 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:43 crc kubenswrapper[4962]: I0220 09:56:43.604539 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:43 crc kubenswrapper[4962]: I0220 09:56:43.604755 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:43 crc kubenswrapper[4962]: I0220 09:56:43.604904 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:43Z","lastTransitionTime":"2026-02-20T09:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:43 crc kubenswrapper[4962]: I0220 09:56:43.708155 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:43 crc kubenswrapper[4962]: I0220 09:56:43.708217 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:43 crc kubenswrapper[4962]: I0220 09:56:43.708235 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:43 crc kubenswrapper[4962]: I0220 09:56:43.708260 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:43 crc kubenswrapper[4962]: I0220 09:56:43.708277 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:43Z","lastTransitionTime":"2026-02-20T09:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:43 crc kubenswrapper[4962]: I0220 09:56:43.811181 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:43 crc kubenswrapper[4962]: I0220 09:56:43.811268 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:43 crc kubenswrapper[4962]: I0220 09:56:43.811298 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:43 crc kubenswrapper[4962]: I0220 09:56:43.811335 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:43 crc kubenswrapper[4962]: I0220 09:56:43.811362 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:43Z","lastTransitionTime":"2026-02-20T09:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:43 crc kubenswrapper[4962]: I0220 09:56:43.915123 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:43 crc kubenswrapper[4962]: I0220 09:56:43.915553 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:43 crc kubenswrapper[4962]: I0220 09:56:43.915763 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:43 crc kubenswrapper[4962]: I0220 09:56:43.915944 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:43 crc kubenswrapper[4962]: I0220 09:56:43.916091 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:43Z","lastTransitionTime":"2026-02-20T09:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.020736 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.020796 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.020813 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.020840 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.020858 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:44Z","lastTransitionTime":"2026-02-20T09:56:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.118303 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 19:42:01.494335449 +0000 UTC Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.125368 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.125422 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.125445 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.125477 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.125529 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:44Z","lastTransitionTime":"2026-02-20T09:56:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.138853 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.138928 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:56:44 crc kubenswrapper[4962]: E0220 09:56:44.139029 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.139027 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.138867 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:56:44 crc kubenswrapper[4962]: E0220 09:56:44.139238 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:56:44 crc kubenswrapper[4962]: E0220 09:56:44.139322 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:56:44 crc kubenswrapper[4962]: E0220 09:56:44.139437 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76" Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.228978 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.229416 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.229559 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.229760 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.229906 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:44Z","lastTransitionTime":"2026-02-20T09:56:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.333748 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.333808 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.333828 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.333855 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.333875 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:44Z","lastTransitionTime":"2026-02-20T09:56:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.437678 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.437751 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.437774 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.437804 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.437824 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:44Z","lastTransitionTime":"2026-02-20T09:56:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.541885 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.541953 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.541972 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.542005 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.542026 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:44Z","lastTransitionTime":"2026-02-20T09:56:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.645815 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.645899 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.645923 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.645956 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.645979 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:44Z","lastTransitionTime":"2026-02-20T09:56:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.749706 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.749750 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.749761 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.749780 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.749789 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:44Z","lastTransitionTime":"2026-02-20T09:56:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.853689 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.853792 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.853813 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.853842 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.853859 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:44Z","lastTransitionTime":"2026-02-20T09:56:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.957940 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.957989 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.958003 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.958032 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:44 crc kubenswrapper[4962]: I0220 09:56:44.958044 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:44Z","lastTransitionTime":"2026-02-20T09:56:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.061874 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.061958 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.061980 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.062015 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.062040 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:45Z","lastTransitionTime":"2026-02-20T09:56:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.119177 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 12:33:37.995070001 +0000 UTC Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.165644 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.166052 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.166217 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.166383 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.166529 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:45Z","lastTransitionTime":"2026-02-20T09:56:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.269461 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.269547 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.269569 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.269650 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.269677 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:45Z","lastTransitionTime":"2026-02-20T09:56:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.373649 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.373712 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.373733 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.373761 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.373781 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:45Z","lastTransitionTime":"2026-02-20T09:56:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.477234 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.477323 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.477350 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.477383 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.477405 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:45Z","lastTransitionTime":"2026-02-20T09:56:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.580873 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.580947 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.580968 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.580996 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.581015 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:45Z","lastTransitionTime":"2026-02-20T09:56:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.684544 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.684653 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.684673 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.684708 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.684732 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:45Z","lastTransitionTime":"2026-02-20T09:56:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.788427 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.788475 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.788489 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.788509 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.788521 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:45Z","lastTransitionTime":"2026-02-20T09:56:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.893089 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.893146 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.893165 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.893213 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.893233 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:45Z","lastTransitionTime":"2026-02-20T09:56:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.998384 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.998467 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.998484 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.998509 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:45 crc kubenswrapper[4962]: I0220 09:56:45.998528 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:45Z","lastTransitionTime":"2026-02-20T09:56:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.102663 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.102735 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.102757 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.102787 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.102807 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:46Z","lastTransitionTime":"2026-02-20T09:56:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.119326 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 01:59:11.572428717 +0000 UTC Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.138133 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.138176 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.138218 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:56:46 crc kubenswrapper[4962]: E0220 09:56:46.138956 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.139298 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:56:46 crc kubenswrapper[4962]: E0220 09:56:46.139409 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:56:46 crc kubenswrapper[4962]: E0220 09:56:46.139674 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:56:46 crc kubenswrapper[4962]: E0220 09:56:46.140000 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.206688 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.206746 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.206764 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.206790 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.206811 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:46Z","lastTransitionTime":"2026-02-20T09:56:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.309731 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.309809 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.309831 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.309860 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.309882 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:46Z","lastTransitionTime":"2026-02-20T09:56:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.413721 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.413791 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.413809 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.413839 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.413865 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:46Z","lastTransitionTime":"2026-02-20T09:56:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.458064 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.458177 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.458200 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.458233 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.458260 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:46Z","lastTransitionTime":"2026-02-20T09:56:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:46 crc kubenswrapper[4962]: E0220 09:56:46.482826 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3742e1d0-bedb-4f62-b44c-22d7df4a090f\\\",\\\"systemUUID\\\":\\\"0de0f937-0896-4e78-90b7-d2d7a1bed2a9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:46Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.490457 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.490525 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.490545 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.490574 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.490642 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:46Z","lastTransitionTime":"2026-02-20T09:56:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:46 crc kubenswrapper[4962]: E0220 09:56:46.513673 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3742e1d0-bedb-4f62-b44c-22d7df4a090f\\\",\\\"systemUUID\\\":\\\"0de0f937-0896-4e78-90b7-d2d7a1bed2a9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:46Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.520316 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.520395 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.520418 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.520450 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.520469 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:46Z","lastTransitionTime":"2026-02-20T09:56:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:46 crc kubenswrapper[4962]: E0220 09:56:46.542638 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3742e1d0-bedb-4f62-b44c-22d7df4a090f\\\",\\\"systemUUID\\\":\\\"0de0f937-0896-4e78-90b7-d2d7a1bed2a9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:46Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.548159 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.548226 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.548253 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.548286 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.548310 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:46Z","lastTransitionTime":"2026-02-20T09:56:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:46 crc kubenswrapper[4962]: E0220 09:56:46.570069 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3742e1d0-bedb-4f62-b44c-22d7df4a090f\\\",\\\"systemUUID\\\":\\\"0de0f937-0896-4e78-90b7-d2d7a1bed2a9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:46Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.577049 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.577109 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.577127 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.577152 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.577171 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:46Z","lastTransitionTime":"2026-02-20T09:56:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:46 crc kubenswrapper[4962]: E0220 09:56:46.601294 4962 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T09:56:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T09:56:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3742e1d0-bedb-4f62-b44c-22d7df4a090f\\\",\\\"systemUUID\\\":\\\"0de0f937-0896-4e78-90b7-d2d7a1bed2a9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:46Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:46 crc kubenswrapper[4962]: E0220 09:56:46.601636 4962 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.604485 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.604562 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.604630 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.604665 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.604691 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:46Z","lastTransitionTime":"2026-02-20T09:56:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.708196 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.708262 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.708289 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.708317 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.708339 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:46Z","lastTransitionTime":"2026-02-20T09:56:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.811226 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.811298 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.811318 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.811341 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.811358 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:46Z","lastTransitionTime":"2026-02-20T09:56:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.914674 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.914741 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.914763 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.914789 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:46 crc kubenswrapper[4962]: I0220 09:56:46.914810 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:46Z","lastTransitionTime":"2026-02-20T09:56:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.017693 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.017750 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.017769 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.017791 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.017811 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:47Z","lastTransitionTime":"2026-02-20T09:56:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.119528 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-25 12:33:48.305472184 +0000 UTC Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.120387 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.120451 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.120475 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.120498 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.120520 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:47Z","lastTransitionTime":"2026-02-20T09:56:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.224234 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.224334 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.224351 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.224379 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.224398 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:47Z","lastTransitionTime":"2026-02-20T09:56:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.327054 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.327144 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.327163 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.327192 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.327214 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:47Z","lastTransitionTime":"2026-02-20T09:56:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.430281 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.430354 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.430377 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.430405 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.430425 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:47Z","lastTransitionTime":"2026-02-20T09:56:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.533305 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.533386 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.533410 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.533442 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.533463 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:47Z","lastTransitionTime":"2026-02-20T09:56:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.636405 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.636443 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.636455 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.636474 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.636485 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:47Z","lastTransitionTime":"2026-02-20T09:56:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.751227 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d590527b-ed56-4fb4-a712-b09781618a76-metrics-certs\") pod \"network-metrics-daemon-5bwk2\" (UID: \"d590527b-ed56-4fb4-a712-b09781618a76\") " pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:56:47 crc kubenswrapper[4962]: E0220 09:56:47.751542 4962 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 20 09:56:47 crc kubenswrapper[4962]: E0220 09:56:47.751667 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d590527b-ed56-4fb4-a712-b09781618a76-metrics-certs podName:d590527b-ed56-4fb4-a712-b09781618a76 nodeName:}" failed. No retries permitted until 2026-02-20 09:57:51.751630788 +0000 UTC m=+163.334102674 (durationBeforeRetry 1m4s). 
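
The "No retries permitted until ... (durationBeforeRetry 1m4s)" entry above reflects the kubelet's exponential backoff for failed volume operations. Assuming the upstream defaults (initial delay 500ms, doubling per consecutive failure, capped at roughly 2m2s), a 64s delay corresponds to the eighth consecutive failure for this volume: 0.5s * 2^7 = 64s. A small Go sketch of that doubling schedule, with the constants labeled as assumptions:

// volume_backoff_sketch.go: illustrative only; the initial/max values below are
// assumed to match upstream kubelet defaults, not read from this cluster.
package main

import (
	"fmt"
	"time"
)

func main() {
	const (
		initial  = 500 * time.Millisecond      // assumed default initial delay
		maxDelay = 2*time.Minute + 2*time.Second // assumed default cap
	)
	d := initial
	for failure := 1; failure <= 10; failure++ {
		fmt.Printf("after consecutive failure %d: wait %v\n", failure, d)
		d *= 2
		if d > maxDelay {
			d = maxDelay
		}
	}
}
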
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d590527b-ed56-4fb4-a712-b09781618a76-metrics-certs") pod "network-metrics-daemon-5bwk2" (UID: "d590527b-ed56-4fb4-a712-b09781618a76") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.751663 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.751718 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.751744 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.751778 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.751801 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:47Z","lastTransitionTime":"2026-02-20T09:56:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.854469 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.854512 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.854525 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.854545 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.854556 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:47Z","lastTransitionTime":"2026-02-20T09:56:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.957405 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.957484 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.957510 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.957543 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:47 crc kubenswrapper[4962]: I0220 09:56:47.957565 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:47Z","lastTransitionTime":"2026-02-20T09:56:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.060909 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.060972 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.060991 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.061017 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.061036 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:48Z","lastTransitionTime":"2026-02-20T09:56:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.120417 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 01:21:49.637571447 +0000 UTC Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.138827 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.138876 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.138983 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.138833 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:56:48 crc kubenswrapper[4962]: E0220 09:56:48.139018 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:56:48 crc kubenswrapper[4962]: E0220 09:56:48.139095 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:56:48 crc kubenswrapper[4962]: E0220 09:56:48.139173 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76" Feb 20 09:56:48 crc kubenswrapper[4962]: E0220 09:56:48.139279 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.164275 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.164307 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.164319 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.164336 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.164352 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:48Z","lastTransitionTime":"2026-02-20T09:56:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.266622 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.266673 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.266689 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.266711 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.266725 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:48Z","lastTransitionTime":"2026-02-20T09:56:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.369969 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.370027 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.370038 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.370058 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.370072 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:48Z","lastTransitionTime":"2026-02-20T09:56:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.472795 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.472866 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.472890 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.473016 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.473099 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:48Z","lastTransitionTime":"2026-02-20T09:56:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.577100 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.577171 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.577190 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.577222 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.577240 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:48Z","lastTransitionTime":"2026-02-20T09:56:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.680664 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.680721 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.680740 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.680761 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.680774 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:48Z","lastTransitionTime":"2026-02-20T09:56:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.783644 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.783702 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.783720 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.783742 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.783757 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:48Z","lastTransitionTime":"2026-02-20T09:56:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.887884 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.887949 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.887964 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.887994 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.888011 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:48Z","lastTransitionTime":"2026-02-20T09:56:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.991101 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.991133 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.991143 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.991159 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:48 crc kubenswrapper[4962]: I0220 09:56:48.991170 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:48Z","lastTransitionTime":"2026-02-20T09:56:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.094067 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.094113 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.094132 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.094153 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.094167 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:49Z","lastTransitionTime":"2026-02-20T09:56:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.121658 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-26 13:49:55.608593091 +0000 UTC Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.158367 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef72d73c-d177-4436-b681-83866e1f6d12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfad5fd92783e0af12f28bd81ccc67f1cf757d57723d98f8fea4f02dc0fea8b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7062fa53f9b3d49fbb98b733d3408979b188cd2a6dca2b91c4d87e03ab45e966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secre
ts/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0829e26098d1deffed3da668516f6de46ddedfdc4898f8fc7489890def40bb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06fdabb296d1d3c66c516caca94bf3525a22356c4d0bd14c6d68c25096fa933f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a75fbf93f5205ba53e36c21f01b6dcdc1fab8fd4f8d9d251bf52ba8acf0d211\\\",\\\"exit
Code\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baff076c7da4f6a97e04b8c7d4af792334ef66574c53d712d62da6e47bb6aa84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db45da296dc56ea1f7508e2cf379395f550dcca974c8762a787c81da1fb475f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T09:55:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T09:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66n7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7hj8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-02-20T09:56:49Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.171017 4962 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"751d5e0b-919c-4777-8475-ed7214f7647f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T09:55:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b6a54fc4a37387541acda22f8ca81ddfe54e7a9ea682a31abf5e1ba2787f2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dffe20e45069ebd85e9ad49e365ac5180c1e43d02340190615a38900f527e432\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T09:55:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzq9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T09:55:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-m9d46\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T09:56:49Z is after 2025-08-24T17:21:41Z" Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.197406 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.197462 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.197477 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.197496 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.197511 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:49Z","lastTransitionTime":"2026-02-20T09:56:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.207635 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=75.207619855 podStartE2EDuration="1m15.207619855s" podCreationTimestamp="2026-02-20 09:55:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:56:49.206065827 +0000 UTC m=+100.788537693" watchObservedRunningTime="2026-02-20 09:56:49.207619855 +0000 UTC m=+100.790091701" Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.223617 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=26.223585945 podStartE2EDuration="26.223585945s" podCreationTimestamp="2026-02-20 09:56:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:56:49.223331437 +0000 UTC m=+100.805803283" watchObservedRunningTime="2026-02-20 09:56:49.223585945 +0000 UTC m=+100.806057791" Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.282027 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=50.281998697 podStartE2EDuration="50.281998697s" podCreationTimestamp="2026-02-20 09:55:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:56:49.265873643 +0000 UTC m=+100.848345499" watchObservedRunningTime="2026-02-20 09:56:49.281998697 +0000 UTC m=+100.864470543" Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.301078 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.301123 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.301138 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.301158 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.301173 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:49Z","lastTransitionTime":"2026-02-20T09:56:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.369562 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-wqwgj" podStartSLOduration=80.369544894 podStartE2EDuration="1m20.369544894s" podCreationTimestamp="2026-02-20 09:55:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:56:49.347031764 +0000 UTC m=+100.929503610" watchObservedRunningTime="2026-02-20 09:56:49.369544894 +0000 UTC m=+100.952016740" Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.394669 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=77.394646625 podStartE2EDuration="1m17.394646625s" podCreationTimestamp="2026-02-20 09:55:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:56:49.393457808 +0000 UTC m=+100.975929654" watchObservedRunningTime="2026-02-20 09:56:49.394646625 +0000 UTC m=+100.977118471" Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.403640 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.403693 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.403710 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.403731 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.403744 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:49Z","lastTransitionTime":"2026-02-20T09:56:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.412361 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=81.412341437 podStartE2EDuration="1m21.412341437s" podCreationTimestamp="2026-02-20 09:55:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:56:49.411520733 +0000 UTC m=+100.993992579" watchObservedRunningTime="2026-02-20 09:56:49.412341437 +0000 UTC m=+100.994813283" Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.442762 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-htkbf" podStartSLOduration=79.442737821 podStartE2EDuration="1m19.442737821s" podCreationTimestamp="2026-02-20 09:55:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:56:49.441286836 +0000 UTC m=+101.023758702" watchObservedRunningTime="2026-02-20 09:56:49.442737821 +0000 UTC m=+101.025209687" Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.470860 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-s8xxr" podStartSLOduration=80.470835502 podStartE2EDuration="1m20.470835502s" podCreationTimestamp="2026-02-20 09:55:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:56:49.470483472 +0000 UTC m=+101.052955318" watchObservedRunningTime="2026-02-20 09:56:49.470835502 +0000 UTC m=+101.053307358" Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.482220 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-hxb97" podStartSLOduration=80.482201181 podStartE2EDuration="1m20.482201181s" podCreationTimestamp="2026-02-20 09:55:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:56:49.481341416 +0000 UTC m=+101.063813262" watchObservedRunningTime="2026-02-20 09:56:49.482201181 +0000 UTC m=+101.064673027" Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.506346 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.506387 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.506397 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.506414 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.506425 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:49Z","lastTransitionTime":"2026-02-20T09:56:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.608690 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.608729 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.608738 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.608754 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.608767 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:49Z","lastTransitionTime":"2026-02-20T09:56:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.711060 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.711097 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.711111 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.711129 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.711143 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:49Z","lastTransitionTime":"2026-02-20T09:56:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.813496 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.814001 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.814185 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.814346 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.814497 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:49Z","lastTransitionTime":"2026-02-20T09:56:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.917725 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.917769 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.917779 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.917800 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:49 crc kubenswrapper[4962]: I0220 09:56:49.917813 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:49Z","lastTransitionTime":"2026-02-20T09:56:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:50 crc kubenswrapper[4962]: I0220 09:56:50.200038 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:56:50 crc kubenswrapper[4962]: I0220 09:56:50.200032 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 18:55:41.369249611 +0000 UTC Feb 20 09:56:50 crc kubenswrapper[4962]: I0220 09:56:50.200157 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:56:50 crc kubenswrapper[4962]: E0220 09:56:50.200392 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76" Feb 20 09:56:50 crc kubenswrapper[4962]: E0220 09:56:50.200475 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:56:50 crc kubenswrapper[4962]: I0220 09:56:50.200703 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:56:50 crc kubenswrapper[4962]: E0220 09:56:50.200834 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:56:50 crc kubenswrapper[4962]: I0220 09:56:50.201550 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:56:50 crc kubenswrapper[4962]: E0220 09:56:50.201917 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:56:50 crc kubenswrapper[4962]: I0220 09:56:50.202517 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:50 crc kubenswrapper[4962]: I0220 09:56:50.202550 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:50 crc kubenswrapper[4962]: I0220 09:56:50.202560 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:50 crc kubenswrapper[4962]: I0220 09:56:50.202581 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:50 crc kubenswrapper[4962]: I0220 09:56:50.202608 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:50Z","lastTransitionTime":"2026-02-20T09:56:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:50 crc kubenswrapper[4962]: I0220 09:56:50.306691 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:50 crc kubenswrapper[4962]: I0220 09:56:50.306767 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:50 crc kubenswrapper[4962]: I0220 09:56:50.306788 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:50 crc kubenswrapper[4962]: I0220 09:56:50.306834 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:50 crc kubenswrapper[4962]: I0220 09:56:50.306856 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:50Z","lastTransitionTime":"2026-02-20T09:56:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:50 crc kubenswrapper[4962]: I0220 09:56:50.410245 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:50 crc kubenswrapper[4962]: I0220 09:56:50.410353 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:50 crc kubenswrapper[4962]: I0220 09:56:50.410373 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:50 crc kubenswrapper[4962]: I0220 09:56:50.410400 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:50 crc kubenswrapper[4962]: I0220 09:56:50.410424 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:50Z","lastTransitionTime":"2026-02-20T09:56:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:50 crc kubenswrapper[4962]: I0220 09:56:50.513715 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:50 crc kubenswrapper[4962]: I0220 09:56:50.513771 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:50 crc kubenswrapper[4962]: I0220 09:56:50.513781 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:50 crc kubenswrapper[4962]: I0220 09:56:50.513822 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:50 crc kubenswrapper[4962]: I0220 09:56:50.513838 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:50Z","lastTransitionTime":"2026-02-20T09:56:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:50 crc kubenswrapper[4962]: I0220 09:56:50.617271 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:50 crc kubenswrapper[4962]: I0220 09:56:50.617352 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:50 crc kubenswrapper[4962]: I0220 09:56:50.617367 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:50 crc kubenswrapper[4962]: I0220 09:56:50.617388 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:50 crc kubenswrapper[4962]: I0220 09:56:50.617399 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:50Z","lastTransitionTime":"2026-02-20T09:56:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:50 crc kubenswrapper[4962]: I0220 09:56:50.720305 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:50 crc kubenswrapper[4962]: I0220 09:56:50.720392 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:50 crc kubenswrapper[4962]: I0220 09:56:50.720417 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:50 crc kubenswrapper[4962]: I0220 09:56:50.720447 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:50 crc kubenswrapper[4962]: I0220 09:56:50.720470 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:50Z","lastTransitionTime":"2026-02-20T09:56:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:50 crc kubenswrapper[4962]: I0220 09:56:50.825457 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:50 crc kubenswrapper[4962]: I0220 09:56:50.825549 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:50 crc kubenswrapper[4962]: I0220 09:56:50.825636 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:50 crc kubenswrapper[4962]: I0220 09:56:50.825677 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:50 crc kubenswrapper[4962]: I0220 09:56:50.825702 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:50Z","lastTransitionTime":"2026-02-20T09:56:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:50 crc kubenswrapper[4962]: I0220 09:56:50.928878 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:50 crc kubenswrapper[4962]: I0220 09:56:50.928970 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:50 crc kubenswrapper[4962]: I0220 09:56:50.928993 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:50 crc kubenswrapper[4962]: I0220 09:56:50.929025 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:50 crc kubenswrapper[4962]: I0220 09:56:50.929054 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:50Z","lastTransitionTime":"2026-02-20T09:56:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.031457 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.031511 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.031520 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.031538 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.031548 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:51Z","lastTransitionTime":"2026-02-20T09:56:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.134109 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.134154 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.134166 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.134186 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.134199 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:51Z","lastTransitionTime":"2026-02-20T09:56:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.201254 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 02:11:05.588673063 +0000 UTC Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.236339 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.236405 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.236429 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.236460 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.236490 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:51Z","lastTransitionTime":"2026-02-20T09:56:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.339917 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.339974 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.339992 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.340013 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.340026 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:51Z","lastTransitionTime":"2026-02-20T09:56:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.443334 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.443392 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.443405 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.443426 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.443439 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:51Z","lastTransitionTime":"2026-02-20T09:56:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.546671 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.546743 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.546755 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.546799 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.546811 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:51Z","lastTransitionTime":"2026-02-20T09:56:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.649979 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.650024 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.650037 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.650058 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.650073 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:51Z","lastTransitionTime":"2026-02-20T09:56:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.752873 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.752915 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.752924 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.752939 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.752947 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:51Z","lastTransitionTime":"2026-02-20T09:56:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.856254 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.856307 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.856321 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.856344 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.856359 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:51Z","lastTransitionTime":"2026-02-20T09:56:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.958951 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.959006 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.959020 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.959041 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:51 crc kubenswrapper[4962]: I0220 09:56:51.959057 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:51Z","lastTransitionTime":"2026-02-20T09:56:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.061739 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.061798 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.061823 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.061854 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.061875 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:52Z","lastTransitionTime":"2026-02-20T09:56:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.138796 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.138844 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.138917 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.138804 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:56:52 crc kubenswrapper[4962]: E0220 09:56:52.139023 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:56:52 crc kubenswrapper[4962]: E0220 09:56:52.139137 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76" Feb 20 09:56:52 crc kubenswrapper[4962]: E0220 09:56:52.139282 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:56:52 crc kubenswrapper[4962]: E0220 09:56:52.139395 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.164452 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.164491 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.164501 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.164516 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.164526 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:52Z","lastTransitionTime":"2026-02-20T09:56:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.202026 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 08:54:31.973931217 +0000 UTC Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.268527 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.268640 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.268662 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.268694 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.268715 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:52Z","lastTransitionTime":"2026-02-20T09:56:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.372034 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.372113 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.372134 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.372163 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.372183 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:52Z","lastTransitionTime":"2026-02-20T09:56:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.476659 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.476736 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.476791 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.476820 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.476838 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:52Z","lastTransitionTime":"2026-02-20T09:56:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.585315 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.585931 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.586413 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.586459 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.586488 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:52Z","lastTransitionTime":"2026-02-20T09:56:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.691455 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.691546 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.691573 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.691648 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.691676 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:52Z","lastTransitionTime":"2026-02-20T09:56:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.796165 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.796247 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.796271 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.796317 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.796339 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:52Z","lastTransitionTime":"2026-02-20T09:56:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.899969 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.900049 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.900068 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.900095 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:52 crc kubenswrapper[4962]: I0220 09:56:52.900114 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:52Z","lastTransitionTime":"2026-02-20T09:56:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.003659 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.003719 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.003736 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.003757 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.003774 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:53Z","lastTransitionTime":"2026-02-20T09:56:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.107670 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.107761 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.107782 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.107812 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.107833 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:53Z","lastTransitionTime":"2026-02-20T09:56:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.203030 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-16 01:31:12.55139107 +0000 UTC Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.210752 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.210811 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.210828 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.210852 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.210864 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:53Z","lastTransitionTime":"2026-02-20T09:56:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.314822 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.314893 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.314911 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.314940 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.314958 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:53Z","lastTransitionTime":"2026-02-20T09:56:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.419543 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.419645 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.419675 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.419707 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.419729 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:53Z","lastTransitionTime":"2026-02-20T09:56:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.522766 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.522857 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.522877 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.522908 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.522932 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:53Z","lastTransitionTime":"2026-02-20T09:56:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.627332 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.627401 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.627419 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.627446 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.627464 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:53Z","lastTransitionTime":"2026-02-20T09:56:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.731923 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.732002 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.732025 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.732053 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.732078 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:53Z","lastTransitionTime":"2026-02-20T09:56:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.835611 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.835661 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.835671 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.835688 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.835699 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:53Z","lastTransitionTime":"2026-02-20T09:56:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.939095 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.939171 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.939190 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.939222 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:53 crc kubenswrapper[4962]: I0220 09:56:53.939246 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:53Z","lastTransitionTime":"2026-02-20T09:56:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.042059 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.042114 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.042129 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.042148 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.042163 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:54Z","lastTransitionTime":"2026-02-20T09:56:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.138353 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.138472 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.138354 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.138424 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:56:54 crc kubenswrapper[4962]: E0220 09:56:54.138782 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:56:54 crc kubenswrapper[4962]: E0220 09:56:54.138962 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:56:54 crc kubenswrapper[4962]: E0220 09:56:54.139155 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76" Feb 20 09:56:54 crc kubenswrapper[4962]: E0220 09:56:54.139320 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.145558 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.145641 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.145658 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.145685 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.145710 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:54Z","lastTransitionTime":"2026-02-20T09:56:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.204059 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 00:17:40.592174691 +0000 UTC Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.248532 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.248623 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.248643 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.248668 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.248688 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:54Z","lastTransitionTime":"2026-02-20T09:56:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.350916 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.350999 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.351023 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.351056 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.351074 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:54Z","lastTransitionTime":"2026-02-20T09:56:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.453989 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.454044 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.454056 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.454078 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.454090 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:54Z","lastTransitionTime":"2026-02-20T09:56:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.556332 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.556387 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.556402 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.556427 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.556444 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:54Z","lastTransitionTime":"2026-02-20T09:56:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.658561 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.658615 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.658627 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.658644 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.658655 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:54Z","lastTransitionTime":"2026-02-20T09:56:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.761880 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.761918 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.761928 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.761945 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.761955 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:54Z","lastTransitionTime":"2026-02-20T09:56:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.864253 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.864446 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.864459 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.864476 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.864487 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:54Z","lastTransitionTime":"2026-02-20T09:56:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.967187 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.967233 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.967242 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.967281 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:54 crc kubenswrapper[4962]: I0220 09:56:54.967291 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:54Z","lastTransitionTime":"2026-02-20T09:56:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.069628 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.069670 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.069682 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.069701 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.069713 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:55Z","lastTransitionTime":"2026-02-20T09:56:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.139170 4962 scope.go:117] "RemoveContainer" containerID="0ad31dcb6721b49faaa0b2df2afa2f3d990f410ad4c80b3b5ebb18c428603add" Feb 20 09:56:55 crc kubenswrapper[4962]: E0220 09:56:55.139400 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-99b2s_openshift-ovn-kubernetes(2abd2b70-bb78-49a0-b930-cd066384e803)\"" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.172485 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.172536 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.172549 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.172568 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.172581 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:55Z","lastTransitionTime":"2026-02-20T09:56:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.204779 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 22:11:40.400795172 +0000 UTC Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.274824 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.274857 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.274865 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.274883 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.274896 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:55Z","lastTransitionTime":"2026-02-20T09:56:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.377258 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.377298 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.377309 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.377334 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.377347 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:55Z","lastTransitionTime":"2026-02-20T09:56:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.479877 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.479914 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.479923 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.479939 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.479950 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:55Z","lastTransitionTime":"2026-02-20T09:56:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.582570 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.582614 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.582623 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.582638 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.582648 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:55Z","lastTransitionTime":"2026-02-20T09:56:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.684968 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.685012 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.685024 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.685044 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.685057 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:55Z","lastTransitionTime":"2026-02-20T09:56:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.787803 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.787842 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.787857 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.787878 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.787911 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:55Z","lastTransitionTime":"2026-02-20T09:56:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.890197 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.890244 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.890258 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.890275 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.890287 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:55Z","lastTransitionTime":"2026-02-20T09:56:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.993089 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.993144 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.993152 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.993184 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:55 crc kubenswrapper[4962]: I0220 09:56:55.993195 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:55Z","lastTransitionTime":"2026-02-20T09:56:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.096875 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.096925 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.096938 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.096962 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.096978 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:56Z","lastTransitionTime":"2026-02-20T09:56:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.138339 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.138377 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.138415 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.138538 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:56:56 crc kubenswrapper[4962]: E0220 09:56:56.138716 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:56:56 crc kubenswrapper[4962]: E0220 09:56:56.138951 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:56:56 crc kubenswrapper[4962]: E0220 09:56:56.139055 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:56:56 crc kubenswrapper[4962]: E0220 09:56:56.138993 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76" Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.200130 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.200187 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.200202 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.200225 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.200240 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:56Z","lastTransitionTime":"2026-02-20T09:56:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.205413 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 00:07:53.081491484 +0000 UTC Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.303904 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.303980 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.303998 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.304023 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.304042 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:56Z","lastTransitionTime":"2026-02-20T09:56:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.406829 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.406876 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.406889 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.406905 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.406916 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:56Z","lastTransitionTime":"2026-02-20T09:56:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.509329 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.509375 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.509387 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.509404 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.509414 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:56Z","lastTransitionTime":"2026-02-20T09:56:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.611908 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.611952 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.611966 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.611986 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.612001 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:56Z","lastTransitionTime":"2026-02-20T09:56:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.715120 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.715172 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.715183 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.715203 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.715215 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:56Z","lastTransitionTime":"2026-02-20T09:56:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.819998 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.820041 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.820054 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.820070 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.820080 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:56Z","lastTransitionTime":"2026-02-20T09:56:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.922584 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.922671 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.922687 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.922707 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.922720 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:56Z","lastTransitionTime":"2026-02-20T09:56:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.958484 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.958571 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.958622 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.958651 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 09:56:56 crc kubenswrapper[4962]: I0220 09:56:56.958671 4962 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T09:56:56Z","lastTransitionTime":"2026-02-20T09:56:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 09:56:57 crc kubenswrapper[4962]: I0220 09:56:57.013991 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-chgrz"] Feb 20 09:56:57 crc kubenswrapper[4962]: I0220 09:56:57.014566 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-chgrz" Feb 20 09:56:57 crc kubenswrapper[4962]: I0220 09:56:57.018012 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 20 09:56:57 crc kubenswrapper[4962]: I0220 09:56:57.018234 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 20 09:56:57 crc kubenswrapper[4962]: I0220 09:56:57.018261 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 20 09:56:57 crc kubenswrapper[4962]: I0220 09:56:57.018324 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 20 09:56:57 crc kubenswrapper[4962]: I0220 09:56:57.033816 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-7hj8w" podStartSLOduration=88.033667606 podStartE2EDuration="1m28.033667606s" podCreationTimestamp="2026-02-20 09:55:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:56:57.032518901 +0000 UTC m=+108.614990747" watchObservedRunningTime="2026-02-20 09:56:57.033667606 +0000 UTC m=+108.616139452" Feb 20 09:56:57 crc kubenswrapper[4962]: I0220 09:56:57.048663 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podStartSLOduration=88.048632646 podStartE2EDuration="1m28.048632646s" podCreationTimestamp="2026-02-20 09:55:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:56:57.046950814 +0000 UTC m=+108.629422700" watchObservedRunningTime="2026-02-20 09:56:57.048632646 +0000 UTC m=+108.631104532" Feb 20 09:56:57 crc kubenswrapper[4962]: I0220 09:56:57.082922 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/c5229176-0c7f-4323-87a3-b9a848df3af0-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-chgrz\" (UID: \"c5229176-0c7f-4323-87a3-b9a848df3af0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-chgrz" Feb 20 09:56:57 crc kubenswrapper[4962]: I0220 09:56:57.082985 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c5229176-0c7f-4323-87a3-b9a848df3af0-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-chgrz\" (UID: \"c5229176-0c7f-4323-87a3-b9a848df3af0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-chgrz" Feb 20 09:56:57 crc kubenswrapper[4962]: I0220 09:56:57.083023 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c5229176-0c7f-4323-87a3-b9a848df3af0-service-ca\") pod 
\"cluster-version-operator-5c965bbfc6-chgrz\" (UID: \"c5229176-0c7f-4323-87a3-b9a848df3af0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-chgrz" Feb 20 09:56:57 crc kubenswrapper[4962]: I0220 09:56:57.083123 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/c5229176-0c7f-4323-87a3-b9a848df3af0-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-chgrz\" (UID: \"c5229176-0c7f-4323-87a3-b9a848df3af0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-chgrz" Feb 20 09:56:57 crc kubenswrapper[4962]: I0220 09:56:57.083146 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c5229176-0c7f-4323-87a3-b9a848df3af0-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-chgrz\" (UID: \"c5229176-0c7f-4323-87a3-b9a848df3af0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-chgrz" Feb 20 09:56:57 crc kubenswrapper[4962]: I0220 09:56:57.184236 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/c5229176-0c7f-4323-87a3-b9a848df3af0-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-chgrz\" (UID: \"c5229176-0c7f-4323-87a3-b9a848df3af0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-chgrz" Feb 20 09:56:57 crc kubenswrapper[4962]: I0220 09:56:57.184282 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c5229176-0c7f-4323-87a3-b9a848df3af0-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-chgrz\" (UID: \"c5229176-0c7f-4323-87a3-b9a848df3af0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-chgrz" Feb 20 09:56:57 crc kubenswrapper[4962]: I0220 09:56:57.184301 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c5229176-0c7f-4323-87a3-b9a848df3af0-service-ca\") pod \"cluster-version-operator-5c965bbfc6-chgrz\" (UID: \"c5229176-0c7f-4323-87a3-b9a848df3af0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-chgrz" Feb 20 09:56:57 crc kubenswrapper[4962]: I0220 09:56:57.184335 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/c5229176-0c7f-4323-87a3-b9a848df3af0-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-chgrz\" (UID: \"c5229176-0c7f-4323-87a3-b9a848df3af0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-chgrz" Feb 20 09:56:57 crc kubenswrapper[4962]: I0220 09:56:57.184352 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c5229176-0c7f-4323-87a3-b9a848df3af0-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-chgrz\" (UID: \"c5229176-0c7f-4323-87a3-b9a848df3af0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-chgrz" Feb 20 09:56:57 crc kubenswrapper[4962]: I0220 09:56:57.184390 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/c5229176-0c7f-4323-87a3-b9a848df3af0-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-chgrz\" (UID: 
\"c5229176-0c7f-4323-87a3-b9a848df3af0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-chgrz" Feb 20 09:56:57 crc kubenswrapper[4962]: I0220 09:56:57.184415 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/c5229176-0c7f-4323-87a3-b9a848df3af0-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-chgrz\" (UID: \"c5229176-0c7f-4323-87a3-b9a848df3af0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-chgrz" Feb 20 09:56:57 crc kubenswrapper[4962]: I0220 09:56:57.190605 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c5229176-0c7f-4323-87a3-b9a848df3af0-service-ca\") pod \"cluster-version-operator-5c965bbfc6-chgrz\" (UID: \"c5229176-0c7f-4323-87a3-b9a848df3af0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-chgrz" Feb 20 09:56:57 crc kubenswrapper[4962]: I0220 09:56:57.191477 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c5229176-0c7f-4323-87a3-b9a848df3af0-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-chgrz\" (UID: \"c5229176-0c7f-4323-87a3-b9a848df3af0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-chgrz" Feb 20 09:56:57 crc kubenswrapper[4962]: I0220 09:56:57.205533 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 00:25:48.983256009 +0000 UTC Feb 20 09:56:57 crc kubenswrapper[4962]: I0220 09:56:57.205730 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Feb 20 09:56:57 crc kubenswrapper[4962]: I0220 09:56:57.206024 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c5229176-0c7f-4323-87a3-b9a848df3af0-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-chgrz\" (UID: \"c5229176-0c7f-4323-87a3-b9a848df3af0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-chgrz" Feb 20 09:56:57 crc kubenswrapper[4962]: I0220 09:56:57.215019 4962 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 20 09:56:57 crc kubenswrapper[4962]: I0220 09:56:57.337708 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-chgrz" Feb 20 09:56:57 crc kubenswrapper[4962]: I0220 09:56:57.789513 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-chgrz" event={"ID":"c5229176-0c7f-4323-87a3-b9a848df3af0","Type":"ContainerStarted","Data":"415608f9120a0f246bb97bd6289d0fa63e5ee629011761737cd2277a75e86e19"} Feb 20 09:56:57 crc kubenswrapper[4962]: I0220 09:56:57.789560 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-chgrz" event={"ID":"c5229176-0c7f-4323-87a3-b9a848df3af0","Type":"ContainerStarted","Data":"0601267d1dadf5d213a406f50a286f14eef4de193b7dc1765e78527715c629af"} Feb 20 09:56:57 crc kubenswrapper[4962]: I0220 09:56:57.809995 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-chgrz" podStartSLOduration=88.809970392 podStartE2EDuration="1m28.809970392s" podCreationTimestamp="2026-02-20 09:55:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:56:57.808203247 +0000 UTC m=+109.390675103" watchObservedRunningTime="2026-02-20 09:56:57.809970392 +0000 UTC m=+109.392442248" Feb 20 09:56:58 crc kubenswrapper[4962]: I0220 09:56:58.138043 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:56:58 crc kubenswrapper[4962]: I0220 09:56:58.138162 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:56:58 crc kubenswrapper[4962]: E0220 09:56:58.138189 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:56:58 crc kubenswrapper[4962]: I0220 09:56:58.138246 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:56:58 crc kubenswrapper[4962]: I0220 09:56:58.138293 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:56:58 crc kubenswrapper[4962]: E0220 09:56:58.138474 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:56:58 crc kubenswrapper[4962]: E0220 09:56:58.138682 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:56:58 crc kubenswrapper[4962]: E0220 09:56:58.138874 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76" Feb 20 09:57:00 crc kubenswrapper[4962]: I0220 09:57:00.138244 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:57:00 crc kubenswrapper[4962]: I0220 09:57:00.138278 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:57:00 crc kubenswrapper[4962]: I0220 09:57:00.138335 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:57:00 crc kubenswrapper[4962]: I0220 09:57:00.138641 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:57:00 crc kubenswrapper[4962]: E0220 09:57:00.138884 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:57:00 crc kubenswrapper[4962]: E0220 09:57:00.139001 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:57:00 crc kubenswrapper[4962]: E0220 09:57:00.139103 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76" Feb 20 09:57:00 crc kubenswrapper[4962]: E0220 09:57:00.139214 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:57:02 crc kubenswrapper[4962]: I0220 09:57:02.138425 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:57:02 crc kubenswrapper[4962]: I0220 09:57:02.138496 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:57:02 crc kubenswrapper[4962]: I0220 09:57:02.138501 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:57:02 crc kubenswrapper[4962]: E0220 09:57:02.138635 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:57:02 crc kubenswrapper[4962]: I0220 09:57:02.138673 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:57:02 crc kubenswrapper[4962]: E0220 09:57:02.138844 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:57:02 crc kubenswrapper[4962]: E0220 09:57:02.138962 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:57:02 crc kubenswrapper[4962]: E0220 09:57:02.139117 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76" Feb 20 09:57:03 crc kubenswrapper[4962]: I0220 09:57:03.811309 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wqwgj_1957ac70-30f9-48c2-a82b-72aa3b7a883a/kube-multus/1.log" Feb 20 09:57:03 crc kubenswrapper[4962]: I0220 09:57:03.811854 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wqwgj_1957ac70-30f9-48c2-a82b-72aa3b7a883a/kube-multus/0.log" Feb 20 09:57:03 crc kubenswrapper[4962]: I0220 09:57:03.811907 4962 generic.go:334] "Generic (PLEG): container finished" podID="1957ac70-30f9-48c2-a82b-72aa3b7a883a" containerID="330fcac483de40973468483bb1e7d1a3978f3e5fb4144bc0efaa58cf02e30e67" exitCode=1 Feb 20 09:57:03 crc kubenswrapper[4962]: I0220 09:57:03.811960 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wqwgj" event={"ID":"1957ac70-30f9-48c2-a82b-72aa3b7a883a","Type":"ContainerDied","Data":"330fcac483de40973468483bb1e7d1a3978f3e5fb4144bc0efaa58cf02e30e67"} Feb 20 09:57:03 crc kubenswrapper[4962]: I0220 09:57:03.812009 4962 scope.go:117] "RemoveContainer" containerID="e4f1f32eda1cc801a5b1f6d84207120c48c1bcca494a88bdfa95fb05bf82f661" Feb 20 09:57:03 crc kubenswrapper[4962]: I0220 09:57:03.812940 4962 scope.go:117] "RemoveContainer" containerID="330fcac483de40973468483bb1e7d1a3978f3e5fb4144bc0efaa58cf02e30e67" Feb 20 09:57:03 crc kubenswrapper[4962]: E0220 09:57:03.813345 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-wqwgj_openshift-multus(1957ac70-30f9-48c2-a82b-72aa3b7a883a)\"" pod="openshift-multus/multus-wqwgj" podUID="1957ac70-30f9-48c2-a82b-72aa3b7a883a" Feb 20 09:57:04 crc kubenswrapper[4962]: I0220 09:57:04.138078 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:57:04 crc kubenswrapper[4962]: I0220 09:57:04.138124 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:57:04 crc kubenswrapper[4962]: E0220 09:57:04.138207 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:57:04 crc kubenswrapper[4962]: I0220 09:57:04.138364 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:57:04 crc kubenswrapper[4962]: I0220 09:57:04.138364 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:57:04 crc kubenswrapper[4962]: E0220 09:57:04.138523 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:57:04 crc kubenswrapper[4962]: E0220 09:57:04.138830 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76" Feb 20 09:57:04 crc kubenswrapper[4962]: E0220 09:57:04.138920 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:57:04 crc kubenswrapper[4962]: I0220 09:57:04.818733 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wqwgj_1957ac70-30f9-48c2-a82b-72aa3b7a883a/kube-multus/1.log" Feb 20 09:57:06 crc kubenswrapper[4962]: I0220 09:57:06.138211 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:57:06 crc kubenswrapper[4962]: E0220 09:57:06.138667 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:57:06 crc kubenswrapper[4962]: I0220 09:57:06.138297 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:57:06 crc kubenswrapper[4962]: E0220 09:57:06.138763 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76" Feb 20 09:57:06 crc kubenswrapper[4962]: I0220 09:57:06.138326 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:57:06 crc kubenswrapper[4962]: I0220 09:57:06.138247 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:57:06 crc kubenswrapper[4962]: E0220 09:57:06.138890 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:57:06 crc kubenswrapper[4962]: E0220 09:57:06.138973 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:57:08 crc kubenswrapper[4962]: I0220 09:57:08.138709 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:57:08 crc kubenswrapper[4962]: I0220 09:57:08.138784 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:57:08 crc kubenswrapper[4962]: I0220 09:57:08.138854 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:57:08 crc kubenswrapper[4962]: E0220 09:57:08.139073 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76" Feb 20 09:57:08 crc kubenswrapper[4962]: I0220 09:57:08.139151 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:57:08 crc kubenswrapper[4962]: E0220 09:57:08.139311 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:57:08 crc kubenswrapper[4962]: E0220 09:57:08.139496 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:57:08 crc kubenswrapper[4962]: E0220 09:57:08.139655 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:57:08 crc kubenswrapper[4962]: I0220 09:57:08.140260 4962 scope.go:117] "RemoveContainer" containerID="0ad31dcb6721b49faaa0b2df2afa2f3d990f410ad4c80b3b5ebb18c428603add" Feb 20 09:57:08 crc kubenswrapper[4962]: I0220 09:57:08.834920 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-99b2s_2abd2b70-bb78-49a0-b930-cd066384e803/ovnkube-controller/3.log" Feb 20 09:57:08 crc kubenswrapper[4962]: I0220 09:57:08.837957 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" event={"ID":"2abd2b70-bb78-49a0-b930-cd066384e803","Type":"ContainerStarted","Data":"632f598a6bf02b19952a264fbef27b389b05daf66577d38a14dbb23054d239af"} Feb 20 09:57:08 crc kubenswrapper[4962]: I0220 09:57:08.838658 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:57:08 crc kubenswrapper[4962]: I0220 09:57:08.885229 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" podStartSLOduration=99.885204429 podStartE2EDuration="1m39.885204429s" podCreationTimestamp="2026-02-20 09:55:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:57:08.884661012 +0000 UTC m=+120.467132898" watchObservedRunningTime="2026-02-20 09:57:08.885204429 +0000 UTC m=+120.467676295" Feb 20 09:57:09 crc kubenswrapper[4962]: I0220 09:57:09.025696 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-5bwk2"] Feb 20 09:57:09 crc kubenswrapper[4962]: I0220 09:57:09.025837 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:57:09 crc kubenswrapper[4962]: E0220 09:57:09.025960 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76" Feb 20 09:57:09 crc kubenswrapper[4962]: E0220 09:57:09.106804 4962 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Feb 20 09:57:09 crc kubenswrapper[4962]: E0220 09:57:09.230172 4962 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 20 09:57:10 crc kubenswrapper[4962]: I0220 09:57:10.138285 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:57:10 crc kubenswrapper[4962]: I0220 09:57:10.138355 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:57:10 crc kubenswrapper[4962]: I0220 09:57:10.138356 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:57:10 crc kubenswrapper[4962]: E0220 09:57:10.138455 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:57:10 crc kubenswrapper[4962]: E0220 09:57:10.138583 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:57:10 crc kubenswrapper[4962]: E0220 09:57:10.138734 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:57:11 crc kubenswrapper[4962]: I0220 09:57:11.137969 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:57:11 crc kubenswrapper[4962]: E0220 09:57:11.138177 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76" Feb 20 09:57:12 crc kubenswrapper[4962]: I0220 09:57:12.138125 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:57:12 crc kubenswrapper[4962]: I0220 09:57:12.138271 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:57:12 crc kubenswrapper[4962]: I0220 09:57:12.138412 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:57:12 crc kubenswrapper[4962]: E0220 09:57:12.138583 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:57:12 crc kubenswrapper[4962]: E0220 09:57:12.138741 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:57:12 crc kubenswrapper[4962]: E0220 09:57:12.138809 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:57:13 crc kubenswrapper[4962]: I0220 09:57:13.138183 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:57:13 crc kubenswrapper[4962]: E0220 09:57:13.138386 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76" Feb 20 09:57:14 crc kubenswrapper[4962]: I0220 09:57:14.137890 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:57:14 crc kubenswrapper[4962]: E0220 09:57:14.138085 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:57:14 crc kubenswrapper[4962]: I0220 09:57:14.138203 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:57:14 crc kubenswrapper[4962]: E0220 09:57:14.138294 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:57:14 crc kubenswrapper[4962]: I0220 09:57:14.138365 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:57:14 crc kubenswrapper[4962]: E0220 09:57:14.138441 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:57:14 crc kubenswrapper[4962]: E0220 09:57:14.232523 4962 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 20 09:57:15 crc kubenswrapper[4962]: I0220 09:57:15.138321 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:57:15 crc kubenswrapper[4962]: E0220 09:57:15.138536 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76" Feb 20 09:57:16 crc kubenswrapper[4962]: I0220 09:57:16.138987 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:57:16 crc kubenswrapper[4962]: I0220 09:57:16.139196 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:57:16 crc kubenswrapper[4962]: I0220 09:57:16.139203 4962 scope.go:117] "RemoveContainer" containerID="330fcac483de40973468483bb1e7d1a3978f3e5fb4144bc0efaa58cf02e30e67" Feb 20 09:57:16 crc kubenswrapper[4962]: I0220 09:57:16.139042 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:57:16 crc kubenswrapper[4962]: E0220 09:57:16.139355 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:57:16 crc kubenswrapper[4962]: E0220 09:57:16.141124 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:57:16 crc kubenswrapper[4962]: E0220 09:57:16.141399 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:57:16 crc kubenswrapper[4962]: I0220 09:57:16.870956 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wqwgj_1957ac70-30f9-48c2-a82b-72aa3b7a883a/kube-multus/1.log" Feb 20 09:57:16 crc kubenswrapper[4962]: I0220 09:57:16.871437 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wqwgj" event={"ID":"1957ac70-30f9-48c2-a82b-72aa3b7a883a","Type":"ContainerStarted","Data":"1bcd3b5d415fdd3c80c493728dbec002cdd2c25c6bba4eb1580552f0bd623cef"} Feb 20 09:57:17 crc kubenswrapper[4962]: I0220 09:57:17.138153 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:57:17 crc kubenswrapper[4962]: E0220 09:57:17.138329 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76" Feb 20 09:57:17 crc kubenswrapper[4962]: I0220 09:57:17.432589 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 09:57:18 crc kubenswrapper[4962]: I0220 09:57:18.138414 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:57:18 crc kubenswrapper[4962]: E0220 09:57:18.138579 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 09:57:18 crc kubenswrapper[4962]: I0220 09:57:18.138576 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:57:18 crc kubenswrapper[4962]: I0220 09:57:18.138694 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:57:18 crc kubenswrapper[4962]: E0220 09:57:18.138941 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 09:57:18 crc kubenswrapper[4962]: E0220 09:57:18.139029 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 09:57:19 crc kubenswrapper[4962]: I0220 09:57:19.138100 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:57:19 crc kubenswrapper[4962]: E0220 09:57:19.139121 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bwk2" podUID="d590527b-ed56-4fb4-a712-b09781618a76" Feb 20 09:57:20 crc kubenswrapper[4962]: I0220 09:57:20.138740 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:57:20 crc kubenswrapper[4962]: I0220 09:57:20.138794 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:57:20 crc kubenswrapper[4962]: I0220 09:57:20.138979 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:57:20 crc kubenswrapper[4962]: I0220 09:57:20.142482 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Feb 20 09:57:20 crc kubenswrapper[4962]: I0220 09:57:20.143881 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 20 09:57:20 crc kubenswrapper[4962]: I0220 09:57:20.144040 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Feb 20 09:57:20 crc kubenswrapper[4962]: I0220 09:57:20.144201 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Feb 20 09:57:21 crc kubenswrapper[4962]: I0220 09:57:21.138665 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:57:21 crc kubenswrapper[4962]: I0220 09:57:21.142783 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 20 09:57:21 crc kubenswrapper[4962]: I0220 09:57:21.142805 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.462701 4962 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.525413 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-vp5tl"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.526397 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vp5tl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.528984 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-jtftl"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.532913 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.533770 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.536939 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.537133 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-szbwm"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.547088 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-szbwm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.547325 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.547406 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-jtftl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.547479 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.547942 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.547957 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.548461 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.548672 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.548791 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/37e7b911-da73-4f82-ad0c-d8707547b7a7-audit-dir\") pod \"apiserver-76f77b778f-jtftl\" (UID: \"37e7b911-da73-4f82-ad0c-d8707547b7a7\") " pod="openshift-apiserver/apiserver-76f77b778f-jtftl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.548807 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-758rq"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.548859 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/37e7b911-da73-4f82-ad0c-d8707547b7a7-etcd-serving-ca\") pod \"apiserver-76f77b778f-jtftl\" (UID: \"37e7b911-da73-4f82-ad0c-d8707547b7a7\") " pod="openshift-apiserver/apiserver-76f77b778f-jtftl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.548919 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/37e7b911-da73-4f82-ad0c-d8707547b7a7-image-import-ca\") pod \"apiserver-76f77b778f-jtftl\" (UID: \"37e7b911-da73-4f82-ad0c-d8707547b7a7\") " pod="openshift-apiserver/apiserver-76f77b778f-jtftl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.548958 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/37e7b911-da73-4f82-ad0c-d8707547b7a7-serving-cert\") pod \"apiserver-76f77b778f-jtftl\" (UID: \"37e7b911-da73-4f82-ad0c-d8707547b7a7\") " pod="openshift-apiserver/apiserver-76f77b778f-jtftl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.549013 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/37e7b911-da73-4f82-ad0c-d8707547b7a7-node-pullsecrets\") pod \"apiserver-76f77b778f-jtftl\" (UID: \"37e7b911-da73-4f82-ad0c-d8707547b7a7\") " pod="openshift-apiserver/apiserver-76f77b778f-jtftl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.549069 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37e7b911-da73-4f82-ad0c-d8707547b7a7-config\") pod \"apiserver-76f77b778f-jtftl\" (UID: 
\"37e7b911-da73-4f82-ad0c-d8707547b7a7\") " pod="openshift-apiserver/apiserver-76f77b778f-jtftl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.549102 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/37e7b911-da73-4f82-ad0c-d8707547b7a7-audit\") pod \"apiserver-76f77b778f-jtftl\" (UID: \"37e7b911-da73-4f82-ad0c-d8707547b7a7\") " pod="openshift-apiserver/apiserver-76f77b778f-jtftl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.549132 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/37e7b911-da73-4f82-ad0c-d8707547b7a7-trusted-ca-bundle\") pod \"apiserver-76f77b778f-jtftl\" (UID: \"37e7b911-da73-4f82-ad0c-d8707547b7a7\") " pod="openshift-apiserver/apiserver-76f77b778f-jtftl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.549173 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/37e7b911-da73-4f82-ad0c-d8707547b7a7-etcd-client\") pod \"apiserver-76f77b778f-jtftl\" (UID: \"37e7b911-da73-4f82-ad0c-d8707547b7a7\") " pod="openshift-apiserver/apiserver-76f77b778f-jtftl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.549204 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/37e7b911-da73-4f82-ad0c-d8707547b7a7-encryption-config\") pod \"apiserver-76f77b778f-jtftl\" (UID: \"37e7b911-da73-4f82-ad0c-d8707547b7a7\") " pod="openshift-apiserver/apiserver-76f77b778f-jtftl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.549239 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79w4l\" (UniqueName: \"kubernetes.io/projected/37e7b911-da73-4f82-ad0c-d8707547b7a7-kube-api-access-79w4l\") pod \"apiserver-76f77b778f-jtftl\" (UID: \"37e7b911-da73-4f82-ad0c-d8707547b7a7\") " pod="openshift-apiserver/apiserver-76f77b778f-jtftl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.549499 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-758rq" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.550382 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-gqpsl"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.551245 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gqpsl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.560326 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.561126 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-txw2z"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.561256 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.561718 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-k85np"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.562044 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-tp9zq"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.562219 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-txw2z" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.562583 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-tp9zq" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.562230 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-k85np" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.565672 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-mrzbm"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.566539 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8jt7t"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.566968 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.567727 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.567756 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-xkmtn"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.567874 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.568943 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.569025 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8jt7t" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.569073 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-xkmtn" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.577086 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.578269 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4k8pc"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.579131 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bmp44"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.579654 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-nwfk6"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.580423 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-nwfk6" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.581494 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4k8pc" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.581823 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bmp44" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.583667 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-tv8j9"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.584380 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-tv8j9" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.584665 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-5lf26"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.585225 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-5lf26" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.587653 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gqbxv"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.588101 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gqbxv" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.588536 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-dc74p"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.589204 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dc74p" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.590776 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-8t82g"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.591360 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8t82g" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.593853 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.593906 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.594369 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.594581 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.594857 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.595058 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.595375 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.595516 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.598518 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-5nqkf"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.599304 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-8pks8"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.599450 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.599960 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.600013 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5nqkf" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.600113 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.600155 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.600435 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.600641 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.600803 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.600821 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.600954 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.601035 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.601096 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.601159 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.601246 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.601336 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.601420 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.601531 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.612077 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.612689 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.599961 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.618439 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.620711 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.621363 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.621569 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.621882 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.622187 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.622461 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.623320 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.625929 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.635099 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.635311 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.635615 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.635739 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.635865 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.636368 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.636509 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.636636 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.636895 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 
09:57:27.637008 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.637183 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-wndb7"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.638440 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.638745 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.638952 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.639134 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.639440 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.639665 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.640025 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.640259 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.640476 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.644066 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hqbh2"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.645493 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.645840 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.646415 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.646513 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.646563 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.646690 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.646966 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.647036 4962 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hc9h5"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.647119 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.647228 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.647293 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.647318 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.647427 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.647431 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.647464 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.647577 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.647683 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hc9h5" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.647708 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hqbh2" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.647987 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-wndb7" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.649481 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.649687 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.649904 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.650383 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.651368 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6adbe475-48f9-4ba3-82bd-b36bcd939168-serving-cert\") pod \"controller-manager-879f6c89f-szbwm\" (UID: \"6adbe475-48f9-4ba3-82bd-b36bcd939168\") " pod="openshift-controller-manager/controller-manager-879f6c89f-szbwm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.651407 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d962f6fe-d955-483d-b149-976a11dd4922-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-4k8pc\" (UID: \"d962f6fe-d955-483d-b149-976a11dd4922\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4k8pc" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.651437 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/b55a13cf-03c6-46d9-b286-960a839b1558-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-8jt7t\" (UID: \"b55a13cf-03c6-46d9-b286-960a839b1558\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8jt7t" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.651468 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/37e7b911-da73-4f82-ad0c-d8707547b7a7-node-pullsecrets\") pod \"apiserver-76f77b778f-jtftl\" (UID: \"37e7b911-da73-4f82-ad0c-d8707547b7a7\") " pod="openshift-apiserver/apiserver-76f77b778f-jtftl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.651491 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cba11394-4e55-4edc-beec-750bddabc1d0-serving-cert\") pod \"apiserver-7bbb656c7d-vp5tl\" (UID: \"cba11394-4e55-4edc-beec-750bddabc1d0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vp5tl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.651513 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfswd\" (UniqueName: \"kubernetes.io/projected/474b1e5d-9a6f-4931-be66-8fb20c82ac60-kube-api-access-nfswd\") pod \"migrator-59844c95c7-dc74p\" (UID: \"474b1e5d-9a6f-4931-be66-8fb20c82ac60\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dc74p" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.651544 4962 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37e7b911-da73-4f82-ad0c-d8707547b7a7-config\") pod \"apiserver-76f77b778f-jtftl\" (UID: \"37e7b911-da73-4f82-ad0c-d8707547b7a7\") " pod="openshift-apiserver/apiserver-76f77b778f-jtftl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.651562 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-mrzbm\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.652899 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/37e7b911-da73-4f82-ad0c-d8707547b7a7-trusted-ca-bundle\") pod \"apiserver-76f77b778f-jtftl\" (UID: \"37e7b911-da73-4f82-ad0c-d8707547b7a7\") " pod="openshift-apiserver/apiserver-76f77b778f-jtftl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.662965 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/37e7b911-da73-4f82-ad0c-d8707547b7a7-node-pullsecrets\") pod \"apiserver-76f77b778f-jtftl\" (UID: \"37e7b911-da73-4f82-ad0c-d8707547b7a7\") " pod="openshift-apiserver/apiserver-76f77b778f-jtftl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.663713 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.664016 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.665436 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37e7b911-da73-4f82-ad0c-d8707547b7a7-config\") pod \"apiserver-76f77b778f-jtftl\" (UID: \"37e7b911-da73-4f82-ad0c-d8707547b7a7\") " pod="openshift-apiserver/apiserver-76f77b778f-jtftl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.665901 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.666170 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/cba11394-4e55-4edc-beec-750bddabc1d0-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-vp5tl\" (UID: \"cba11394-4e55-4edc-beec-750bddabc1d0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vp5tl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.666273 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f8161f87-3814-4d02-84ff-b94b8b05c59e-audit-dir\") pod \"oauth-openshift-558db77b4-mrzbm\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.669908 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s98r7\" (UniqueName: 
\"kubernetes.io/projected/34f42578-fcc9-4539-add3-bca8deb6927b-kube-api-access-s98r7\") pod \"dns-operator-744455d44c-5lf26\" (UID: \"34f42578-fcc9-4539-add3-bca8deb6927b\") " pod="openshift-dns-operator/dns-operator-744455d44c-5lf26" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.669963 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/34f42578-fcc9-4539-add3-bca8deb6927b-metrics-tls\") pod \"dns-operator-744455d44c-5lf26\" (UID: \"34f42578-fcc9-4539-add3-bca8deb6927b\") " pod="openshift-dns-operator/dns-operator-744455d44c-5lf26" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.674875 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.677211 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79w4l\" (UniqueName: \"kubernetes.io/projected/37e7b911-da73-4f82-ad0c-d8707547b7a7-kube-api-access-79w4l\") pod \"apiserver-76f77b778f-jtftl\" (UID: \"37e7b911-da73-4f82-ad0c-d8707547b7a7\") " pod="openshift-apiserver/apiserver-76f77b778f-jtftl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.677261 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-mrzbm\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.677401 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dce6ddda-3fcf-40bd-a085-a09f0bb811bf-config\") pod \"console-operator-58897d9998-k85np\" (UID: \"dce6ddda-3fcf-40bd-a085-a09f0bb811bf\") " pod="openshift-console-operator/console-operator-58897d9998-k85np" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.677645 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-mrzbm\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.677777 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5sb4\" (UniqueName: \"kubernetes.io/projected/77ff4d6a-8c1e-440f-a78c-900c09587848-kube-api-access-s5sb4\") pod \"multus-admission-controller-857f4d67dd-wndb7\" (UID: \"77ff4d6a-8c1e-440f-a78c-900c09587848\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-wndb7" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.677808 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8da2028c-f296-4f44-b010-b3abec9f6b98-client-ca\") pod \"route-controller-manager-6576b87f9c-758rq\" (UID: \"8da2028c-f296-4f44-b010-b3abec9f6b98\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-758rq" Feb 20 09:57:27 crc 
kubenswrapper[4962]: I0220 09:57:27.677827 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrg46\" (UniqueName: \"kubernetes.io/projected/75c3ba8d-4548-4407-9188-a785ef05da2c-kube-api-access-lrg46\") pod \"control-plane-machine-set-operator-78cbb6b69f-hc9h5\" (UID: \"75c3ba8d-4548-4407-9188-a785ef05da2c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hc9h5" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.677949 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/cba11394-4e55-4edc-beec-750bddabc1d0-encryption-config\") pod \"apiserver-7bbb656c7d-vp5tl\" (UID: \"cba11394-4e55-4edc-beec-750bddabc1d0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vp5tl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.677978 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/37e7b911-da73-4f82-ad0c-d8707547b7a7-audit-dir\") pod \"apiserver-76f77b778f-jtftl\" (UID: \"37e7b911-da73-4f82-ad0c-d8707547b7a7\") " pod="openshift-apiserver/apiserver-76f77b778f-jtftl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.678094 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cba11394-4e55-4edc-beec-750bddabc1d0-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-vp5tl\" (UID: \"cba11394-4e55-4edc-beec-750bddabc1d0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vp5tl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.678115 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2c9lr\" (UniqueName: \"kubernetes.io/projected/f8161f87-3814-4d02-84ff-b94b8b05c59e-kube-api-access-2c9lr\") pod \"oauth-openshift-558db77b4-mrzbm\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.678225 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.678304 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8da2028c-f296-4f44-b010-b3abec9f6b98-config\") pod \"route-controller-manager-6576b87f9c-758rq\" (UID: \"8da2028c-f296-4f44-b010-b3abec9f6b98\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-758rq" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.678454 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-mrzbm\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.678485 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dce6ddda-3fcf-40bd-a085-a09f0bb811bf-trusted-ca\") pod 
\"console-operator-58897d9998-k85np\" (UID: \"dce6ddda-3fcf-40bd-a085-a09f0bb811bf\") " pod="openshift-console-operator/console-operator-58897d9998-k85np" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.678716 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dpx6\" (UniqueName: \"kubernetes.io/projected/dce6ddda-3fcf-40bd-a085-a09f0bb811bf-kube-api-access-8dpx6\") pod \"console-operator-58897d9998-k85np\" (UID: \"dce6ddda-3fcf-40bd-a085-a09f0bb811bf\") " pod="openshift-console-operator/console-operator-58897d9998-k85np" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.678741 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6adbe475-48f9-4ba3-82bd-b36bcd939168-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-szbwm\" (UID: \"6adbe475-48f9-4ba3-82bd-b36bcd939168\") " pod="openshift-controller-manager/controller-manager-879f6c89f-szbwm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.678783 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.678880 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqhd4\" (UniqueName: \"kubernetes.io/projected/6adbe475-48f9-4ba3-82bd-b36bcd939168-kube-api-access-zqhd4\") pod \"controller-manager-879f6c89f-szbwm\" (UID: \"6adbe475-48f9-4ba3-82bd-b36bcd939168\") " pod="openshift-controller-manager/controller-manager-879f6c89f-szbwm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.678902 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f8161f87-3814-4d02-84ff-b94b8b05c59e-audit-policies\") pod \"oauth-openshift-558db77b4-mrzbm\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.678923 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljp6k\" (UniqueName: \"kubernetes.io/projected/68c1fde1-72ce-4ce0-ade8-9c8e7016464c-kube-api-access-ljp6k\") pod \"machine-config-operator-74547568cd-8t82g\" (UID: \"68c1fde1-72ce-4ce0-ade8-9c8e7016464c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8t82g" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.679068 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/37e7b911-da73-4f82-ad0c-d8707547b7a7-image-import-ca\") pod \"apiserver-76f77b778f-jtftl\" (UID: \"37e7b911-da73-4f82-ad0c-d8707547b7a7\") " pod="openshift-apiserver/apiserver-76f77b778f-jtftl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.679773 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.679955 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/68c1fde1-72ce-4ce0-ade8-9c8e7016464c-proxy-tls\") pod \"machine-config-operator-74547568cd-8t82g\" (UID: \"68c1fde1-72ce-4ce0-ade8-9c8e7016464c\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8t82g" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.680026 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/225d1d1d-8168-4489-af91-6a87f28c39ed-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-gqbxv\" (UID: \"225d1d1d-8168-4489-af91-6a87f28c39ed\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gqbxv" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.680060 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/37e7b911-da73-4f82-ad0c-d8707547b7a7-serving-cert\") pod \"apiserver-76f77b778f-jtftl\" (UID: \"37e7b911-da73-4f82-ad0c-d8707547b7a7\") " pod="openshift-apiserver/apiserver-76f77b778f-jtftl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.680081 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/cba11394-4e55-4edc-beec-750bddabc1d0-etcd-client\") pod \"apiserver-7bbb656c7d-vp5tl\" (UID: \"cba11394-4e55-4edc-beec-750bddabc1d0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vp5tl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.680098 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d962f6fe-d955-483d-b149-976a11dd4922-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-4k8pc\" (UID: \"d962f6fe-d955-483d-b149-976a11dd4922\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4k8pc" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.680129 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cdt5\" (UniqueName: \"kubernetes.io/projected/cba11394-4e55-4edc-beec-750bddabc1d0-kube-api-access-2cdt5\") pod \"apiserver-7bbb656c7d-vp5tl\" (UID: \"cba11394-4e55-4edc-beec-750bddabc1d0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vp5tl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.680149 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-mrzbm\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.680166 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/225d1d1d-8168-4489-af91-6a87f28c39ed-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-gqbxv\" (UID: \"225d1d1d-8168-4489-af91-6a87f28c39ed\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gqbxv" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.680203 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/68c1fde1-72ce-4ce0-ade8-9c8e7016464c-images\") pod \"machine-config-operator-74547568cd-8t82g\" (UID: \"68c1fde1-72ce-4ce0-ade8-9c8e7016464c\") 
" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8t82g" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.680228 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-945rx\" (UniqueName: \"kubernetes.io/projected/8da2028c-f296-4f44-b010-b3abec9f6b98-kube-api-access-945rx\") pod \"route-controller-manager-6576b87f9c-758rq\" (UID: \"8da2028c-f296-4f44-b010-b3abec9f6b98\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-758rq" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.680245 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-mrzbm\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.680271 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/37e7b911-da73-4f82-ad0c-d8707547b7a7-audit\") pod \"apiserver-76f77b778f-jtftl\" (UID: \"37e7b911-da73-4f82-ad0c-d8707547b7a7\") " pod="openshift-apiserver/apiserver-76f77b778f-jtftl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.680287 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-mrzbm\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.680315 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/37e7b911-da73-4f82-ad0c-d8707547b7a7-etcd-client\") pod \"apiserver-76f77b778f-jtftl\" (UID: \"37e7b911-da73-4f82-ad0c-d8707547b7a7\") " pod="openshift-apiserver/apiserver-76f77b778f-jtftl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.680333 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/37e7b911-da73-4f82-ad0c-d8707547b7a7-encryption-config\") pod \"apiserver-76f77b778f-jtftl\" (UID: \"37e7b911-da73-4f82-ad0c-d8707547b7a7\") " pod="openshift-apiserver/apiserver-76f77b778f-jtftl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.680353 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-mrzbm\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.680437 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/77ff4d6a-8c1e-440f-a78c-900c09587848-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-wndb7\" (UID: \"77ff4d6a-8c1e-440f-a78c-900c09587848\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-wndb7" Feb 20 09:57:27 crc 
kubenswrapper[4962]: I0220 09:57:27.680464 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cba11394-4e55-4edc-beec-750bddabc1d0-audit-dir\") pod \"apiserver-7bbb656c7d-vp5tl\" (UID: \"cba11394-4e55-4edc-beec-750bddabc1d0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vp5tl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.680481 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/225d1d1d-8168-4489-af91-6a87f28c39ed-config\") pod \"kube-controller-manager-operator-78b949d7b-gqbxv\" (UID: \"225d1d1d-8168-4489-af91-6a87f28c39ed\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gqbxv" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.680498 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/cba11394-4e55-4edc-beec-750bddabc1d0-audit-policies\") pod \"apiserver-7bbb656c7d-vp5tl\" (UID: \"cba11394-4e55-4edc-beec-750bddabc1d0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vp5tl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.680514 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-mrzbm\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.680548 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6adbe475-48f9-4ba3-82bd-b36bcd939168-config\") pod \"controller-manager-879f6c89f-szbwm\" (UID: \"6adbe475-48f9-4ba3-82bd-b36bcd939168\") " pod="openshift-controller-manager/controller-manager-879f6c89f-szbwm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.680567 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljckj\" (UniqueName: \"kubernetes.io/projected/b55a13cf-03c6-46d9-b286-960a839b1558-kube-api-access-ljckj\") pod \"cluster-samples-operator-665b6dd947-8jt7t\" (UID: \"b55a13cf-03c6-46d9-b286-960a839b1558\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8jt7t" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.680583 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/68c1fde1-72ce-4ce0-ade8-9c8e7016464c-auth-proxy-config\") pod \"machine-config-operator-74547568cd-8t82g\" (UID: \"68c1fde1-72ce-4ce0-ade8-9c8e7016464c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8t82g" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.680642 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/d962f6fe-d955-483d-b149-976a11dd4922-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-4k8pc\" (UID: \"d962f6fe-d955-483d-b149-976a11dd4922\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4k8pc" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.681161 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.681370 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.682628 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x98nf\" (UniqueName: \"kubernetes.io/projected/7ef2d9f9-34f2-48a6-83eb-689c0fdcac66-kube-api-access-x98nf\") pod \"downloads-7954f5f757-tv8j9\" (UID: \"7ef2d9f9-34f2-48a6-83eb-689c0fdcac66\") " pod="openshift-console/downloads-7954f5f757-tv8j9" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.682783 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6adbe475-48f9-4ba3-82bd-b36bcd939168-client-ca\") pod \"controller-manager-879f6c89f-szbwm\" (UID: \"6adbe475-48f9-4ba3-82bd-b36bcd939168\") " pod="openshift-controller-manager/controller-manager-879f6c89f-szbwm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.682808 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8da2028c-f296-4f44-b010-b3abec9f6b98-serving-cert\") pod \"route-controller-manager-6576b87f9c-758rq\" (UID: \"8da2028c-f296-4f44-b010-b3abec9f6b98\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-758rq" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.682826 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dce6ddda-3fcf-40bd-a085-a09f0bb811bf-serving-cert\") pod \"console-operator-58897d9998-k85np\" (UID: \"dce6ddda-3fcf-40bd-a085-a09f0bb811bf\") " pod="openshift-console-operator/console-operator-58897d9998-k85np" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.686637 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/37e7b911-da73-4f82-ad0c-d8707547b7a7-image-import-ca\") pod \"apiserver-76f77b778f-jtftl\" (UID: \"37e7b911-da73-4f82-ad0c-d8707547b7a7\") " pod="openshift-apiserver/apiserver-76f77b778f-jtftl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.677214 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-tcwqj"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.687388 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6dwbb"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.687781 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-vp5tl"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.687805 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-ckmh2"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.685565 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/37e7b911-da73-4f82-ad0c-d8707547b7a7-audit-dir\") pod \"apiserver-76f77b778f-jtftl\" (UID: \"37e7b911-da73-4f82-ad0c-d8707547b7a7\") " pod="openshift-apiserver/apiserver-76f77b778f-jtftl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.689213 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-tcwqj" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.689678 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6dwbb" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.684056 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.685018 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.690728 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-ckmh2" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.691795 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xxdr9"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.692637 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xxdr9" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.693168 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/37e7b911-da73-4f82-ad0c-d8707547b7a7-audit\") pod \"apiserver-76f77b778f-jtftl\" (UID: \"37e7b911-da73-4f82-ad0c-d8707547b7a7\") " pod="openshift-apiserver/apiserver-76f77b778f-jtftl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.693959 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/75c3ba8d-4548-4407-9188-a785ef05da2c-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-hc9h5\" (UID: \"75c3ba8d-4548-4407-9188-a785ef05da2c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hc9h5" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.693996 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/37e7b911-da73-4f82-ad0c-d8707547b7a7-etcd-serving-ca\") pod \"apiserver-76f77b778f-jtftl\" (UID: \"37e7b911-da73-4f82-ad0c-d8707547b7a7\") " pod="openshift-apiserver/apiserver-76f77b778f-jtftl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.694479 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smr5m\" (UniqueName: \"kubernetes.io/projected/d962f6fe-d955-483d-b149-976a11dd4922-kube-api-access-smr5m\") pod \"cluster-image-registry-operator-dc59b4c8b-4k8pc\" (UID: \"d962f6fe-d955-483d-b149-976a11dd4922\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4k8pc" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.694644 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-mrzbm\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.694725 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-mrzbm\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.695739 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/37e7b911-da73-4f82-ad0c-d8707547b7a7-etcd-serving-ca\") pod \"apiserver-76f77b778f-jtftl\" (UID: \"37e7b911-da73-4f82-ad0c-d8707547b7a7\") " pod="openshift-apiserver/apiserver-76f77b778f-jtftl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.701696 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/37e7b911-da73-4f82-ad0c-d8707547b7a7-etcd-client\") pod \"apiserver-76f77b778f-jtftl\" (UID: \"37e7b911-da73-4f82-ad0c-d8707547b7a7\") " pod="openshift-apiserver/apiserver-76f77b778f-jtftl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.701733 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.702264 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.702683 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.704454 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-758rq"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.705145 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.707022 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.707050 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-txw2z"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.707086 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-wd68v"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.707757 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-szbwm"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.707847 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wd68v" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.713473 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/37e7b911-da73-4f82-ad0c-d8707547b7a7-trusted-ca-bundle\") pod \"apiserver-76f77b778f-jtftl\" (UID: \"37e7b911-da73-4f82-ad0c-d8707547b7a7\") " pod="openshift-apiserver/apiserver-76f77b778f-jtftl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.715259 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.715833 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.716679 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/37e7b911-da73-4f82-ad0c-d8707547b7a7-serving-cert\") pod \"apiserver-76f77b778f-jtftl\" (UID: \"37e7b911-da73-4f82-ad0c-d8707547b7a7\") " pod="openshift-apiserver/apiserver-76f77b778f-jtftl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.722191 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/37e7b911-da73-4f82-ad0c-d8707547b7a7-encryption-config\") pod \"apiserver-76f77b778f-jtftl\" (UID: \"37e7b911-da73-4f82-ad0c-d8707547b7a7\") " pod="openshift-apiserver/apiserver-76f77b778f-jtftl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.726518 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-965lm"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.727184 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.727749 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mxcd2"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.728360 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-965lm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.730877 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.732203 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mxcd2" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.736361 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g6nc2"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.742100 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g6nc2" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.744673 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rsf8j"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.750332 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rsf8j" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.752045 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-tv8j9"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.752911 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-jtftl"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.753610 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.759051 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-m7z5r"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.760028 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-m7z5r" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.769424 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.772013 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-7q8sx"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.773027 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-7q8sx" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.774367 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-v2nvr"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.775152 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-v2nvr" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.776465 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-nwfk6"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.777610 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526345-4v6dw"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.778522 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526345-4v6dw" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.780099 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-lbvml"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.780714 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-lbvml" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.782828 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8jt7t"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.783992 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-tp9zq"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.785031 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-mrzbm"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.788245 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-4mw9f"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.790620 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.792543 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-8t82g"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.792616 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-dc74p"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.792756 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-4mw9f" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.793292 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-7nh4t"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.794558 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-l92fq"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.795153 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-7nh4t" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.795217 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-l92fq" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.796060 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/75c3ba8d-4548-4407-9188-a785ef05da2c-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-hc9h5\" (UID: \"75c3ba8d-4548-4407-9188-a785ef05da2c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hc9h5" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.796684 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6adbe475-48f9-4ba3-82bd-b36bcd939168-client-ca\") pod \"controller-manager-879f6c89f-szbwm\" (UID: \"6adbe475-48f9-4ba3-82bd-b36bcd939168\") " pod="openshift-controller-manager/controller-manager-879f6c89f-szbwm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.796714 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8da2028c-f296-4f44-b010-b3abec9f6b98-serving-cert\") pod \"route-controller-manager-6576b87f9c-758rq\" (UID: \"8da2028c-f296-4f44-b010-b3abec9f6b98\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-758rq" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.796742 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dce6ddda-3fcf-40bd-a085-a09f0bb811bf-serving-cert\") pod \"console-operator-58897d9998-k85np\" (UID: \"dce6ddda-3fcf-40bd-a085-a09f0bb811bf\") " pod="openshift-console-operator/console-operator-58897d9998-k85np" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.796772 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smr5m\" (UniqueName: \"kubernetes.io/projected/d962f6fe-d955-483d-b149-976a11dd4922-kube-api-access-smr5m\") pod \"cluster-image-registry-operator-dc59b4c8b-4k8pc\" (UID: \"d962f6fe-d955-483d-b149-976a11dd4922\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4k8pc" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.796812 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-mrzbm\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.796845 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-mrzbm\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.796873 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6adbe475-48f9-4ba3-82bd-b36bcd939168-serving-cert\") pod \"controller-manager-879f6c89f-szbwm\" (UID: \"6adbe475-48f9-4ba3-82bd-b36bcd939168\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-szbwm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.796899 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d962f6fe-d955-483d-b149-976a11dd4922-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-4k8pc\" (UID: \"d962f6fe-d955-483d-b149-976a11dd4922\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4k8pc" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.796925 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfswd\" (UniqueName: \"kubernetes.io/projected/474b1e5d-9a6f-4931-be66-8fb20c82ac60-kube-api-access-nfswd\") pod \"migrator-59844c95c7-dc74p\" (UID: \"474b1e5d-9a6f-4931-be66-8fb20c82ac60\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dc74p" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.796951 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/b55a13cf-03c6-46d9-b286-960a839b1558-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-8jt7t\" (UID: \"b55a13cf-03c6-46d9-b286-960a839b1558\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8jt7t" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.796974 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cba11394-4e55-4edc-beec-750bddabc1d0-serving-cert\") pod \"apiserver-7bbb656c7d-vp5tl\" (UID: \"cba11394-4e55-4edc-beec-750bddabc1d0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vp5tl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.797017 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-mrzbm\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.797051 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/cba11394-4e55-4edc-beec-750bddabc1d0-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-vp5tl\" (UID: \"cba11394-4e55-4edc-beec-750bddabc1d0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vp5tl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.797079 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f8161f87-3814-4d02-84ff-b94b8b05c59e-audit-dir\") pod \"oauth-openshift-558db77b4-mrzbm\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.797104 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s98r7\" (UniqueName: \"kubernetes.io/projected/34f42578-fcc9-4539-add3-bca8deb6927b-kube-api-access-s98r7\") pod \"dns-operator-744455d44c-5lf26\" (UID: \"34f42578-fcc9-4539-add3-bca8deb6927b\") " pod="openshift-dns-operator/dns-operator-744455d44c-5lf26" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.797131 4962 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-mrzbm\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.797160 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/34f42578-fcc9-4539-add3-bca8deb6927b-metrics-tls\") pod \"dns-operator-744455d44c-5lf26\" (UID: \"34f42578-fcc9-4539-add3-bca8deb6927b\") " pod="openshift-dns-operator/dns-operator-744455d44c-5lf26" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.797200 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dce6ddda-3fcf-40bd-a085-a09f0bb811bf-config\") pod \"console-operator-58897d9998-k85np\" (UID: \"dce6ddda-3fcf-40bd-a085-a09f0bb811bf\") " pod="openshift-console-operator/console-operator-58897d9998-k85np" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.797227 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5sb4\" (UniqueName: \"kubernetes.io/projected/77ff4d6a-8c1e-440f-a78c-900c09587848-kube-api-access-s5sb4\") pod \"multus-admission-controller-857f4d67dd-wndb7\" (UID: \"77ff4d6a-8c1e-440f-a78c-900c09587848\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-wndb7" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.797255 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-mrzbm\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.797279 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8da2028c-f296-4f44-b010-b3abec9f6b98-client-ca\") pod \"route-controller-manager-6576b87f9c-758rq\" (UID: \"8da2028c-f296-4f44-b010-b3abec9f6b98\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-758rq" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.797299 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrg46\" (UniqueName: \"kubernetes.io/projected/75c3ba8d-4548-4407-9188-a785ef05da2c-kube-api-access-lrg46\") pod \"control-plane-machine-set-operator-78cbb6b69f-hc9h5\" (UID: \"75c3ba8d-4548-4407-9188-a785ef05da2c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hc9h5" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.797319 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/cba11394-4e55-4edc-beec-750bddabc1d0-encryption-config\") pod \"apiserver-7bbb656c7d-vp5tl\" (UID: \"cba11394-4e55-4edc-beec-750bddabc1d0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vp5tl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.797349 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/cba11394-4e55-4edc-beec-750bddabc1d0-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-vp5tl\" (UID: \"cba11394-4e55-4edc-beec-750bddabc1d0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vp5tl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.797372 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2c9lr\" (UniqueName: \"kubernetes.io/projected/f8161f87-3814-4d02-84ff-b94b8b05c59e-kube-api-access-2c9lr\") pod \"oauth-openshift-558db77b4-mrzbm\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.797396 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dpx6\" (UniqueName: \"kubernetes.io/projected/dce6ddda-3fcf-40bd-a085-a09f0bb811bf-kube-api-access-8dpx6\") pod \"console-operator-58897d9998-k85np\" (UID: \"dce6ddda-3fcf-40bd-a085-a09f0bb811bf\") " pod="openshift-console-operator/console-operator-58897d9998-k85np" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.797431 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8da2028c-f296-4f44-b010-b3abec9f6b98-config\") pod \"route-controller-manager-6576b87f9c-758rq\" (UID: \"8da2028c-f296-4f44-b010-b3abec9f6b98\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-758rq" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.797457 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-mrzbm\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.797484 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dce6ddda-3fcf-40bd-a085-a09f0bb811bf-trusted-ca\") pod \"console-operator-58897d9998-k85np\" (UID: \"dce6ddda-3fcf-40bd-a085-a09f0bb811bf\") " pod="openshift-console-operator/console-operator-58897d9998-k85np" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.797506 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6adbe475-48f9-4ba3-82bd-b36bcd939168-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-szbwm\" (UID: \"6adbe475-48f9-4ba3-82bd-b36bcd939168\") " pod="openshift-controller-manager/controller-manager-879f6c89f-szbwm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.797535 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqhd4\" (UniqueName: \"kubernetes.io/projected/6adbe475-48f9-4ba3-82bd-b36bcd939168-kube-api-access-zqhd4\") pod \"controller-manager-879f6c89f-szbwm\" (UID: \"6adbe475-48f9-4ba3-82bd-b36bcd939168\") " pod="openshift-controller-manager/controller-manager-879f6c89f-szbwm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.797559 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f8161f87-3814-4d02-84ff-b94b8b05c59e-audit-policies\") pod \"oauth-openshift-558db77b4-mrzbm\" (UID: 
\"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.797582 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljp6k\" (UniqueName: \"kubernetes.io/projected/68c1fde1-72ce-4ce0-ade8-9c8e7016464c-kube-api-access-ljp6k\") pod \"machine-config-operator-74547568cd-8t82g\" (UID: \"68c1fde1-72ce-4ce0-ade8-9c8e7016464c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8t82g" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.797626 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/68c1fde1-72ce-4ce0-ade8-9c8e7016464c-proxy-tls\") pod \"machine-config-operator-74547568cd-8t82g\" (UID: \"68c1fde1-72ce-4ce0-ade8-9c8e7016464c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8t82g" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.797655 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d962f6fe-d955-483d-b149-976a11dd4922-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-4k8pc\" (UID: \"d962f6fe-d955-483d-b149-976a11dd4922\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4k8pc" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.797681 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/225d1d1d-8168-4489-af91-6a87f28c39ed-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-gqbxv\" (UID: \"225d1d1d-8168-4489-af91-6a87f28c39ed\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gqbxv" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.797711 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/cba11394-4e55-4edc-beec-750bddabc1d0-etcd-client\") pod \"apiserver-7bbb656c7d-vp5tl\" (UID: \"cba11394-4e55-4edc-beec-750bddabc1d0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vp5tl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.797750 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2cdt5\" (UniqueName: \"kubernetes.io/projected/cba11394-4e55-4edc-beec-750bddabc1d0-kube-api-access-2cdt5\") pod \"apiserver-7bbb656c7d-vp5tl\" (UID: \"cba11394-4e55-4edc-beec-750bddabc1d0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vp5tl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.797780 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-mrzbm\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.797808 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/225d1d1d-8168-4489-af91-6a87f28c39ed-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-gqbxv\" (UID: \"225d1d1d-8168-4489-af91-6a87f28c39ed\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gqbxv" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.797842 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-mrzbm\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.797873 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/68c1fde1-72ce-4ce0-ade8-9c8e7016464c-images\") pod \"machine-config-operator-74547568cd-8t82g\" (UID: \"68c1fde1-72ce-4ce0-ade8-9c8e7016464c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8t82g" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.797908 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-945rx\" (UniqueName: \"kubernetes.io/projected/8da2028c-f296-4f44-b010-b3abec9f6b98-kube-api-access-945rx\") pod \"route-controller-manager-6576b87f9c-758rq\" (UID: \"8da2028c-f296-4f44-b010-b3abec9f6b98\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-758rq" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.797935 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-mrzbm\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.797969 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-mrzbm\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.798000 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/77ff4d6a-8c1e-440f-a78c-900c09587848-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-wndb7\" (UID: \"77ff4d6a-8c1e-440f-a78c-900c09587848\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-wndb7" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.798026 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/225d1d1d-8168-4489-af91-6a87f28c39ed-config\") pod \"kube-controller-manager-operator-78b949d7b-gqbxv\" (UID: \"225d1d1d-8168-4489-af91-6a87f28c39ed\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gqbxv" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.798053 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cba11394-4e55-4edc-beec-750bddabc1d0-audit-dir\") pod \"apiserver-7bbb656c7d-vp5tl\" (UID: \"cba11394-4e55-4edc-beec-750bddabc1d0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vp5tl" Feb 20 
09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.798068 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6adbe475-48f9-4ba3-82bd-b36bcd939168-client-ca\") pod \"controller-manager-879f6c89f-szbwm\" (UID: \"6adbe475-48f9-4ba3-82bd-b36bcd939168\") " pod="openshift-controller-manager/controller-manager-879f6c89f-szbwm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.798078 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6adbe475-48f9-4ba3-82bd-b36bcd939168-config\") pod \"controller-manager-879f6c89f-szbwm\" (UID: \"6adbe475-48f9-4ba3-82bd-b36bcd939168\") " pod="openshift-controller-manager/controller-manager-879f6c89f-szbwm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.798359 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/cba11394-4e55-4edc-beec-750bddabc1d0-audit-policies\") pod \"apiserver-7bbb656c7d-vp5tl\" (UID: \"cba11394-4e55-4edc-beec-750bddabc1d0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vp5tl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.798391 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-mrzbm\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.798443 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/d962f6fe-d955-483d-b149-976a11dd4922-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-4k8pc\" (UID: \"d962f6fe-d955-483d-b149-976a11dd4922\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4k8pc" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.798468 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljckj\" (UniqueName: \"kubernetes.io/projected/b55a13cf-03c6-46d9-b286-960a839b1558-kube-api-access-ljckj\") pod \"cluster-samples-operator-665b6dd947-8jt7t\" (UID: \"b55a13cf-03c6-46d9-b286-960a839b1558\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8jt7t" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.798492 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/68c1fde1-72ce-4ce0-ade8-9c8e7016464c-auth-proxy-config\") pod \"machine-config-operator-74547568cd-8t82g\" (UID: \"68c1fde1-72ce-4ce0-ade8-9c8e7016464c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8t82g" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.798530 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x98nf\" (UniqueName: \"kubernetes.io/projected/7ef2d9f9-34f2-48a6-83eb-689c0fdcac66-kube-api-access-x98nf\") pod \"downloads-7954f5f757-tv8j9\" (UID: \"7ef2d9f9-34f2-48a6-83eb-689c0fdcac66\") " pod="openshift-console/downloads-7954f5f757-tv8j9" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.799316 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-mrzbm\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.799331 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gqbxv"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.799316 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-mrzbm\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.799456 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/cba11394-4e55-4edc-beec-750bddabc1d0-audit-policies\") pod \"apiserver-7bbb656c7d-vp5tl\" (UID: \"cba11394-4e55-4edc-beec-750bddabc1d0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vp5tl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.799791 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6adbe475-48f9-4ba3-82bd-b36bcd939168-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-szbwm\" (UID: \"6adbe475-48f9-4ba3-82bd-b36bcd939168\") " pod="openshift-controller-manager/controller-manager-879f6c89f-szbwm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.800336 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-ckmh2"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.800429 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8da2028c-f296-4f44-b010-b3abec9f6b98-client-ca\") pod \"route-controller-manager-6576b87f9c-758rq\" (UID: \"8da2028c-f296-4f44-b010-b3abec9f6b98\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-758rq" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.800681 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6adbe475-48f9-4ba3-82bd-b36bcd939168-config\") pod \"controller-manager-879f6c89f-szbwm\" (UID: \"6adbe475-48f9-4ba3-82bd-b36bcd939168\") " pod="openshift-controller-manager/controller-manager-879f6c89f-szbwm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.801080 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cba11394-4e55-4edc-beec-750bddabc1d0-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-vp5tl\" (UID: \"cba11394-4e55-4edc-beec-750bddabc1d0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vp5tl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.801166 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f8161f87-3814-4d02-84ff-b94b8b05c59e-audit-policies\") pod \"oauth-openshift-558db77b4-mrzbm\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.800353 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-mrzbm\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.801448 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dce6ddda-3fcf-40bd-a085-a09f0bb811bf-config\") pod \"console-operator-58897d9998-k85np\" (UID: \"dce6ddda-3fcf-40bd-a085-a09f0bb811bf\") " pod="openshift-console-operator/console-operator-58897d9998-k85np" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.801540 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d962f6fe-d955-483d-b149-976a11dd4922-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-4k8pc\" (UID: \"d962f6fe-d955-483d-b149-976a11dd4922\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4k8pc" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.801805 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-k85np"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.802766 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cba11394-4e55-4edc-beec-750bddabc1d0-audit-dir\") pod \"apiserver-7bbb656c7d-vp5tl\" (UID: \"cba11394-4e55-4edc-beec-750bddabc1d0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vp5tl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.803744 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8da2028c-f296-4f44-b010-b3abec9f6b98-config\") pod \"route-controller-manager-6576b87f9c-758rq\" (UID: \"8da2028c-f296-4f44-b010-b3abec9f6b98\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-758rq" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.803786 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/cba11394-4e55-4edc-beec-750bddabc1d0-encryption-config\") pod \"apiserver-7bbb656c7d-vp5tl\" (UID: \"cba11394-4e55-4edc-beec-750bddabc1d0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vp5tl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.803761 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8da2028c-f296-4f44-b010-b3abec9f6b98-serving-cert\") pod \"route-controller-manager-6576b87f9c-758rq\" (UID: \"8da2028c-f296-4f44-b010-b3abec9f6b98\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-758rq" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.803940 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f8161f87-3814-4d02-84ff-b94b8b05c59e-audit-dir\") pod \"oauth-openshift-558db77b4-mrzbm\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" Feb 20 09:57:27 crc 
kubenswrapper[4962]: I0220 09:57:27.804138 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4k8pc"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.804159 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dce6ddda-3fcf-40bd-a085-a09f0bb811bf-serving-cert\") pod \"console-operator-58897d9998-k85np\" (UID: \"dce6ddda-3fcf-40bd-a085-a09f0bb811bf\") " pod="openshift-console-operator/console-operator-58897d9998-k85np" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.804332 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cba11394-4e55-4edc-beec-750bddabc1d0-serving-cert\") pod \"apiserver-7bbb656c7d-vp5tl\" (UID: \"cba11394-4e55-4edc-beec-750bddabc1d0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vp5tl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.804114 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/cba11394-4e55-4edc-beec-750bddabc1d0-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-vp5tl\" (UID: \"cba11394-4e55-4edc-beec-750bddabc1d0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vp5tl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.804477 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/68c1fde1-72ce-4ce0-ade8-9c8e7016464c-images\") pod \"machine-config-operator-74547568cd-8t82g\" (UID: \"68c1fde1-72ce-4ce0-ade8-9c8e7016464c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8t82g" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.805337 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/225d1d1d-8168-4489-af91-6a87f28c39ed-config\") pod \"kube-controller-manager-operator-78b949d7b-gqbxv\" (UID: \"225d1d1d-8168-4489-af91-6a87f28c39ed\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gqbxv" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.805534 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dce6ddda-3fcf-40bd-a085-a09f0bb811bf-trusted-ca\") pod \"console-operator-58897d9998-k85np\" (UID: \"dce6ddda-3fcf-40bd-a085-a09f0bb811bf\") " pod="openshift-console-operator/console-operator-58897d9998-k85np" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.805983 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/b55a13cf-03c6-46d9-b286-960a839b1558-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-8jt7t\" (UID: \"b55a13cf-03c6-46d9-b286-960a839b1558\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8jt7t" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.805997 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-5lf26"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.806067 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6adbe475-48f9-4ba3-82bd-b36bcd939168-serving-cert\") pod \"controller-manager-879f6c89f-szbwm\" (UID: 
\"6adbe475-48f9-4ba3-82bd-b36bcd939168\") " pod="openshift-controller-manager/controller-manager-879f6c89f-szbwm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.806218 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-mrzbm\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.806316 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/34f42578-fcc9-4539-add3-bca8deb6927b-metrics-tls\") pod \"dns-operator-744455d44c-5lf26\" (UID: \"34f42578-fcc9-4539-add3-bca8deb6927b\") " pod="openshift-dns-operator/dns-operator-744455d44c-5lf26" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.806298 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/cba11394-4e55-4edc-beec-750bddabc1d0-etcd-client\") pod \"apiserver-7bbb656c7d-vp5tl\" (UID: \"cba11394-4e55-4edc-beec-750bddabc1d0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vp5tl" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.806759 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/68c1fde1-72ce-4ce0-ade8-9c8e7016464c-auth-proxy-config\") pod \"machine-config-operator-74547568cd-8t82g\" (UID: \"68c1fde1-72ce-4ce0-ade8-9c8e7016464c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8t82g" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.807030 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-mrzbm\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.807038 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-mrzbm\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.807171 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-mrzbm\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.808356 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-xkmtn"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.808487 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.808741 4962 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-mrzbm\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.809496 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/225d1d1d-8168-4489-af91-6a87f28c39ed-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-gqbxv\" (UID: \"225d1d1d-8168-4489-af91-6a87f28c39ed\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gqbxv" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.810264 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-mrzbm\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.811048 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6dwbb"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.813433 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-5nqkf"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.813640 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/68c1fde1-72ce-4ce0-ade8-9c8e7016464c-proxy-tls\") pod \"machine-config-operator-74547568cd-8t82g\" (UID: \"68c1fde1-72ce-4ce0-ade8-9c8e7016464c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8t82g" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.814307 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-mrzbm\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.814412 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/d962f6fe-d955-483d-b149-976a11dd4922-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-4k8pc\" (UID: \"d962f6fe-d955-483d-b149-976a11dd4922\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4k8pc" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.815947 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-wd68v"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.817099 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rsf8j"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.818218 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-8pks8"] Feb 20 09:57:27 crc 
kubenswrapper[4962]: I0220 09:57:27.818910 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-mrzbm\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.819253 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bmp44"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.820529 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mxcd2"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.821684 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hc9h5"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.822737 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-m7z5r"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.823814 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-7nh4t"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.825063 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-lbvml"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.875283 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.875548 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.875367 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hqbh2"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.878290 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g6nc2"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.879498 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xxdr9"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.883291 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-wndb7"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.884950 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.886088 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-7q8sx"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.888065 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-965lm"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.889245 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.890839 4962 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526345-4v6dw"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.893698 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-v2nvr"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.897100 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-4mw9f"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.898745 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-7tj4j"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.899653 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-7tj4j" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.900842 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-7tj4j"] Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.909132 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.928341 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.948351 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.969319 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 20 09:57:27 crc kubenswrapper[4962]: I0220 09:57:27.989379 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.001869 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/75c3ba8d-4548-4407-9188-a785ef05da2c-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-hc9h5\" (UID: \"75c3ba8d-4548-4407-9188-a785ef05da2c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hc9h5" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.008818 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.028702 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.048354 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.069365 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.089212 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.098002 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: 
\"kubernetes.io/secret/77ff4d6a-8c1e-440f-a78c-900c09587848-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-wndb7\" (UID: \"77ff4d6a-8c1e-440f-a78c-900c09587848\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-wndb7" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.109544 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.129562 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.168165 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.188733 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.208737 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.262883 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79w4l\" (UniqueName: \"kubernetes.io/projected/37e7b911-da73-4f82-ad0c-d8707547b7a7-kube-api-access-79w4l\") pod \"apiserver-76f77b778f-jtftl\" (UID: \"37e7b911-da73-4f82-ad0c-d8707547b7a7\") " pod="openshift-apiserver/apiserver-76f77b778f-jtftl" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.269861 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.288260 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.309305 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.329309 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.348997 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.369105 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.389329 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.409358 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.429436 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.448811 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" 
Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.469189 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.489293 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.496666 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-jtftl" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.510387 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.529452 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.548963 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.569138 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.589936 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.614985 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.629220 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.649894 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.669050 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.689131 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.710524 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.730077 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.742878 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-jtftl"] Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.747580 4962 request.go:700] Waited for 1.015016016s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/secrets?fieldSelector=metadata.name%3Dpackage-server-manager-serving-cert&limit=500&resourceVersion=0 Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.749549 4962 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.768815 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.789794 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.810165 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.830379 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.850160 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.869523 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.888632 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.909608 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.914358 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-jtftl" event={"ID":"37e7b911-da73-4f82-ad0c-d8707547b7a7","Type":"ContainerStarted","Data":"6f9b750267fbada324f415b4eab8ccc588bfee1c71de79cbab7087db44b8d785"} Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.939694 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.948312 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.969509 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 20 09:57:28 crc kubenswrapper[4962]: I0220 09:57:28.989819 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.009509 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.028652 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.049065 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.070465 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.090417 4962 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.109778 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.128565 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.149715 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.169194 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.190005 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.209332 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.228667 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.250890 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.270257 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.289784 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.309919 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.329388 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.349003 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.369487 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.389101 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.411100 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.428724 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.449842 4962 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.469483 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 20 
09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.522319 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smr5m\" (UniqueName: \"kubernetes.io/projected/d962f6fe-d955-483d-b149-976a11dd4922-kube-api-access-smr5m\") pod \"cluster-image-registry-operator-dc59b4c8b-4k8pc\" (UID: \"d962f6fe-d955-483d-b149-976a11dd4922\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4k8pc" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.526082 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d962f6fe-d955-483d-b149-976a11dd4922-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-4k8pc\" (UID: \"d962f6fe-d955-483d-b149-976a11dd4922\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4k8pc" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.529836 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.564495 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfswd\" (UniqueName: \"kubernetes.io/projected/474b1e5d-9a6f-4931-be66-8fb20c82ac60-kube-api-access-nfswd\") pod \"migrator-59844c95c7-dc74p\" (UID: \"474b1e5d-9a6f-4931-be66-8fb20c82ac60\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dc74p" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.582086 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4k8pc" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.588947 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x98nf\" (UniqueName: \"kubernetes.io/projected/7ef2d9f9-34f2-48a6-83eb-689c0fdcac66-kube-api-access-x98nf\") pod \"downloads-7954f5f757-tv8j9\" (UID: \"7ef2d9f9-34f2-48a6-83eb-689c0fdcac66\") " pod="openshift-console/downloads-7954f5f757-tv8j9" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.599189 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-tv8j9" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.619648 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/225d1d1d-8168-4489-af91-6a87f28c39ed-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-gqbxv\" (UID: \"225d1d1d-8168-4489-af91-6a87f28c39ed\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gqbxv" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.623700 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s98r7\" (UniqueName: \"kubernetes.io/projected/34f42578-fcc9-4539-add3-bca8deb6927b-kube-api-access-s98r7\") pod \"dns-operator-744455d44c-5lf26\" (UID: \"34f42578-fcc9-4539-add3-bca8deb6927b\") " pod="openshift-dns-operator/dns-operator-744455d44c-5lf26" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.655540 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqhd4\" (UniqueName: \"kubernetes.io/projected/6adbe475-48f9-4ba3-82bd-b36bcd939168-kube-api-access-zqhd4\") pod \"controller-manager-879f6c89f-szbwm\" (UID: \"6adbe475-48f9-4ba3-82bd-b36bcd939168\") " pod="openshift-controller-manager/controller-manager-879f6c89f-szbwm" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.681062 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrg46\" (UniqueName: \"kubernetes.io/projected/75c3ba8d-4548-4407-9188-a785ef05da2c-kube-api-access-lrg46\") pod \"control-plane-machine-set-operator-78cbb6b69f-hc9h5\" (UID: \"75c3ba8d-4548-4407-9188-a785ef05da2c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hc9h5" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.683229 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-szbwm" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.688864 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljp6k\" (UniqueName: \"kubernetes.io/projected/68c1fde1-72ce-4ce0-ade8-9c8e7016464c-kube-api-access-ljp6k\") pod \"machine-config-operator-74547568cd-8t82g\" (UID: \"68c1fde1-72ce-4ce0-ade8-9c8e7016464c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8t82g" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.690528 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-5lf26" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.699620 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gqbxv" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.708845 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dc74p" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.710926 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-945rx\" (UniqueName: \"kubernetes.io/projected/8da2028c-f296-4f44-b010-b3abec9f6b98-kube-api-access-945rx\") pod \"route-controller-manager-6576b87f9c-758rq\" (UID: \"8da2028c-f296-4f44-b010-b3abec9f6b98\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-758rq" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.714993 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8t82g" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.734568 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-758rq" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.735801 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hc9h5" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.745792 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2c9lr\" (UniqueName: \"kubernetes.io/projected/f8161f87-3814-4d02-84ff-b94b8b05c59e-kube-api-access-2c9lr\") pod \"oauth-openshift-558db77b4-mrzbm\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.750410 4962 request.go:700] Waited for 1.948759718s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-multus/serviceaccounts/multus-ac/token Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.763419 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dpx6\" (UniqueName: \"kubernetes.io/projected/dce6ddda-3fcf-40bd-a085-a09f0bb811bf-kube-api-access-8dpx6\") pod \"console-operator-58897d9998-k85np\" (UID: \"dce6ddda-3fcf-40bd-a085-a09f0bb811bf\") " pod="openshift-console-operator/console-operator-58897d9998-k85np" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.769418 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5sb4\" (UniqueName: \"kubernetes.io/projected/77ff4d6a-8c1e-440f-a78c-900c09587848-kube-api-access-s5sb4\") pod \"multus-admission-controller-857f4d67dd-wndb7\" (UID: \"77ff4d6a-8c1e-440f-a78c-900c09587848\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-wndb7" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.800400 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljckj\" (UniqueName: \"kubernetes.io/projected/b55a13cf-03c6-46d9-b286-960a839b1558-kube-api-access-ljckj\") pod \"cluster-samples-operator-665b6dd947-8jt7t\" (UID: \"b55a13cf-03c6-46d9-b286-960a839b1558\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8jt7t" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.806315 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cdt5\" (UniqueName: \"kubernetes.io/projected/cba11394-4e55-4edc-beec-750bddabc1d0-kube-api-access-2cdt5\") pod \"apiserver-7bbb656c7d-vp5tl\" (UID: \"cba11394-4e55-4edc-beec-750bddabc1d0\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vp5tl" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.809475 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.828461 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.845463 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-k85np" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.858922 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.859007 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.866832 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8jt7t" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.885689 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.889521 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.910812 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4k8pc"] Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.928385 4962 generic.go:334] "Generic (PLEG): container finished" podID="37e7b911-da73-4f82-ad0c-d8707547b7a7" containerID="9350a8e12c71e3a008abd7f495bb1ba136c90c080869abe3047b6b06cdcbfe9a" exitCode=0 Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.928561 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-jtftl" event={"ID":"37e7b911-da73-4f82-ad0c-d8707547b7a7","Type":"ContainerDied","Data":"9350a8e12c71e3a008abd7f495bb1ba136c90c080869abe3047b6b06cdcbfe9a"} Feb 20 09:57:29 crc kubenswrapper[4962]: W0220 09:57:29.950032 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd962f6fe_d955_483d_b149_976a11dd4922.slice/crio-9b0c121e4cc8abab256f58ac54535860c0c99b918fb4ea374e13156c3f11b3ae WatchSource:0}: Error finding container 9b0c121e4cc8abab256f58ac54535860c0c99b918fb4ea374e13156c3f11b3ae: Status 404 returned error can't find the container with id 9b0c121e4cc8abab256f58ac54535860c0c99b918fb4ea374e13156c3f11b3ae Feb 20 09:57:29 crc kubenswrapper[4962]: I0220 09:57:29.961995 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vp5tl" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.034462 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/09cfdba9-bfda-455d-b13e-58a6ea5a7d5a-console-serving-cert\") pod \"console-f9d7485db-nwfk6\" (UID: \"09cfdba9-bfda-455d-b13e-58a6ea5a7d5a\") " pod="openshift-console/console-f9d7485db-nwfk6" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.034565 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/09cfdba9-bfda-455d-b13e-58a6ea5a7d5a-oauth-serving-cert\") pod \"console-f9d7485db-nwfk6\" (UID: \"09cfdba9-bfda-455d-b13e-58a6ea5a7d5a\") " pod="openshift-console/console-f9d7485db-nwfk6" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.034585 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b4ad1819-20e1-406b-8499-5a73780c0a0c-registry-certificates\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.034631 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac7d606a-36a8-4608-918c-ed88eaf93a6d-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-hqbh2\" (UID: \"ac7d606a-36a8-4608-918c-ed88eaf93a6d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hqbh2" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.034647 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/1d0fd4e8-ba15-4d2f-9602-e887819ea423-machine-approver-tls\") pod \"machine-approver-56656f9798-gqpsl\" (UID: \"1d0fd4e8-ba15-4d2f-9602-e887819ea423\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gqpsl" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.034664 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b4ad1819-20e1-406b-8499-5a73780c0a0c-bound-sa-token\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.034685 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmq2c\" (UniqueName: \"kubernetes.io/projected/c3febff6-f15f-4ce8-825c-37d86b13c56d-kube-api-access-pmq2c\") pod \"ingress-operator-5b745b69d9-5nqkf\" (UID: \"c3febff6-f15f-4ce8-825c-37d86b13c56d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5nqkf" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.034838 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmtxm\" (UniqueName: \"kubernetes.io/projected/b4ad1819-20e1-406b-8499-5a73780c0a0c-kube-api-access-gmtxm\") pod \"image-registry-697d97f7c8-8pks8\" (UID: 
\"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.034865 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phch9\" (UniqueName: \"kubernetes.io/projected/cb32a62f-c8c5-40a6-9c7f-e456c68bf7c1-kube-api-access-phch9\") pod \"openshift-controller-manager-operator-756b6f6bc6-bmp44\" (UID: \"cb32a62f-c8c5-40a6-9c7f-e456c68bf7c1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bmp44" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.034967 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5fb2\" (UniqueName: \"kubernetes.io/projected/09cfdba9-bfda-455d-b13e-58a6ea5a7d5a-kube-api-access-h5fb2\") pod \"console-f9d7485db-nwfk6\" (UID: \"09cfdba9-bfda-455d-b13e-58a6ea5a7d5a\") " pod="openshift-console/console-f9d7485db-nwfk6" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.035314 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cb32a62f-c8c5-40a6-9c7f-e456c68bf7c1-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-bmp44\" (UID: \"cb32a62f-c8c5-40a6-9c7f-e456c68bf7c1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bmp44" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.035713 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/09cfdba9-bfda-455d-b13e-58a6ea5a7d5a-console-oauth-config\") pod \"console-f9d7485db-nwfk6\" (UID: \"09cfdba9-bfda-455d-b13e-58a6ea5a7d5a\") " pod="openshift-console/console-f9d7485db-nwfk6" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.035869 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b4ad1819-20e1-406b-8499-5a73780c0a0c-trusted-ca\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.036829 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8g2cv\" (UniqueName: \"kubernetes.io/projected/9e86193f-b3bb-42a8-bccb-00e0cbcbf432-kube-api-access-8g2cv\") pod \"authentication-operator-69f744f599-tp9zq\" (UID: \"9e86193f-b3bb-42a8-bccb-00e0cbcbf432\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tp9zq" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.036869 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qxqx\" (UniqueName: \"kubernetes.io/projected/0e4e18be-a43b-492a-981e-b4f9aebff1ab-kube-api-access-7qxqx\") pod \"openshift-config-operator-7777fb866f-xkmtn\" (UID: \"0e4e18be-a43b-492a-981e-b4f9aebff1ab\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xkmtn" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.036886 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/09cfdba9-bfda-455d-b13e-58a6ea5a7d5a-service-ca\") pod 
\"console-f9d7485db-nwfk6\" (UID: \"09cfdba9-bfda-455d-b13e-58a6ea5a7d5a\") " pod="openshift-console/console-f9d7485db-nwfk6" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.036949 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac7d606a-36a8-4608-918c-ed88eaf93a6d-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-hqbh2\" (UID: \"ac7d606a-36a8-4608-918c-ed88eaf93a6d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hqbh2" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.036967 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9e86193f-b3bb-42a8-bccb-00e0cbcbf432-serving-cert\") pod \"authentication-operator-69f744f599-tp9zq\" (UID: \"9e86193f-b3bb-42a8-bccb-00e0cbcbf432\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tp9zq" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.037010 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b4ad1819-20e1-406b-8499-5a73780c0a0c-installation-pull-secrets\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.037027 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb32a62f-c8c5-40a6-9c7f-e456c68bf7c1-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-bmp44\" (UID: \"cb32a62f-c8c5-40a6-9c7f-e456c68bf7c1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bmp44" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.037042 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09cfdba9-bfda-455d-b13e-58a6ea5a7d5a-trusted-ca-bundle\") pod \"console-f9d7485db-nwfk6\" (UID: \"09cfdba9-bfda-455d-b13e-58a6ea5a7d5a\") " pod="openshift-console/console-f9d7485db-nwfk6" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.037095 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1d0fd4e8-ba15-4d2f-9602-e887819ea423-auth-proxy-config\") pod \"machine-approver-56656f9798-gqpsl\" (UID: \"1d0fd4e8-ba15-4d2f-9602-e887819ea423\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gqpsl" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.037113 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e60f8ca8-5b2f-4b5c-930f-19caf45014ba-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-txw2z\" (UID: \"e60f8ca8-5b2f-4b5c-930f-19caf45014ba\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-txw2z" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.037641 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sh2t2\" (UniqueName: 
\"kubernetes.io/projected/e60f8ca8-5b2f-4b5c-930f-19caf45014ba-kube-api-access-sh2t2\") pod \"openshift-apiserver-operator-796bbdcf4f-txw2z\" (UID: \"e60f8ca8-5b2f-4b5c-930f-19caf45014ba\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-txw2z" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.037851 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/09cfdba9-bfda-455d-b13e-58a6ea5a7d5a-console-config\") pod \"console-f9d7485db-nwfk6\" (UID: \"09cfdba9-bfda-455d-b13e-58a6ea5a7d5a\") " pod="openshift-console/console-f9d7485db-nwfk6" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.037998 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b4ad1819-20e1-406b-8499-5a73780c0a0c-registry-tls\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.038031 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c3febff6-f15f-4ce8-825c-37d86b13c56d-trusted-ca\") pod \"ingress-operator-5b745b69d9-5nqkf\" (UID: \"c3febff6-f15f-4ce8-825c-37d86b13c56d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5nqkf" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.038091 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e4e18be-a43b-492a-981e-b4f9aebff1ab-serving-cert\") pod \"openshift-config-operator-7777fb866f-xkmtn\" (UID: \"0e4e18be-a43b-492a-981e-b4f9aebff1ab\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xkmtn" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.038136 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e60f8ca8-5b2f-4b5c-930f-19caf45014ba-config\") pod \"openshift-apiserver-operator-796bbdcf4f-txw2z\" (UID: \"e60f8ca8-5b2f-4b5c-930f-19caf45014ba\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-txw2z" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.038154 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d0fd4e8-ba15-4d2f-9602-e887819ea423-config\") pod \"machine-approver-56656f9798-gqpsl\" (UID: \"1d0fd4e8-ba15-4d2f-9602-e887819ea423\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gqpsl" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.038183 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6nlc\" (UniqueName: \"kubernetes.io/projected/1d0fd4e8-ba15-4d2f-9602-e887819ea423-kube-api-access-c6nlc\") pod \"machine-approver-56656f9798-gqpsl\" (UID: \"1d0fd4e8-ba15-4d2f-9602-e887819ea423\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gqpsl" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.038216 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/9e86193f-b3bb-42a8-bccb-00e0cbcbf432-config\") pod \"authentication-operator-69f744f599-tp9zq\" (UID: \"9e86193f-b3bb-42a8-bccb-00e0cbcbf432\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tp9zq" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.038232 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/0e4e18be-a43b-492a-981e-b4f9aebff1ab-available-featuregates\") pod \"openshift-config-operator-7777fb866f-xkmtn\" (UID: \"0e4e18be-a43b-492a-981e-b4f9aebff1ab\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xkmtn" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.038266 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgr4s\" (UniqueName: \"kubernetes.io/projected/ac7d606a-36a8-4608-918c-ed88eaf93a6d-kube-api-access-lgr4s\") pod \"kube-storage-version-migrator-operator-b67b599dd-hqbh2\" (UID: \"ac7d606a-36a8-4608-918c-ed88eaf93a6d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hqbh2" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.038284 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b4ad1819-20e1-406b-8499-5a73780c0a0c-ca-trust-extracted\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.038300 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c3febff6-f15f-4ce8-825c-37d86b13c56d-bound-sa-token\") pod \"ingress-operator-5b745b69d9-5nqkf\" (UID: \"c3febff6-f15f-4ce8-825c-37d86b13c56d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5nqkf" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.038318 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e86193f-b3bb-42a8-bccb-00e0cbcbf432-service-ca-bundle\") pod \"authentication-operator-69f744f599-tp9zq\" (UID: \"9e86193f-b3bb-42a8-bccb-00e0cbcbf432\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tp9zq" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.038332 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e86193f-b3bb-42a8-bccb-00e0cbcbf432-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-tp9zq\" (UID: \"9e86193f-b3bb-42a8-bccb-00e0cbcbf432\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tp9zq" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.038380 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.038412 4962 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c3febff6-f15f-4ce8-825c-37d86b13c56d-metrics-tls\") pod \"ingress-operator-5b745b69d9-5nqkf\" (UID: \"c3febff6-f15f-4ce8-825c-37d86b13c56d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5nqkf" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.044816 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-wndb7" Feb 20 09:57:30 crc kubenswrapper[4962]: E0220 09:57:30.051643 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 09:57:30.551551288 +0000 UTC m=+142.134023324 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.147454 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:57:30 crc kubenswrapper[4962]: E0220 09:57:30.147719 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:57:30.647689484 +0000 UTC m=+142.230161320 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.148141 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/32025b2b-9232-449f-b7bc-582d81d76430-node-bootstrap-token\") pod \"machine-config-server-l92fq\" (UID: \"32025b2b-9232-449f-b7bc-582d81d76430\") " pod="openshift-machine-config-operator/machine-config-server-l92fq" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.148211 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8g2cv\" (UniqueName: \"kubernetes.io/projected/9e86193f-b3bb-42a8-bccb-00e0cbcbf432-kube-api-access-8g2cv\") pod \"authentication-operator-69f744f599-tp9zq\" (UID: \"9e86193f-b3bb-42a8-bccb-00e0cbcbf432\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tp9zq" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.148263 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qxqx\" (UniqueName: \"kubernetes.io/projected/0e4e18be-a43b-492a-981e-b4f9aebff1ab-kube-api-access-7qxqx\") pod \"openshift-config-operator-7777fb866f-xkmtn\" (UID: \"0e4e18be-a43b-492a-981e-b4f9aebff1ab\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xkmtn" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.148319 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7c85c4ba-4bcb-4449-bd63-320f2ff6a116-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-xxdr9\" (UID: \"7c85c4ba-4bcb-4449-bd63-320f2ff6a116\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xxdr9" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.148351 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac7d606a-36a8-4608-918c-ed88eaf93a6d-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-hqbh2\" (UID: \"ac7d606a-36a8-4608-918c-ed88eaf93a6d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hqbh2" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.148373 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/674f40ed-74ed-48c2-8036-087ce9e16c94-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6dwbb\" (UID: \"674f40ed-74ed-48c2-8036-087ce9e16c94\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6dwbb" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.148389 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hplfv\" (UniqueName: \"kubernetes.io/projected/319cf696-9a12-40dc-9f4a-d80fab9a97f8-kube-api-access-hplfv\") pod \"ingress-canary-7tj4j\" (UID: \"319cf696-9a12-40dc-9f4a-d80fab9a97f8\") " 
pod="openshift-ingress-canary/ingress-canary-7tj4j" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.148468 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b4ad1819-20e1-406b-8499-5a73780c0a0c-installation-pull-secrets\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.148487 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9qbw\" (UniqueName: \"kubernetes.io/projected/5df91f4a-70e8-4036-8ab1-d917af6c8aa4-kube-api-access-s9qbw\") pod \"packageserver-d55dfcdfc-965lm\" (UID: \"5df91f4a-70e8-4036-8ab1-d917af6c8aa4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-965lm" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.148509 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/7e5e4942-63be-4811-8aaa-d6b53a427541-stats-auth\") pod \"router-default-5444994796-tcwqj\" (UID: \"7e5e4942-63be-4811-8aaa-d6b53a427541\") " pod="openshift-ingress/router-default-5444994796-tcwqj" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.148542 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/09cfdba9-bfda-455d-b13e-58a6ea5a7d5a-console-config\") pod \"console-f9d7485db-nwfk6\" (UID: \"09cfdba9-bfda-455d-b13e-58a6ea5a7d5a\") " pod="openshift-console/console-f9d7485db-nwfk6" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.148558 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/28c85fa3-b7b5-4fbc-9afb-fa3eb6b8a75b-etcd-ca\") pod \"etcd-operator-b45778765-lbvml\" (UID: \"28c85fa3-b7b5-4fbc-9afb-fa3eb6b8a75b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lbvml" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.148578 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrw8j\" (UniqueName: \"kubernetes.io/projected/8cb06d17-6188-4cca-84b7-f3d03abb20e8-kube-api-access-hrw8j\") pod \"csi-hostpathplugin-7nh4t\" (UID: \"8cb06d17-6188-4cca-84b7-f3d03abb20e8\") " pod="hostpath-provisioner/csi-hostpathplugin-7nh4t" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.148614 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a3652dbd-dae4-462b-be88-b8a782de8a1c-config-volume\") pod \"collect-profiles-29526345-4v6dw\" (UID: \"a3652dbd-dae4-462b-be88-b8a782de8a1c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526345-4v6dw" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.148630 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/28c85fa3-b7b5-4fbc-9afb-fa3eb6b8a75b-etcd-client\") pod \"etcd-operator-b45778765-lbvml\" (UID: \"28c85fa3-b7b5-4fbc-9afb-fa3eb6b8a75b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lbvml" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.148663 4962 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c3febff6-f15f-4ce8-825c-37d86b13c56d-trusted-ca\") pod \"ingress-operator-5b745b69d9-5nqkf\" (UID: \"c3febff6-f15f-4ce8-825c-37d86b13c56d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5nqkf" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.148682 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/674f40ed-74ed-48c2-8036-087ce9e16c94-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6dwbb\" (UID: \"674f40ed-74ed-48c2-8036-087ce9e16c94\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6dwbb" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.148712 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqfmw\" (UniqueName: \"kubernetes.io/projected/28c85fa3-b7b5-4fbc-9afb-fa3eb6b8a75b-kube-api-access-zqfmw\") pod \"etcd-operator-b45778765-lbvml\" (UID: \"28c85fa3-b7b5-4fbc-9afb-fa3eb6b8a75b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lbvml" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.148745 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e4e18be-a43b-492a-981e-b4f9aebff1ab-serving-cert\") pod \"openshift-config-operator-7777fb866f-xkmtn\" (UID: \"0e4e18be-a43b-492a-981e-b4f9aebff1ab\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xkmtn" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.148762 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/8cb06d17-6188-4cca-84b7-f3d03abb20e8-mountpoint-dir\") pod \"csi-hostpathplugin-7nh4t\" (UID: \"8cb06d17-6188-4cca-84b7-f3d03abb20e8\") " pod="hostpath-provisioner/csi-hostpathplugin-7nh4t" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.148794 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d0fd4e8-ba15-4d2f-9602-e887819ea423-config\") pod \"machine-approver-56656f9798-gqpsl\" (UID: \"1d0fd4e8-ba15-4d2f-9602-e887819ea423\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gqpsl" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.148811 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e60f8ca8-5b2f-4b5c-930f-19caf45014ba-config\") pod \"openshift-apiserver-operator-796bbdcf4f-txw2z\" (UID: \"e60f8ca8-5b2f-4b5c-930f-19caf45014ba\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-txw2z" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.148837 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b9e0a083-b7e8-4b81-ad1a-03f587f2f46c-config-volume\") pod \"dns-default-4mw9f\" (UID: \"b9e0a083-b7e8-4b81-ad1a-03f587f2f46c\") " pod="openshift-dns/dns-default-4mw9f" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.148854 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/bfc99ac4-00ed-48bf-b95e-fcdd4e6e0800-srv-cert\") pod \"catalog-operator-68c6474976-g6nc2\" (UID: 
\"bfc99ac4-00ed-48bf-b95e-fcdd4e6e0800\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g6nc2" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.148988 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6nlc\" (UniqueName: \"kubernetes.io/projected/1d0fd4e8-ba15-4d2f-9602-e887819ea423-kube-api-access-c6nlc\") pod \"machine-approver-56656f9798-gqpsl\" (UID: \"1d0fd4e8-ba15-4d2f-9602-e887819ea423\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gqpsl" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.149011 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/bfd57a5c-0892-46a0-8005-0a8f70c146fd-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-m7z5r\" (UID: \"bfd57a5c-0892-46a0-8005-0a8f70c146fd\") " pod="openshift-marketplace/marketplace-operator-79b997595-m7z5r" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.149028 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c3febff6-f15f-4ce8-825c-37d86b13c56d-bound-sa-token\") pod \"ingress-operator-5b745b69d9-5nqkf\" (UID: \"c3febff6-f15f-4ce8-825c-37d86b13c56d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5nqkf" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.149046 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvc5z\" (UniqueName: \"kubernetes.io/projected/7e5e4942-63be-4811-8aaa-d6b53a427541-kube-api-access-gvc5z\") pod \"router-default-5444994796-tcwqj\" (UID: \"7e5e4942-63be-4811-8aaa-d6b53a427541\") " pod="openshift-ingress/router-default-5444994796-tcwqj" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.149847 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/32025b2b-9232-449f-b7bc-582d81d76430-certs\") pod \"machine-config-server-l92fq\" (UID: \"32025b2b-9232-449f-b7bc-582d81d76430\") " pod="openshift-machine-config-operator/machine-config-server-l92fq" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.149890 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8cb06d17-6188-4cca-84b7-f3d03abb20e8-socket-dir\") pod \"csi-hostpathplugin-7nh4t\" (UID: \"8cb06d17-6188-4cca-84b7-f3d03abb20e8\") " pod="hostpath-provisioner/csi-hostpathplugin-7nh4t" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.149914 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/0240e440-4be2-4607-99c4-636b65e78081-signing-cabundle\") pod \"service-ca-9c57cc56f-7q8sx\" (UID: \"0240e440-4be2-4607-99c4-636b65e78081\") " pod="openshift-service-ca/service-ca-9c57cc56f-7q8sx" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.150486 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/09cfdba9-bfda-455d-b13e-58a6ea5a7d5a-console-config\") pod \"console-f9d7485db-nwfk6\" (UID: \"09cfdba9-bfda-455d-b13e-58a6ea5a7d5a\") " pod="openshift-console/console-f9d7485db-nwfk6" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.150727 4962 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e60f8ca8-5b2f-4b5c-930f-19caf45014ba-config\") pod \"openshift-apiserver-operator-796bbdcf4f-txw2z\" (UID: \"e60f8ca8-5b2f-4b5c-930f-19caf45014ba\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-txw2z" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.151120 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c85c4ba-4bcb-4449-bd63-320f2ff6a116-config\") pod \"kube-apiserver-operator-766d6c64bb-xxdr9\" (UID: \"7c85c4ba-4bcb-4449-bd63-320f2ff6a116\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xxdr9" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.151143 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28c85fa3-b7b5-4fbc-9afb-fa3eb6b8a75b-config\") pod \"etcd-operator-b45778765-lbvml\" (UID: \"28c85fa3-b7b5-4fbc-9afb-fa3eb6b8a75b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lbvml" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.151173 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7e5e4942-63be-4811-8aaa-d6b53a427541-service-ca-bundle\") pod \"router-default-5444994796-tcwqj\" (UID: \"7e5e4942-63be-4811-8aaa-d6b53a427541\") " pod="openshift-ingress/router-default-5444994796-tcwqj" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.151225 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8cb06d17-6188-4cca-84b7-f3d03abb20e8-registration-dir\") pod \"csi-hostpathplugin-7nh4t\" (UID: \"8cb06d17-6188-4cca-84b7-f3d03abb20e8\") " pod="hostpath-provisioner/csi-hostpathplugin-7nh4t" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.151296 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.151385 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3721fc4d-6f04-458e-a74c-0fe816908414-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-wd68v\" (UID: \"3721fc4d-6f04-458e-a74c-0fe816908414\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wd68v" Feb 20 09:57:30 crc kubenswrapper[4962]: E0220 09:57:30.151564 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 09:57:30.651555796 +0000 UTC m=+142.234027642 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.151638 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c3febff6-f15f-4ce8-825c-37d86b13c56d-metrics-tls\") pod \"ingress-operator-5b745b69d9-5nqkf\" (UID: \"c3febff6-f15f-4ce8-825c-37d86b13c56d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5nqkf" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.151680 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/bfc99ac4-00ed-48bf-b95e-fcdd4e6e0800-profile-collector-cert\") pod \"catalog-operator-68c6474976-g6nc2\" (UID: \"bfc99ac4-00ed-48bf-b95e-fcdd4e6e0800\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g6nc2" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.151705 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/0240e440-4be2-4607-99c4-636b65e78081-signing-key\") pod \"service-ca-9c57cc56f-7q8sx\" (UID: \"0240e440-4be2-4607-99c4-636b65e78081\") " pod="openshift-service-ca/service-ca-9c57cc56f-7q8sx" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.151747 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/09cfdba9-bfda-455d-b13e-58a6ea5a7d5a-oauth-serving-cert\") pod \"console-f9d7485db-nwfk6\" (UID: \"09cfdba9-bfda-455d-b13e-58a6ea5a7d5a\") " pod="openshift-console/console-f9d7485db-nwfk6" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.152833 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c3febff6-f15f-4ce8-825c-37d86b13c56d-trusted-ca\") pod \"ingress-operator-5b745b69d9-5nqkf\" (UID: \"c3febff6-f15f-4ce8-825c-37d86b13c56d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5nqkf" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.154030 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d0fd4e8-ba15-4d2f-9602-e887819ea423-config\") pod \"machine-approver-56656f9798-gqpsl\" (UID: \"1d0fd4e8-ba15-4d2f-9602-e887819ea423\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gqpsl" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.154098 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gqbxv"] Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.154267 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/09cfdba9-bfda-455d-b13e-58a6ea5a7d5a-oauth-serving-cert\") pod \"console-f9d7485db-nwfk6\" (UID: \"09cfdba9-bfda-455d-b13e-58a6ea5a7d5a\") " pod="openshift-console/console-f9d7485db-nwfk6" Feb 20 09:57:30 crc 
kubenswrapper[4962]: I0220 09:57:30.154386 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b4ad1819-20e1-406b-8499-5a73780c0a0c-registry-certificates\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.154441 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/8cb06d17-6188-4cca-84b7-f3d03abb20e8-plugins-dir\") pod \"csi-hostpathplugin-7nh4t\" (UID: \"8cb06d17-6188-4cca-84b7-f3d03abb20e8\") " pod="hostpath-provisioner/csi-hostpathplugin-7nh4t" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.154480 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/daef1622-b612-4661-bb6a-63c5997d9a07-srv-cert\") pod \"olm-operator-6b444d44fb-rsf8j\" (UID: \"daef1622-b612-4661-bb6a-63c5997d9a07\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rsf8j" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.154519 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e4e18be-a43b-492a-981e-b4f9aebff1ab-serving-cert\") pod \"openshift-config-operator-7777fb866f-xkmtn\" (UID: \"0e4e18be-a43b-492a-981e-b4f9aebff1ab\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xkmtn" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.154760 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/daef1622-b612-4661-bb6a-63c5997d9a07-profile-collector-cert\") pod \"olm-operator-6b444d44fb-rsf8j\" (UID: \"daef1622-b612-4661-bb6a-63c5997d9a07\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rsf8j" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.154792 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckdpz\" (UniqueName: \"kubernetes.io/projected/b9e0a083-b7e8-4b81-ad1a-03f587f2f46c-kube-api-access-ckdpz\") pod \"dns-default-4mw9f\" (UID: \"b9e0a083-b7e8-4b81-ad1a-03f587f2f46c\") " pod="openshift-dns/dns-default-4mw9f" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.154830 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pg8k\" (UniqueName: \"kubernetes.io/projected/32025b2b-9232-449f-b7bc-582d81d76430-kube-api-access-2pg8k\") pod \"machine-config-server-l92fq\" (UID: \"32025b2b-9232-449f-b7bc-582d81d76430\") " pod="openshift-machine-config-operator/machine-config-server-l92fq" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.154864 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7zbm\" (UniqueName: \"kubernetes.io/projected/3721fc4d-6f04-458e-a74c-0fe816908414-kube-api-access-v7zbm\") pod \"machine-config-controller-84d6567774-wd68v\" (UID: \"3721fc4d-6f04-458e-a74c-0fe816908414\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wd68v" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.154922 4962 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-h5fb2\" (UniqueName: \"kubernetes.io/projected/09cfdba9-bfda-455d-b13e-58a6ea5a7d5a-kube-api-access-h5fb2\") pod \"console-f9d7485db-nwfk6\" (UID: \"09cfdba9-bfda-455d-b13e-58a6ea5a7d5a\") " pod="openshift-console/console-f9d7485db-nwfk6" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.155293 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2knp\" (UniqueName: \"kubernetes.io/projected/daef1622-b612-4661-bb6a-63c5997d9a07-kube-api-access-j2knp\") pod \"olm-operator-6b444d44fb-rsf8j\" (UID: \"daef1622-b612-4661-bb6a-63c5997d9a07\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rsf8j" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.155412 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b4ad1819-20e1-406b-8499-5a73780c0a0c-registry-certificates\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.155560 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cb32a62f-c8c5-40a6-9c7f-e456c68bf7c1-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-bmp44\" (UID: \"cb32a62f-c8c5-40a6-9c7f-e456c68bf7c1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bmp44" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.155660 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/09cfdba9-bfda-455d-b13e-58a6ea5a7d5a-console-oauth-config\") pod \"console-f9d7485db-nwfk6\" (UID: \"09cfdba9-bfda-455d-b13e-58a6ea5a7d5a\") " pod="openshift-console/console-f9d7485db-nwfk6" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.155881 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac7d606a-36a8-4608-918c-ed88eaf93a6d-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-hqbh2\" (UID: \"ac7d606a-36a8-4608-918c-ed88eaf93a6d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hqbh2" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.156067 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b4ad1819-20e1-406b-8499-5a73780c0a0c-trusted-ca\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.156132 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/09cfdba9-bfda-455d-b13e-58a6ea5a7d5a-service-ca\") pod \"console-f9d7485db-nwfk6\" (UID: \"09cfdba9-bfda-455d-b13e-58a6ea5a7d5a\") " pod="openshift-console/console-f9d7485db-nwfk6" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.156155 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5df91f4a-70e8-4036-8ab1-d917af6c8aa4-webhook-cert\") pod \"packageserver-d55dfcdfc-965lm\" (UID: 
\"5df91f4a-70e8-4036-8ab1-d917af6c8aa4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-965lm" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.157163 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/09cfdba9-bfda-455d-b13e-58a6ea5a7d5a-service-ca\") pod \"console-f9d7485db-nwfk6\" (UID: \"09cfdba9-bfda-455d-b13e-58a6ea5a7d5a\") " pod="openshift-console/console-f9d7485db-nwfk6" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.157923 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9e86193f-b3bb-42a8-bccb-00e0cbcbf432-serving-cert\") pod \"authentication-operator-69f744f599-tp9zq\" (UID: \"9e86193f-b3bb-42a8-bccb-00e0cbcbf432\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tp9zq" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.157977 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a90b20e7-a8bc-4b8d-b407-f4f31fc96528-serving-cert\") pod \"service-ca-operator-777779d784-v2nvr\" (UID: \"a90b20e7-a8bc-4b8d-b407-f4f31fc96528\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-v2nvr" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.158348 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b4ad1819-20e1-406b-8499-5a73780c0a0c-installation-pull-secrets\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.158385 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb32a62f-c8c5-40a6-9c7f-e456c68bf7c1-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-bmp44\" (UID: \"cb32a62f-c8c5-40a6-9c7f-e456c68bf7c1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bmp44" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.158453 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cb32a62f-c8c5-40a6-9c7f-e456c68bf7c1-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-bmp44\" (UID: \"cb32a62f-c8c5-40a6-9c7f-e456c68bf7c1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bmp44" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.158556 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09cfdba9-bfda-455d-b13e-58a6ea5a7d5a-trusted-ca-bundle\") pod \"console-f9d7485db-nwfk6\" (UID: \"09cfdba9-bfda-455d-b13e-58a6ea5a7d5a\") " pod="openshift-console/console-f9d7485db-nwfk6" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.158577 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1d0fd4e8-ba15-4d2f-9602-e887819ea423-auth-proxy-config\") pod \"machine-approver-56656f9798-gqpsl\" (UID: \"1d0fd4e8-ba15-4d2f-9602-e887819ea423\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gqpsl" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 
09:57:30.158610 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e60f8ca8-5b2f-4b5c-930f-19caf45014ba-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-txw2z\" (UID: \"e60f8ca8-5b2f-4b5c-930f-19caf45014ba\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-txw2z" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.158661 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bfd57a5c-0892-46a0-8005-0a8f70c146fd-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-m7z5r\" (UID: \"bfd57a5c-0892-46a0-8005-0a8f70c146fd\") " pod="openshift-marketplace/marketplace-operator-79b997595-m7z5r" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.158997 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sh2t2\" (UniqueName: \"kubernetes.io/projected/e60f8ca8-5b2f-4b5c-930f-19caf45014ba-kube-api-access-sh2t2\") pod \"openshift-apiserver-operator-796bbdcf4f-txw2z\" (UID: \"e60f8ca8-5b2f-4b5c-930f-19caf45014ba\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-txw2z" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.159056 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/319cf696-9a12-40dc-9f4a-d80fab9a97f8-cert\") pod \"ingress-canary-7tj4j\" (UID: \"319cf696-9a12-40dc-9f4a-d80fab9a97f8\") " pod="openshift-ingress-canary/ingress-canary-7tj4j" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.159222 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7c85c4ba-4bcb-4449-bd63-320f2ff6a116-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-xxdr9\" (UID: \"7c85c4ba-4bcb-4449-bd63-320f2ff6a116\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xxdr9" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.159317 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvk5g\" (UniqueName: \"kubernetes.io/projected/a90b20e7-a8bc-4b8d-b407-f4f31fc96528-kube-api-access-qvk5g\") pod \"service-ca-operator-777779d784-v2nvr\" (UID: \"a90b20e7-a8bc-4b8d-b407-f4f31fc96528\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-v2nvr" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.159343 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/8cb06d17-6188-4cca-84b7-f3d03abb20e8-csi-data-dir\") pod \"csi-hostpathplugin-7nh4t\" (UID: \"8cb06d17-6188-4cca-84b7-f3d03abb20e8\") " pod="hostpath-provisioner/csi-hostpathplugin-7nh4t" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.159375 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b4ad1819-20e1-406b-8499-5a73780c0a0c-registry-tls\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.159384 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1d0fd4e8-ba15-4d2f-9602-e887819ea423-auth-proxy-config\") pod \"machine-approver-56656f9798-gqpsl\" (UID: \"1d0fd4e8-ba15-4d2f-9602-e887819ea423\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gqpsl" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.159409 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28c85fa3-b7b5-4fbc-9afb-fa3eb6b8a75b-serving-cert\") pod \"etcd-operator-b45778765-lbvml\" (UID: \"28c85fa3-b7b5-4fbc-9afb-fa3eb6b8a75b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lbvml" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.159412 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb32a62f-c8c5-40a6-9c7f-e456c68bf7c1-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-bmp44\" (UID: \"cb32a62f-c8c5-40a6-9c7f-e456c68bf7c1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bmp44" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.159634 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xxtf\" (UniqueName: \"kubernetes.io/projected/0240e440-4be2-4607-99c4-636b65e78081-kube-api-access-5xxtf\") pod \"service-ca-9c57cc56f-7q8sx\" (UID: \"0240e440-4be2-4607-99c4-636b65e78081\") " pod="openshift-service-ca/service-ca-9c57cc56f-7q8sx" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.159666 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e86193f-b3bb-42a8-bccb-00e0cbcbf432-config\") pod \"authentication-operator-69f744f599-tp9zq\" (UID: \"9e86193f-b3bb-42a8-bccb-00e0cbcbf432\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tp9zq" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.159696 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/0e4e18be-a43b-492a-981e-b4f9aebff1ab-available-featuregates\") pod \"openshift-config-operator-7777fb866f-xkmtn\" (UID: \"0e4e18be-a43b-492a-981e-b4f9aebff1ab\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xkmtn" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.159719 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjtth\" (UniqueName: \"kubernetes.io/projected/9c993e86-3068-4d07-84b3-655f8308b7ed-kube-api-access-kjtth\") pod \"package-server-manager-789f6589d5-mxcd2\" (UID: \"9c993e86-3068-4d07-84b3-655f8308b7ed\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mxcd2" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.159736 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7e5e4942-63be-4811-8aaa-d6b53a427541-metrics-certs\") pod \"router-default-5444994796-tcwqj\" (UID: \"7e5e4942-63be-4811-8aaa-d6b53a427541\") " pod="openshift-ingress/router-default-5444994796-tcwqj" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.159771 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/674f40ed-74ed-48c2-8036-087ce9e16c94-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6dwbb\" (UID: \"674f40ed-74ed-48c2-8036-087ce9e16c94\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6dwbb" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.159791 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/28c85fa3-b7b5-4fbc-9afb-fa3eb6b8a75b-etcd-service-ca\") pod \"etcd-operator-b45778765-lbvml\" (UID: \"28c85fa3-b7b5-4fbc-9afb-fa3eb6b8a75b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lbvml" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.159815 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgr4s\" (UniqueName: \"kubernetes.io/projected/ac7d606a-36a8-4608-918c-ed88eaf93a6d-kube-api-access-lgr4s\") pod \"kube-storage-version-migrator-operator-b67b599dd-hqbh2\" (UID: \"ac7d606a-36a8-4608-918c-ed88eaf93a6d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hqbh2" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.160110 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/0e4e18be-a43b-492a-981e-b4f9aebff1ab-available-featuregates\") pod \"openshift-config-operator-7777fb866f-xkmtn\" (UID: \"0e4e18be-a43b-492a-981e-b4f9aebff1ab\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xkmtn" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.160291 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b4ad1819-20e1-406b-8499-5a73780c0a0c-ca-trust-extracted\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.160326 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e86193f-b3bb-42a8-bccb-00e0cbcbf432-config\") pod \"authentication-operator-69f744f599-tp9zq\" (UID: \"9e86193f-b3bb-42a8-bccb-00e0cbcbf432\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tp9zq" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.160394 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e86193f-b3bb-42a8-bccb-00e0cbcbf432-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-tp9zq\" (UID: \"9e86193f-b3bb-42a8-bccb-00e0cbcbf432\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tp9zq" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.160416 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e86193f-b3bb-42a8-bccb-00e0cbcbf432-service-ca-bundle\") pod \"authentication-operator-69f744f599-tp9zq\" (UID: \"9e86193f-b3bb-42a8-bccb-00e0cbcbf432\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tp9zq" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.160458 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/a7a9fa76-da75-4847-a539-d1e6bb57da98-images\") pod \"machine-api-operator-5694c8668f-ckmh2\" (UID: \"a7a9fa76-da75-4847-a539-d1e6bb57da98\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ckmh2" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.160476 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b9e0a083-b7e8-4b81-ad1a-03f587f2f46c-metrics-tls\") pod \"dns-default-4mw9f\" (UID: \"b9e0a083-b7e8-4b81-ad1a-03f587f2f46c\") " pod="openshift-dns/dns-default-4mw9f" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.160570 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b4ad1819-20e1-406b-8499-5a73780c0a0c-ca-trust-extracted\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.160675 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b4ad1819-20e1-406b-8499-5a73780c0a0c-trusted-ca\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.160796 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rt6vz\" (UniqueName: \"kubernetes.io/projected/bfd57a5c-0892-46a0-8005-0a8f70c146fd-kube-api-access-rt6vz\") pod \"marketplace-operator-79b997595-m7z5r\" (UID: \"bfd57a5c-0892-46a0-8005-0a8f70c146fd\") " pod="openshift-marketplace/marketplace-operator-79b997595-m7z5r" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.160837 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5df91f4a-70e8-4036-8ab1-d917af6c8aa4-apiservice-cert\") pod \"packageserver-d55dfcdfc-965lm\" (UID: \"5df91f4a-70e8-4036-8ab1-d917af6c8aa4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-965lm" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.160868 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e86193f-b3bb-42a8-bccb-00e0cbcbf432-service-ca-bundle\") pod \"authentication-operator-69f744f599-tp9zq\" (UID: \"9e86193f-b3bb-42a8-bccb-00e0cbcbf432\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tp9zq" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.161556 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jgvn\" (UniqueName: \"kubernetes.io/projected/a7a9fa76-da75-4847-a539-d1e6bb57da98-kube-api-access-9jgvn\") pod \"machine-api-operator-5694c8668f-ckmh2\" (UID: \"a7a9fa76-da75-4847-a539-d1e6bb57da98\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ckmh2" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.161583 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a3652dbd-dae4-462b-be88-b8a782de8a1c-secret-volume\") pod \"collect-profiles-29526345-4v6dw\" (UID: \"a3652dbd-dae4-462b-be88-b8a782de8a1c\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29526345-4v6dw" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.161615 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9ffd\" (UniqueName: \"kubernetes.io/projected/bfc99ac4-00ed-48bf-b95e-fcdd4e6e0800-kube-api-access-l9ffd\") pod \"catalog-operator-68c6474976-g6nc2\" (UID: \"bfc99ac4-00ed-48bf-b95e-fcdd4e6e0800\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g6nc2" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.161647 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/09cfdba9-bfda-455d-b13e-58a6ea5a7d5a-console-serving-cert\") pod \"console-f9d7485db-nwfk6\" (UID: \"09cfdba9-bfda-455d-b13e-58a6ea5a7d5a\") " pod="openshift-console/console-f9d7485db-nwfk6" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.161663 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcnb5\" (UniqueName: \"kubernetes.io/projected/a3652dbd-dae4-462b-be88-b8a782de8a1c-kube-api-access-hcnb5\") pod \"collect-profiles-29526345-4v6dw\" (UID: \"a3652dbd-dae4-462b-be88-b8a782de8a1c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526345-4v6dw" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.161702 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac7d606a-36a8-4608-918c-ed88eaf93a6d-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-hqbh2\" (UID: \"ac7d606a-36a8-4608-918c-ed88eaf93a6d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hqbh2" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.161734 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/9c993e86-3068-4d07-84b3-655f8308b7ed-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-mxcd2\" (UID: \"9c993e86-3068-4d07-84b3-655f8308b7ed\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mxcd2" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.161764 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/1d0fd4e8-ba15-4d2f-9602-e887819ea423-machine-approver-tls\") pod \"machine-approver-56656f9798-gqpsl\" (UID: \"1d0fd4e8-ba15-4d2f-9602-e887819ea423\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gqpsl" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.161782 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/5df91f4a-70e8-4036-8ab1-d917af6c8aa4-tmpfs\") pod \"packageserver-d55dfcdfc-965lm\" (UID: \"5df91f4a-70e8-4036-8ab1-d917af6c8aa4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-965lm" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.161802 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e86193f-b3bb-42a8-bccb-00e0cbcbf432-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-tp9zq\" (UID: 
\"9e86193f-b3bb-42a8-bccb-00e0cbcbf432\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tp9zq" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.161823 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b4ad1819-20e1-406b-8499-5a73780c0a0c-bound-sa-token\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.161855 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmq2c\" (UniqueName: \"kubernetes.io/projected/c3febff6-f15f-4ce8-825c-37d86b13c56d-kube-api-access-pmq2c\") pod \"ingress-operator-5b745b69d9-5nqkf\" (UID: \"c3febff6-f15f-4ce8-825c-37d86b13c56d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5nqkf" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.161873 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/7e5e4942-63be-4811-8aaa-d6b53a427541-default-certificate\") pod \"router-default-5444994796-tcwqj\" (UID: \"7e5e4942-63be-4811-8aaa-d6b53a427541\") " pod="openshift-ingress/router-default-5444994796-tcwqj" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.161893 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a90b20e7-a8bc-4b8d-b407-f4f31fc96528-config\") pod \"service-ca-operator-777779d784-v2nvr\" (UID: \"a90b20e7-a8bc-4b8d-b407-f4f31fc96528\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-v2nvr" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.161911 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmtxm\" (UniqueName: \"kubernetes.io/projected/b4ad1819-20e1-406b-8499-5a73780c0a0c-kube-api-access-gmtxm\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.161929 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phch9\" (UniqueName: \"kubernetes.io/projected/cb32a62f-c8c5-40a6-9c7f-e456c68bf7c1-kube-api-access-phch9\") pod \"openshift-controller-manager-operator-756b6f6bc6-bmp44\" (UID: \"cb32a62f-c8c5-40a6-9c7f-e456c68bf7c1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bmp44" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.161945 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7a9fa76-da75-4847-a539-d1e6bb57da98-config\") pod \"machine-api-operator-5694c8668f-ckmh2\" (UID: \"a7a9fa76-da75-4847-a539-d1e6bb57da98\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ckmh2" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.161961 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/a7a9fa76-da75-4847-a539-d1e6bb57da98-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-ckmh2\" (UID: \"a7a9fa76-da75-4847-a539-d1e6bb57da98\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-ckmh2" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.161977 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3721fc4d-6f04-458e-a74c-0fe816908414-proxy-tls\") pod \"machine-config-controller-84d6567774-wd68v\" (UID: \"3721fc4d-6f04-458e-a74c-0fe816908414\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wd68v" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.162013 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09cfdba9-bfda-455d-b13e-58a6ea5a7d5a-trusted-ca-bundle\") pod \"console-f9d7485db-nwfk6\" (UID: \"09cfdba9-bfda-455d-b13e-58a6ea5a7d5a\") " pod="openshift-console/console-f9d7485db-nwfk6" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.162962 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b4ad1819-20e1-406b-8499-5a73780c0a0c-registry-tls\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.165581 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/09cfdba9-bfda-455d-b13e-58a6ea5a7d5a-console-serving-cert\") pod \"console-f9d7485db-nwfk6\" (UID: \"09cfdba9-bfda-455d-b13e-58a6ea5a7d5a\") " pod="openshift-console/console-f9d7485db-nwfk6" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.165615 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/09cfdba9-bfda-455d-b13e-58a6ea5a7d5a-console-oauth-config\") pod \"console-f9d7485db-nwfk6\" (UID: \"09cfdba9-bfda-455d-b13e-58a6ea5a7d5a\") " pod="openshift-console/console-f9d7485db-nwfk6" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.171507 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e60f8ca8-5b2f-4b5c-930f-19caf45014ba-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-txw2z\" (UID: \"e60f8ca8-5b2f-4b5c-930f-19caf45014ba\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-txw2z" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.176926 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9e86193f-b3bb-42a8-bccb-00e0cbcbf432-serving-cert\") pod \"authentication-operator-69f744f599-tp9zq\" (UID: \"9e86193f-b3bb-42a8-bccb-00e0cbcbf432\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tp9zq" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.176965 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac7d606a-36a8-4608-918c-ed88eaf93a6d-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-hqbh2\" (UID: \"ac7d606a-36a8-4608-918c-ed88eaf93a6d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hqbh2" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.178077 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/c3febff6-f15f-4ce8-825c-37d86b13c56d-metrics-tls\") pod \"ingress-operator-5b745b69d9-5nqkf\" (UID: \"c3febff6-f15f-4ce8-825c-37d86b13c56d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5nqkf" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.181943 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/1d0fd4e8-ba15-4d2f-9602-e887819ea423-machine-approver-tls\") pod \"machine-approver-56656f9798-gqpsl\" (UID: \"1d0fd4e8-ba15-4d2f-9602-e887819ea423\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gqpsl" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.182145 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-tv8j9"] Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.191868 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8g2cv\" (UniqueName: \"kubernetes.io/projected/9e86193f-b3bb-42a8-bccb-00e0cbcbf432-kube-api-access-8g2cv\") pod \"authentication-operator-69f744f599-tp9zq\" (UID: \"9e86193f-b3bb-42a8-bccb-00e0cbcbf432\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tp9zq" Feb 20 09:57:30 crc kubenswrapper[4962]: W0220 09:57:30.191974 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod225d1d1d_8168_4489_af91_6a87f28c39ed.slice/crio-979988254c99b59b69c15f468591a0b229b094bd2ae9aa467a9aa5dbc5efbaaa WatchSource:0}: Error finding container 979988254c99b59b69c15f468591a0b229b094bd2ae9aa467a9aa5dbc5efbaaa: Status 404 returned error can't find the container with id 979988254c99b59b69c15f468591a0b229b094bd2ae9aa467a9aa5dbc5efbaaa Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.209387 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6nlc\" (UniqueName: \"kubernetes.io/projected/1d0fd4e8-ba15-4d2f-9602-e887819ea423-kube-api-access-c6nlc\") pod \"machine-approver-56656f9798-gqpsl\" (UID: \"1d0fd4e8-ba15-4d2f-9602-e887819ea423\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gqpsl" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.241188 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c3febff6-f15f-4ce8-825c-37d86b13c56d-bound-sa-token\") pod \"ingress-operator-5b745b69d9-5nqkf\" (UID: \"c3febff6-f15f-4ce8-825c-37d86b13c56d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5nqkf" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.262840 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.262930 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/bfc99ac4-00ed-48bf-b95e-fcdd4e6e0800-profile-collector-cert\") pod \"catalog-operator-68c6474976-g6nc2\" (UID: \"bfc99ac4-00ed-48bf-b95e-fcdd4e6e0800\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g6nc2" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.262956 4962 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/0240e440-4be2-4607-99c4-636b65e78081-signing-key\") pod \"service-ca-9c57cc56f-7q8sx\" (UID: \"0240e440-4be2-4607-99c4-636b65e78081\") " pod="openshift-service-ca/service-ca-9c57cc56f-7q8sx" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.262984 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/8cb06d17-6188-4cca-84b7-f3d03abb20e8-plugins-dir\") pod \"csi-hostpathplugin-7nh4t\" (UID: \"8cb06d17-6188-4cca-84b7-f3d03abb20e8\") " pod="hostpath-provisioner/csi-hostpathplugin-7nh4t" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.263000 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/daef1622-b612-4661-bb6a-63c5997d9a07-srv-cert\") pod \"olm-operator-6b444d44fb-rsf8j\" (UID: \"daef1622-b612-4661-bb6a-63c5997d9a07\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rsf8j" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.263017 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/daef1622-b612-4661-bb6a-63c5997d9a07-profile-collector-cert\") pod \"olm-operator-6b444d44fb-rsf8j\" (UID: \"daef1622-b612-4661-bb6a-63c5997d9a07\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rsf8j" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.263033 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckdpz\" (UniqueName: \"kubernetes.io/projected/b9e0a083-b7e8-4b81-ad1a-03f587f2f46c-kube-api-access-ckdpz\") pod \"dns-default-4mw9f\" (UID: \"b9e0a083-b7e8-4b81-ad1a-03f587f2f46c\") " pod="openshift-dns/dns-default-4mw9f" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.263049 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pg8k\" (UniqueName: \"kubernetes.io/projected/32025b2b-9232-449f-b7bc-582d81d76430-kube-api-access-2pg8k\") pod \"machine-config-server-l92fq\" (UID: \"32025b2b-9232-449f-b7bc-582d81d76430\") " pod="openshift-machine-config-operator/machine-config-server-l92fq" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.263065 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7zbm\" (UniqueName: \"kubernetes.io/projected/3721fc4d-6f04-458e-a74c-0fe816908414-kube-api-access-v7zbm\") pod \"machine-config-controller-84d6567774-wd68v\" (UID: \"3721fc4d-6f04-458e-a74c-0fe816908414\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wd68v" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.263090 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2knp\" (UniqueName: \"kubernetes.io/projected/daef1622-b612-4661-bb6a-63c5997d9a07-kube-api-access-j2knp\") pod \"olm-operator-6b444d44fb-rsf8j\" (UID: \"daef1622-b612-4661-bb6a-63c5997d9a07\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rsf8j" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.263109 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5df91f4a-70e8-4036-8ab1-d917af6c8aa4-webhook-cert\") pod \"packageserver-d55dfcdfc-965lm\" (UID: 
\"5df91f4a-70e8-4036-8ab1-d917af6c8aa4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-965lm" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.263134 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a90b20e7-a8bc-4b8d-b407-f4f31fc96528-serving-cert\") pod \"service-ca-operator-777779d784-v2nvr\" (UID: \"a90b20e7-a8bc-4b8d-b407-f4f31fc96528\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-v2nvr" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.263153 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bfd57a5c-0892-46a0-8005-0a8f70c146fd-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-m7z5r\" (UID: \"bfd57a5c-0892-46a0-8005-0a8f70c146fd\") " pod="openshift-marketplace/marketplace-operator-79b997595-m7z5r" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.263176 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/319cf696-9a12-40dc-9f4a-d80fab9a97f8-cert\") pod \"ingress-canary-7tj4j\" (UID: \"319cf696-9a12-40dc-9f4a-d80fab9a97f8\") " pod="openshift-ingress-canary/ingress-canary-7tj4j" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.263194 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7c85c4ba-4bcb-4449-bd63-320f2ff6a116-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-xxdr9\" (UID: \"7c85c4ba-4bcb-4449-bd63-320f2ff6a116\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xxdr9" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.263217 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvk5g\" (UniqueName: \"kubernetes.io/projected/a90b20e7-a8bc-4b8d-b407-f4f31fc96528-kube-api-access-qvk5g\") pod \"service-ca-operator-777779d784-v2nvr\" (UID: \"a90b20e7-a8bc-4b8d-b407-f4f31fc96528\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-v2nvr" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.263231 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/8cb06d17-6188-4cca-84b7-f3d03abb20e8-csi-data-dir\") pod \"csi-hostpathplugin-7nh4t\" (UID: \"8cb06d17-6188-4cca-84b7-f3d03abb20e8\") " pod="hostpath-provisioner/csi-hostpathplugin-7nh4t" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.263247 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28c85fa3-b7b5-4fbc-9afb-fa3eb6b8a75b-serving-cert\") pod \"etcd-operator-b45778765-lbvml\" (UID: \"28c85fa3-b7b5-4fbc-9afb-fa3eb6b8a75b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lbvml" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.263274 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xxtf\" (UniqueName: \"kubernetes.io/projected/0240e440-4be2-4607-99c4-636b65e78081-kube-api-access-5xxtf\") pod \"service-ca-9c57cc56f-7q8sx\" (UID: \"0240e440-4be2-4607-99c4-636b65e78081\") " pod="openshift-service-ca/service-ca-9c57cc56f-7q8sx" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.263292 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-kjtth\" (UniqueName: \"kubernetes.io/projected/9c993e86-3068-4d07-84b3-655f8308b7ed-kube-api-access-kjtth\") pod \"package-server-manager-789f6589d5-mxcd2\" (UID: \"9c993e86-3068-4d07-84b3-655f8308b7ed\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mxcd2" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.263318 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7e5e4942-63be-4811-8aaa-d6b53a427541-metrics-certs\") pod \"router-default-5444994796-tcwqj\" (UID: \"7e5e4942-63be-4811-8aaa-d6b53a427541\") " pod="openshift-ingress/router-default-5444994796-tcwqj" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.263336 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/674f40ed-74ed-48c2-8036-087ce9e16c94-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6dwbb\" (UID: \"674f40ed-74ed-48c2-8036-087ce9e16c94\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6dwbb" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.263351 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/28c85fa3-b7b5-4fbc-9afb-fa3eb6b8a75b-etcd-service-ca\") pod \"etcd-operator-b45778765-lbvml\" (UID: \"28c85fa3-b7b5-4fbc-9afb-fa3eb6b8a75b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lbvml" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.263374 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a7a9fa76-da75-4847-a539-d1e6bb57da98-images\") pod \"machine-api-operator-5694c8668f-ckmh2\" (UID: \"a7a9fa76-da75-4847-a539-d1e6bb57da98\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ckmh2" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.263392 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b9e0a083-b7e8-4b81-ad1a-03f587f2f46c-metrics-tls\") pod \"dns-default-4mw9f\" (UID: \"b9e0a083-b7e8-4b81-ad1a-03f587f2f46c\") " pod="openshift-dns/dns-default-4mw9f" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.263410 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5df91f4a-70e8-4036-8ab1-d917af6c8aa4-apiservice-cert\") pod \"packageserver-d55dfcdfc-965lm\" (UID: \"5df91f4a-70e8-4036-8ab1-d917af6c8aa4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-965lm" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.263428 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rt6vz\" (UniqueName: \"kubernetes.io/projected/bfd57a5c-0892-46a0-8005-0a8f70c146fd-kube-api-access-rt6vz\") pod \"marketplace-operator-79b997595-m7z5r\" (UID: \"bfd57a5c-0892-46a0-8005-0a8f70c146fd\") " pod="openshift-marketplace/marketplace-operator-79b997595-m7z5r" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.263456 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jgvn\" (UniqueName: \"kubernetes.io/projected/a7a9fa76-da75-4847-a539-d1e6bb57da98-kube-api-access-9jgvn\") pod \"machine-api-operator-5694c8668f-ckmh2\" (UID: \"a7a9fa76-da75-4847-a539-d1e6bb57da98\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-ckmh2" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.263477 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a3652dbd-dae4-462b-be88-b8a782de8a1c-secret-volume\") pod \"collect-profiles-29526345-4v6dw\" (UID: \"a3652dbd-dae4-462b-be88-b8a782de8a1c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526345-4v6dw" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.263496 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9ffd\" (UniqueName: \"kubernetes.io/projected/bfc99ac4-00ed-48bf-b95e-fcdd4e6e0800-kube-api-access-l9ffd\") pod \"catalog-operator-68c6474976-g6nc2\" (UID: \"bfc99ac4-00ed-48bf-b95e-fcdd4e6e0800\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g6nc2" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.263514 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcnb5\" (UniqueName: \"kubernetes.io/projected/a3652dbd-dae4-462b-be88-b8a782de8a1c-kube-api-access-hcnb5\") pod \"collect-profiles-29526345-4v6dw\" (UID: \"a3652dbd-dae4-462b-be88-b8a782de8a1c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526345-4v6dw" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.263531 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/9c993e86-3068-4d07-84b3-655f8308b7ed-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-mxcd2\" (UID: \"9c993e86-3068-4d07-84b3-655f8308b7ed\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mxcd2" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.263549 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/5df91f4a-70e8-4036-8ab1-d917af6c8aa4-tmpfs\") pod \"packageserver-d55dfcdfc-965lm\" (UID: \"5df91f4a-70e8-4036-8ab1-d917af6c8aa4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-965lm" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.263580 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/7e5e4942-63be-4811-8aaa-d6b53a427541-default-certificate\") pod \"router-default-5444994796-tcwqj\" (UID: \"7e5e4942-63be-4811-8aaa-d6b53a427541\") " pod="openshift-ingress/router-default-5444994796-tcwqj" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.263614 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a90b20e7-a8bc-4b8d-b407-f4f31fc96528-config\") pod \"service-ca-operator-777779d784-v2nvr\" (UID: \"a90b20e7-a8bc-4b8d-b407-f4f31fc96528\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-v2nvr" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.263644 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7a9fa76-da75-4847-a539-d1e6bb57da98-config\") pod \"machine-api-operator-5694c8668f-ckmh2\" (UID: \"a7a9fa76-da75-4847-a539-d1e6bb57da98\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ckmh2" Feb 20 09:57:30 crc kubenswrapper[4962]: E0220 09:57:30.265363 4962 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:57:30.765331801 +0000 UTC m=+142.347803637 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.265771 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7a9fa76-da75-4847-a539-d1e6bb57da98-config\") pod \"machine-api-operator-5694c8668f-ckmh2\" (UID: \"a7a9fa76-da75-4847-a539-d1e6bb57da98\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ckmh2" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.266748 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28c85fa3-b7b5-4fbc-9afb-fa3eb6b8a75b-serving-cert\") pod \"etcd-operator-b45778765-lbvml\" (UID: \"28c85fa3-b7b5-4fbc-9afb-fa3eb6b8a75b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lbvml" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.267441 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a7a9fa76-da75-4847-a539-d1e6bb57da98-images\") pod \"machine-api-operator-5694c8668f-ckmh2\" (UID: \"a7a9fa76-da75-4847-a539-d1e6bb57da98\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ckmh2" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.270247 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/28c85fa3-b7b5-4fbc-9afb-fa3eb6b8a75b-etcd-service-ca\") pod \"etcd-operator-b45778765-lbvml\" (UID: \"28c85fa3-b7b5-4fbc-9afb-fa3eb6b8a75b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lbvml" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.270250 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a90b20e7-a8bc-4b8d-b407-f4f31fc96528-config\") pod \"service-ca-operator-777779d784-v2nvr\" (UID: \"a90b20e7-a8bc-4b8d-b407-f4f31fc96528\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-v2nvr" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.270536 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/a7a9fa76-da75-4847-a539-d1e6bb57da98-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-ckmh2\" (UID: \"a7a9fa76-da75-4847-a539-d1e6bb57da98\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ckmh2" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.271160 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3721fc4d-6f04-458e-a74c-0fe816908414-proxy-tls\") pod \"machine-config-controller-84d6567774-wd68v\" (UID: \"3721fc4d-6f04-458e-a74c-0fe816908414\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wd68v" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.271197 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/32025b2b-9232-449f-b7bc-582d81d76430-node-bootstrap-token\") pod \"machine-config-server-l92fq\" (UID: \"32025b2b-9232-449f-b7bc-582d81d76430\") " pod="openshift-machine-config-operator/machine-config-server-l92fq" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.271272 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7c85c4ba-4bcb-4449-bd63-320f2ff6a116-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-xxdr9\" (UID: \"7c85c4ba-4bcb-4449-bd63-320f2ff6a116\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xxdr9" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.271272 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7e5e4942-63be-4811-8aaa-d6b53a427541-metrics-certs\") pod \"router-default-5444994796-tcwqj\" (UID: \"7e5e4942-63be-4811-8aaa-d6b53a427541\") " pod="openshift-ingress/router-default-5444994796-tcwqj" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.271296 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/674f40ed-74ed-48c2-8036-087ce9e16c94-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6dwbb\" (UID: \"674f40ed-74ed-48c2-8036-087ce9e16c94\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6dwbb" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.271328 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hplfv\" (UniqueName: \"kubernetes.io/projected/319cf696-9a12-40dc-9f4a-d80fab9a97f8-kube-api-access-hplfv\") pod \"ingress-canary-7tj4j\" (UID: \"319cf696-9a12-40dc-9f4a-d80fab9a97f8\") " pod="openshift-ingress-canary/ingress-canary-7tj4j" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.271356 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9qbw\" (UniqueName: \"kubernetes.io/projected/5df91f4a-70e8-4036-8ab1-d917af6c8aa4-kube-api-access-s9qbw\") pod \"packageserver-d55dfcdfc-965lm\" (UID: \"5df91f4a-70e8-4036-8ab1-d917af6c8aa4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-965lm" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.271385 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/7e5e4942-63be-4811-8aaa-d6b53a427541-stats-auth\") pod \"router-default-5444994796-tcwqj\" (UID: \"7e5e4942-63be-4811-8aaa-d6b53a427541\") " pod="openshift-ingress/router-default-5444994796-tcwqj" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.271411 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/28c85fa3-b7b5-4fbc-9afb-fa3eb6b8a75b-etcd-ca\") pod \"etcd-operator-b45778765-lbvml\" (UID: \"28c85fa3-b7b5-4fbc-9afb-fa3eb6b8a75b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lbvml" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.271434 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrw8j\" 
(UniqueName: \"kubernetes.io/projected/8cb06d17-6188-4cca-84b7-f3d03abb20e8-kube-api-access-hrw8j\") pod \"csi-hostpathplugin-7nh4t\" (UID: \"8cb06d17-6188-4cca-84b7-f3d03abb20e8\") " pod="hostpath-provisioner/csi-hostpathplugin-7nh4t" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.271453 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a3652dbd-dae4-462b-be88-b8a782de8a1c-config-volume\") pod \"collect-profiles-29526345-4v6dw\" (UID: \"a3652dbd-dae4-462b-be88-b8a782de8a1c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526345-4v6dw" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.271470 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/28c85fa3-b7b5-4fbc-9afb-fa3eb6b8a75b-etcd-client\") pod \"etcd-operator-b45778765-lbvml\" (UID: \"28c85fa3-b7b5-4fbc-9afb-fa3eb6b8a75b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lbvml" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.271500 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/674f40ed-74ed-48c2-8036-087ce9e16c94-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6dwbb\" (UID: \"674f40ed-74ed-48c2-8036-087ce9e16c94\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6dwbb" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.271538 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqfmw\" (UniqueName: \"kubernetes.io/projected/28c85fa3-b7b5-4fbc-9afb-fa3eb6b8a75b-kube-api-access-zqfmw\") pod \"etcd-operator-b45778765-lbvml\" (UID: \"28c85fa3-b7b5-4fbc-9afb-fa3eb6b8a75b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lbvml" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.271561 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/8cb06d17-6188-4cca-84b7-f3d03abb20e8-mountpoint-dir\") pod \"csi-hostpathplugin-7nh4t\" (UID: \"8cb06d17-6188-4cca-84b7-f3d03abb20e8\") " pod="hostpath-provisioner/csi-hostpathplugin-7nh4t" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.271621 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b9e0a083-b7e8-4b81-ad1a-03f587f2f46c-config-volume\") pod \"dns-default-4mw9f\" (UID: \"b9e0a083-b7e8-4b81-ad1a-03f587f2f46c\") " pod="openshift-dns/dns-default-4mw9f" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.271642 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/bfc99ac4-00ed-48bf-b95e-fcdd4e6e0800-srv-cert\") pod \"catalog-operator-68c6474976-g6nc2\" (UID: \"bfc99ac4-00ed-48bf-b95e-fcdd4e6e0800\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g6nc2" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.271680 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/bfd57a5c-0892-46a0-8005-0a8f70c146fd-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-m7z5r\" (UID: \"bfd57a5c-0892-46a0-8005-0a8f70c146fd\") " pod="openshift-marketplace/marketplace-operator-79b997595-m7z5r" Feb 20 09:57:30 crc 
kubenswrapper[4962]: I0220 09:57:30.271708 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvc5z\" (UniqueName: \"kubernetes.io/projected/7e5e4942-63be-4811-8aaa-d6b53a427541-kube-api-access-gvc5z\") pod \"router-default-5444994796-tcwqj\" (UID: \"7e5e4942-63be-4811-8aaa-d6b53a427541\") " pod="openshift-ingress/router-default-5444994796-tcwqj" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.271729 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/0240e440-4be2-4607-99c4-636b65e78081-signing-cabundle\") pod \"service-ca-9c57cc56f-7q8sx\" (UID: \"0240e440-4be2-4607-99c4-636b65e78081\") " pod="openshift-service-ca/service-ca-9c57cc56f-7q8sx" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.271761 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/32025b2b-9232-449f-b7bc-582d81d76430-certs\") pod \"machine-config-server-l92fq\" (UID: \"32025b2b-9232-449f-b7bc-582d81d76430\") " pod="openshift-machine-config-operator/machine-config-server-l92fq" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.271756 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/8cb06d17-6188-4cca-84b7-f3d03abb20e8-csi-data-dir\") pod \"csi-hostpathplugin-7nh4t\" (UID: \"8cb06d17-6188-4cca-84b7-f3d03abb20e8\") " pod="hostpath-provisioner/csi-hostpathplugin-7nh4t" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.271782 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8cb06d17-6188-4cca-84b7-f3d03abb20e8-socket-dir\") pod \"csi-hostpathplugin-7nh4t\" (UID: \"8cb06d17-6188-4cca-84b7-f3d03abb20e8\") " pod="hostpath-provisioner/csi-hostpathplugin-7nh4t" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.271808 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c85c4ba-4bcb-4449-bd63-320f2ff6a116-config\") pod \"kube-apiserver-operator-766d6c64bb-xxdr9\" (UID: \"7c85c4ba-4bcb-4449-bd63-320f2ff6a116\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xxdr9" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.271830 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28c85fa3-b7b5-4fbc-9afb-fa3eb6b8a75b-config\") pod \"etcd-operator-b45778765-lbvml\" (UID: \"28c85fa3-b7b5-4fbc-9afb-fa3eb6b8a75b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lbvml" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.271855 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7e5e4942-63be-4811-8aaa-d6b53a427541-service-ca-bundle\") pod \"router-default-5444994796-tcwqj\" (UID: \"7e5e4942-63be-4811-8aaa-d6b53a427541\") " pod="openshift-ingress/router-default-5444994796-tcwqj" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.271874 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8cb06d17-6188-4cca-84b7-f3d03abb20e8-registration-dir\") pod \"csi-hostpathplugin-7nh4t\" (UID: \"8cb06d17-6188-4cca-84b7-f3d03abb20e8\") " 
pod="hostpath-provisioner/csi-hostpathplugin-7nh4t" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.271903 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.271923 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3721fc4d-6f04-458e-a74c-0fe816908414-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-wd68v\" (UID: \"3721fc4d-6f04-458e-a74c-0fe816908414\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wd68v" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.272472 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b9e0a083-b7e8-4b81-ad1a-03f587f2f46c-metrics-tls\") pod \"dns-default-4mw9f\" (UID: \"b9e0a083-b7e8-4b81-ad1a-03f587f2f46c\") " pod="openshift-dns/dns-default-4mw9f" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.273176 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3721fc4d-6f04-458e-a74c-0fe816908414-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-wd68v\" (UID: \"3721fc4d-6f04-458e-a74c-0fe816908414\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wd68v" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.271098 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/8cb06d17-6188-4cca-84b7-f3d03abb20e8-plugins-dir\") pod \"csi-hostpathplugin-7nh4t\" (UID: \"8cb06d17-6188-4cca-84b7-f3d03abb20e8\") " pod="hostpath-provisioner/csi-hostpathplugin-7nh4t" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.270826 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/5df91f4a-70e8-4036-8ab1-d917af6c8aa4-tmpfs\") pod \"packageserver-d55dfcdfc-965lm\" (UID: \"5df91f4a-70e8-4036-8ab1-d917af6c8aa4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-965lm" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.273829 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/8cb06d17-6188-4cca-84b7-f3d03abb20e8-mountpoint-dir\") pod \"csi-hostpathplugin-7nh4t\" (UID: \"8cb06d17-6188-4cca-84b7-f3d03abb20e8\") " pod="hostpath-provisioner/csi-hostpathplugin-7nh4t" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.274432 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/674f40ed-74ed-48c2-8036-087ce9e16c94-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6dwbb\" (UID: \"674f40ed-74ed-48c2-8036-087ce9e16c94\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6dwbb" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.276804 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/a3652dbd-dae4-462b-be88-b8a782de8a1c-config-volume\") pod \"collect-profiles-29526345-4v6dw\" (UID: \"a3652dbd-dae4-462b-be88-b8a782de8a1c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526345-4v6dw" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.277260 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/28c85fa3-b7b5-4fbc-9afb-fa3eb6b8a75b-etcd-ca\") pod \"etcd-operator-b45778765-lbvml\" (UID: \"28c85fa3-b7b5-4fbc-9afb-fa3eb6b8a75b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lbvml" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.277919 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28c85fa3-b7b5-4fbc-9afb-fa3eb6b8a75b-config\") pod \"etcd-operator-b45778765-lbvml\" (UID: \"28c85fa3-b7b5-4fbc-9afb-fa3eb6b8a75b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lbvml" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.278101 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8cb06d17-6188-4cca-84b7-f3d03abb20e8-registration-dir\") pod \"csi-hostpathplugin-7nh4t\" (UID: \"8cb06d17-6188-4cca-84b7-f3d03abb20e8\") " pod="hostpath-provisioner/csi-hostpathplugin-7nh4t" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.278225 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/daef1622-b612-4661-bb6a-63c5997d9a07-srv-cert\") pod \"olm-operator-6b444d44fb-rsf8j\" (UID: \"daef1622-b612-4661-bb6a-63c5997d9a07\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rsf8j" Feb 20 09:57:30 crc kubenswrapper[4962]: E0220 09:57:30.278531 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 09:57:30.778516489 +0000 UTC m=+142.360988335 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.279068 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7e5e4942-63be-4811-8aaa-d6b53a427541-service-ca-bundle\") pod \"router-default-5444994796-tcwqj\" (UID: \"7e5e4942-63be-4811-8aaa-d6b53a427541\") " pod="openshift-ingress/router-default-5444994796-tcwqj" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.280242 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b9e0a083-b7e8-4b81-ad1a-03f587f2f46c-config-volume\") pod \"dns-default-4mw9f\" (UID: \"b9e0a083-b7e8-4b81-ad1a-03f587f2f46c\") " pod="openshift-dns/dns-default-4mw9f" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.281328 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/319cf696-9a12-40dc-9f4a-d80fab9a97f8-cert\") pod \"ingress-canary-7tj4j\" (UID: \"319cf696-9a12-40dc-9f4a-d80fab9a97f8\") " pod="openshift-ingress-canary/ingress-canary-7tj4j" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.281663 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c85c4ba-4bcb-4449-bd63-320f2ff6a116-config\") pod \"kube-apiserver-operator-766d6c64bb-xxdr9\" (UID: \"7c85c4ba-4bcb-4449-bd63-320f2ff6a116\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xxdr9" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.282083 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a90b20e7-a8bc-4b8d-b407-f4f31fc96528-serving-cert\") pod \"service-ca-operator-777779d784-v2nvr\" (UID: \"a90b20e7-a8bc-4b8d-b407-f4f31fc96528\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-v2nvr" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.282096 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5df91f4a-70e8-4036-8ab1-d917af6c8aa4-webhook-cert\") pod \"packageserver-d55dfcdfc-965lm\" (UID: \"5df91f4a-70e8-4036-8ab1-d917af6c8aa4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-965lm" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.282387 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7c85c4ba-4bcb-4449-bd63-320f2ff6a116-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-xxdr9\" (UID: \"7c85c4ba-4bcb-4449-bd63-320f2ff6a116\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xxdr9" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.282786 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/daef1622-b612-4661-bb6a-63c5997d9a07-profile-collector-cert\") pod \"olm-operator-6b444d44fb-rsf8j\" (UID: 
\"daef1622-b612-4661-bb6a-63c5997d9a07\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rsf8j" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.282944 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/9c993e86-3068-4d07-84b3-655f8308b7ed-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-mxcd2\" (UID: \"9c993e86-3068-4d07-84b3-655f8308b7ed\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mxcd2" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.283286 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/28c85fa3-b7b5-4fbc-9afb-fa3eb6b8a75b-etcd-client\") pod \"etcd-operator-b45778765-lbvml\" (UID: \"28c85fa3-b7b5-4fbc-9afb-fa3eb6b8a75b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lbvml" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.283564 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/0240e440-4be2-4607-99c4-636b65e78081-signing-cabundle\") pod \"service-ca-9c57cc56f-7q8sx\" (UID: \"0240e440-4be2-4607-99c4-636b65e78081\") " pod="openshift-service-ca/service-ca-9c57cc56f-7q8sx" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.283671 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8cb06d17-6188-4cca-84b7-f3d03abb20e8-socket-dir\") pod \"csi-hostpathplugin-7nh4t\" (UID: \"8cb06d17-6188-4cca-84b7-f3d03abb20e8\") " pod="hostpath-provisioner/csi-hostpathplugin-7nh4t" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.283709 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a3652dbd-dae4-462b-be88-b8a782de8a1c-secret-volume\") pod \"collect-profiles-29526345-4v6dw\" (UID: \"a3652dbd-dae4-462b-be88-b8a782de8a1c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526345-4v6dw" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.284423 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/32025b2b-9232-449f-b7bc-582d81d76430-certs\") pod \"machine-config-server-l92fq\" (UID: \"32025b2b-9232-449f-b7bc-582d81d76430\") " pod="openshift-machine-config-operator/machine-config-server-l92fq" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.284711 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/32025b2b-9232-449f-b7bc-582d81d76430-node-bootstrap-token\") pod \"machine-config-server-l92fq\" (UID: \"32025b2b-9232-449f-b7bc-582d81d76430\") " pod="openshift-machine-config-operator/machine-config-server-l92fq" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.286829 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bfd57a5c-0892-46a0-8005-0a8f70c146fd-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-m7z5r\" (UID: \"bfd57a5c-0892-46a0-8005-0a8f70c146fd\") " pod="openshift-marketplace/marketplace-operator-79b997595-m7z5r" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.287636 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: 
\"kubernetes.io/secret/7e5e4942-63be-4811-8aaa-d6b53a427541-stats-auth\") pod \"router-default-5444994796-tcwqj\" (UID: \"7e5e4942-63be-4811-8aaa-d6b53a427541\") " pod="openshift-ingress/router-default-5444994796-tcwqj" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.287759 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3721fc4d-6f04-458e-a74c-0fe816908414-proxy-tls\") pod \"machine-config-controller-84d6567774-wd68v\" (UID: \"3721fc4d-6f04-458e-a74c-0fe816908414\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wd68v" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.287796 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/bfd57a5c-0892-46a0-8005-0a8f70c146fd-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-m7z5r\" (UID: \"bfd57a5c-0892-46a0-8005-0a8f70c146fd\") " pod="openshift-marketplace/marketplace-operator-79b997595-m7z5r" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.291218 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/bfc99ac4-00ed-48bf-b95e-fcdd4e6e0800-srv-cert\") pod \"catalog-operator-68c6474976-g6nc2\" (UID: \"bfc99ac4-00ed-48bf-b95e-fcdd4e6e0800\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g6nc2" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.291217 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/bfc99ac4-00ed-48bf-b95e-fcdd4e6e0800-profile-collector-cert\") pod \"catalog-operator-68c6474976-g6nc2\" (UID: \"bfc99ac4-00ed-48bf-b95e-fcdd4e6e0800\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g6nc2" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.292206 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qxqx\" (UniqueName: \"kubernetes.io/projected/0e4e18be-a43b-492a-981e-b4f9aebff1ab-kube-api-access-7qxqx\") pod \"openshift-config-operator-7777fb866f-xkmtn\" (UID: \"0e4e18be-a43b-492a-981e-b4f9aebff1ab\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xkmtn" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.292469 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/7e5e4942-63be-4811-8aaa-d6b53a427541-default-certificate\") pod \"router-default-5444994796-tcwqj\" (UID: \"7e5e4942-63be-4811-8aaa-d6b53a427541\") " pod="openshift-ingress/router-default-5444994796-tcwqj" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.293734 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5fb2\" (UniqueName: \"kubernetes.io/projected/09cfdba9-bfda-455d-b13e-58a6ea5a7d5a-kube-api-access-h5fb2\") pod \"console-f9d7485db-nwfk6\" (UID: \"09cfdba9-bfda-455d-b13e-58a6ea5a7d5a\") " pod="openshift-console/console-f9d7485db-nwfk6" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.294335 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/0240e440-4be2-4607-99c4-636b65e78081-signing-key\") pod \"service-ca-9c57cc56f-7q8sx\" (UID: \"0240e440-4be2-4607-99c4-636b65e78081\") " pod="openshift-service-ca/service-ca-9c57cc56f-7q8sx" Feb 20 09:57:30 crc 
kubenswrapper[4962]: I0220 09:57:30.295013 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/a7a9fa76-da75-4847-a539-d1e6bb57da98-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-ckmh2\" (UID: \"a7a9fa76-da75-4847-a539-d1e6bb57da98\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ckmh2" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.297181 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/674f40ed-74ed-48c2-8036-087ce9e16c94-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6dwbb\" (UID: \"674f40ed-74ed-48c2-8036-087ce9e16c94\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6dwbb" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.297324 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5df91f4a-70e8-4036-8ab1-d917af6c8aa4-apiservice-cert\") pod \"packageserver-d55dfcdfc-965lm\" (UID: \"5df91f4a-70e8-4036-8ab1-d917af6c8aa4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-965lm" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.304926 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-8t82g"] Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.310096 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sh2t2\" (UniqueName: \"kubernetes.io/projected/e60f8ca8-5b2f-4b5c-930f-19caf45014ba-kube-api-access-sh2t2\") pod \"openshift-apiserver-operator-796bbdcf4f-txw2z\" (UID: \"e60f8ca8-5b2f-4b5c-930f-19caf45014ba\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-txw2z" Feb 20 09:57:30 crc kubenswrapper[4962]: W0220 09:57:30.318858 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68c1fde1_72ce_4ce0_ade8_9c8e7016464c.slice/crio-e8d37a6a57fda0f8fde14fa0441e67ccd1acf7446ab5259542d4c716ddc9be67 WatchSource:0}: Error finding container e8d37a6a57fda0f8fde14fa0441e67ccd1acf7446ab5259542d4c716ddc9be67: Status 404 returned error can't find the container with id e8d37a6a57fda0f8fde14fa0441e67ccd1acf7446ab5259542d4c716ddc9be67 Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.321020 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-szbwm"] Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.327638 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgr4s\" (UniqueName: \"kubernetes.io/projected/ac7d606a-36a8-4608-918c-ed88eaf93a6d-kube-api-access-lgr4s\") pod \"kube-storage-version-migrator-operator-b67b599dd-hqbh2\" (UID: \"ac7d606a-36a8-4608-918c-ed88eaf93a6d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hqbh2" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.337806 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-5lf26"] Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.345185 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmq2c\" (UniqueName: 
\"kubernetes.io/projected/c3febff6-f15f-4ce8-825c-37d86b13c56d-kube-api-access-pmq2c\") pod \"ingress-operator-5b745b69d9-5nqkf\" (UID: \"c3febff6-f15f-4ce8-825c-37d86b13c56d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5nqkf" Feb 20 09:57:30 crc kubenswrapper[4962]: W0220 09:57:30.346763 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34f42578_fcc9_4539_add3_bca8deb6927b.slice/crio-266c0ffa64080b0de26767431ef3a6eaf68c965c0e6155bbfba299065fa5d499 WatchSource:0}: Error finding container 266c0ffa64080b0de26767431ef3a6eaf68c965c0e6155bbfba299065fa5d499: Status 404 returned error can't find the container with id 266c0ffa64080b0de26767431ef3a6eaf68c965c0e6155bbfba299065fa5d499 Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.351821 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hqbh2" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.366011 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmtxm\" (UniqueName: \"kubernetes.io/projected/b4ad1819-20e1-406b-8499-5a73780c0a0c-kube-api-access-gmtxm\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.376488 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gqpsl" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.377033 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:57:30 crc kubenswrapper[4962]: E0220 09:57:30.390997 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:57:30.890955251 +0000 UTC m=+142.473427097 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.391187 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-txw2z" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.393829 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hc9h5"] Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.398494 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-dc74p"] Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.401276 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b4ad1819-20e1-406b-8499-5a73780c0a0c-bound-sa-token\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.408252 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-758rq"] Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.411414 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phch9\" (UniqueName: \"kubernetes.io/projected/cb32a62f-c8c5-40a6-9c7f-e456c68bf7c1-kube-api-access-phch9\") pod \"openshift-controller-manager-operator-756b6f6bc6-bmp44\" (UID: \"cb32a62f-c8c5-40a6-9c7f-e456c68bf7c1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bmp44" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.428771 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-tp9zq" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.431362 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xxtf\" (UniqueName: \"kubernetes.io/projected/0240e440-4be2-4607-99c4-636b65e78081-kube-api-access-5xxtf\") pod \"service-ca-9c57cc56f-7q8sx\" (UID: \"0240e440-4be2-4607-99c4-636b65e78081\") " pod="openshift-service-ca/service-ca-9c57cc56f-7q8sx" Feb 20 09:57:30 crc kubenswrapper[4962]: W0220 09:57:30.434099 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod474b1e5d_9a6f_4931_be66_8fb20c82ac60.slice/crio-9828c0062499d2fa3278ca8dc82309caab70298604132cea50fbc62b70815ba1 WatchSource:0}: Error finding container 9828c0062499d2fa3278ca8dc82309caab70298604132cea50fbc62b70815ba1: Status 404 returned error can't find the container with id 9828c0062499d2fa3278ca8dc82309caab70298604132cea50fbc62b70815ba1 Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.451885 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-wndb7"] Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.453697 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjtth\" (UniqueName: \"kubernetes.io/projected/9c993e86-3068-4d07-84b3-655f8308b7ed-kube-api-access-kjtth\") pod \"package-server-manager-789f6589d5-mxcd2\" (UID: \"9c993e86-3068-4d07-84b3-655f8308b7ed\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mxcd2" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.466783 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jgvn\" (UniqueName: 
\"kubernetes.io/projected/a7a9fa76-da75-4847-a539-d1e6bb57da98-kube-api-access-9jgvn\") pod \"machine-api-operator-5694c8668f-ckmh2\" (UID: \"a7a9fa76-da75-4847-a539-d1e6bb57da98\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ckmh2" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.474837 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-xkmtn" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.480004 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:30 crc kubenswrapper[4962]: E0220 09:57:30.480364 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 09:57:30.980332672 +0000 UTC m=+142.562804708 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:30 crc kubenswrapper[4962]: W0220 09:57:30.481306 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77ff4d6a_8c1e_440f_a78c_900c09587848.slice/crio-efff1e8c2910c51e6d22e0d46f64557daaaa8ef4ff3bf6b1f9553b0d1f8ca4b1 WatchSource:0}: Error finding container efff1e8c2910c51e6d22e0d46f64557daaaa8ef4ff3bf6b1f9553b0d1f8ca4b1: Status 404 returned error can't find the container with id efff1e8c2910c51e6d22e0d46f64557daaaa8ef4ff3bf6b1f9553b0d1f8ca4b1 Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.482724 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/674f40ed-74ed-48c2-8036-087ce9e16c94-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6dwbb\" (UID: \"674f40ed-74ed-48c2-8036-087ce9e16c94\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6dwbb" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.494407 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bmp44" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.505293 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rt6vz\" (UniqueName: \"kubernetes.io/projected/bfd57a5c-0892-46a0-8005-0a8f70c146fd-kube-api-access-rt6vz\") pod \"marketplace-operator-79b997595-m7z5r\" (UID: \"bfd57a5c-0892-46a0-8005-0a8f70c146fd\") " pod="openshift-marketplace/marketplace-operator-79b997595-m7z5r" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.527365 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2knp\" (UniqueName: \"kubernetes.io/projected/daef1622-b612-4661-bb6a-63c5997d9a07-kube-api-access-j2knp\") pod \"olm-operator-6b444d44fb-rsf8j\" (UID: \"daef1622-b612-4661-bb6a-63c5997d9a07\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rsf8j" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.557181 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckdpz\" (UniqueName: \"kubernetes.io/projected/b9e0a083-b7e8-4b81-ad1a-03f587f2f46c-kube-api-access-ckdpz\") pod \"dns-default-4mw9f\" (UID: \"b9e0a083-b7e8-4b81-ad1a-03f587f2f46c\") " pod="openshift-dns/dns-default-4mw9f" Feb 20 09:57:30 crc kubenswrapper[4962]: W0220 09:57:30.563138 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d0fd4e8_ba15_4d2f_9602_e887819ea423.slice/crio-7978d084b8b8a0d04cf89ebef1ecc3a95cb54ae2d704f90d8c3142f909490fc2 WatchSource:0}: Error finding container 7978d084b8b8a0d04cf89ebef1ecc3a95cb54ae2d704f90d8c3142f909490fc2: Status 404 returned error can't find the container with id 7978d084b8b8a0d04cf89ebef1ecc3a95cb54ae2d704f90d8c3142f909490fc2 Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.563162 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-vp5tl"] Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.572631 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-k85np"] Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.576379 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pg8k\" (UniqueName: \"kubernetes.io/projected/32025b2b-9232-449f-b7bc-582d81d76430-kube-api-access-2pg8k\") pod \"machine-config-server-l92fq\" (UID: \"32025b2b-9232-449f-b7bc-582d81d76430\") " pod="openshift-machine-config-operator/machine-config-server-l92fq" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.580815 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:57:30 crc kubenswrapper[4962]: E0220 09:57:30.581278 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:57:31.08126269 +0000 UTC m=+142.663734536 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.581378 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-nwfk6" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.597330 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-mrzbm"] Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.598744 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8jt7t"] Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.600346 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7zbm\" (UniqueName: \"kubernetes.io/projected/3721fc4d-6f04-458e-a74c-0fe816908414-kube-api-access-v7zbm\") pod \"machine-config-controller-84d6567774-wd68v\" (UID: \"3721fc4d-6f04-458e-a74c-0fe816908414\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wd68v" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.609874 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hqbh2"] Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.621092 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvk5g\" (UniqueName: \"kubernetes.io/projected/a90b20e7-a8bc-4b8d-b407-f4f31fc96528-kube-api-access-qvk5g\") pod \"service-ca-operator-777779d784-v2nvr\" (UID: \"a90b20e7-a8bc-4b8d-b407-f4f31fc96528\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-v2nvr" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.624453 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5nqkf" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.625284 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7c85c4ba-4bcb-4449-bd63-320f2ff6a116-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-xxdr9\" (UID: \"7c85c4ba-4bcb-4449-bd63-320f2ff6a116\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xxdr9" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.647054 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9ffd\" (UniqueName: \"kubernetes.io/projected/bfc99ac4-00ed-48bf-b95e-fcdd4e6e0800-kube-api-access-l9ffd\") pod \"catalog-operator-68c6474976-g6nc2\" (UID: \"bfc99ac4-00ed-48bf-b95e-fcdd4e6e0800\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g6nc2" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.664176 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcnb5\" (UniqueName: \"kubernetes.io/projected/a3652dbd-dae4-462b-be88-b8a782de8a1c-kube-api-access-hcnb5\") pod \"collect-profiles-29526345-4v6dw\" (UID: \"a3652dbd-dae4-462b-be88-b8a782de8a1c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526345-4v6dw" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.665084 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6dwbb" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.672417 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-ckmh2" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.679728 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wd68v" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.682522 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:30 crc kubenswrapper[4962]: E0220 09:57:30.682905 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 09:57:31.18288746 +0000 UTC m=+142.765359306 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.686440 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xxdr9" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.688505 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hplfv\" (UniqueName: \"kubernetes.io/projected/319cf696-9a12-40dc-9f4a-d80fab9a97f8-kube-api-access-hplfv\") pod \"ingress-canary-7tj4j\" (UID: \"319cf696-9a12-40dc-9f4a-d80fab9a97f8\") " pod="openshift-ingress-canary/ingress-canary-7tj4j" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.702725 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mxcd2" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.708320 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9qbw\" (UniqueName: \"kubernetes.io/projected/5df91f4a-70e8-4036-8ab1-d917af6c8aa4-kube-api-access-s9qbw\") pod \"packageserver-d55dfcdfc-965lm\" (UID: \"5df91f4a-70e8-4036-8ab1-d917af6c8aa4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-965lm" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.708758 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g6nc2" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.714779 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rsf8j" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.721791 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-m7z5r" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.726702 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrw8j\" (UniqueName: \"kubernetes.io/projected/8cb06d17-6188-4cca-84b7-f3d03abb20e8-kube-api-access-hrw8j\") pod \"csi-hostpathplugin-7nh4t\" (UID: \"8cb06d17-6188-4cca-84b7-f3d03abb20e8\") " pod="hostpath-provisioner/csi-hostpathplugin-7nh4t" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.730434 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-7q8sx" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.733664 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-txw2z"] Feb 20 09:57:30 crc kubenswrapper[4962]: W0220 09:57:30.734774 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8161f87_3814_4d02_84ff_b94b8b05c59e.slice/crio-5c096e9566721f19e8e59886a3dcebbecb0051a2d044d1f9485cf0be8b3c8877 WatchSource:0}: Error finding container 5c096e9566721f19e8e59886a3dcebbecb0051a2d044d1f9485cf0be8b3c8877: Status 404 returned error can't find the container with id 5c096e9566721f19e8e59886a3dcebbecb0051a2d044d1f9485cf0be8b3c8877 Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.736539 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-v2nvr" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.749401 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526345-4v6dw" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.755995 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqfmw\" (UniqueName: \"kubernetes.io/projected/28c85fa3-b7b5-4fbc-9afb-fa3eb6b8a75b-kube-api-access-zqfmw\") pod \"etcd-operator-b45778765-lbvml\" (UID: \"28c85fa3-b7b5-4fbc-9afb-fa3eb6b8a75b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lbvml" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.762741 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-4mw9f" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.771545 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvc5z\" (UniqueName: \"kubernetes.io/projected/7e5e4942-63be-4811-8aaa-d6b53a427541-kube-api-access-gvc5z\") pod \"router-default-5444994796-tcwqj\" (UID: \"7e5e4942-63be-4811-8aaa-d6b53a427541\") " pod="openshift-ingress/router-default-5444994796-tcwqj" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.783277 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.783474 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-7nh4t" Feb 20 09:57:30 crc kubenswrapper[4962]: E0220 09:57:30.783699 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:57:31.283683703 +0000 UTC m=+142.866155549 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.793570 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-l92fq" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.800963 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-7tj4j" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.885609 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:30 crc kubenswrapper[4962]: E0220 09:57:30.886126 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 09:57:31.386107878 +0000 UTC m=+142.968579724 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.940261 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-szbwm" event={"ID":"6adbe475-48f9-4ba3-82bd-b36bcd939168","Type":"ContainerStarted","Data":"d232e4bc30df660af52975b6ea9fe11ef1630883cc63947c2efb71dc69dd8c5f"} Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.940346 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-szbwm" event={"ID":"6adbe475-48f9-4ba3-82bd-b36bcd939168","Type":"ContainerStarted","Data":"3bea97da1320becf13fecaed38868cc74c4f54c7308979ccb795e3bbe8eacf06"} Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.940946 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-szbwm" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.947620 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-tv8j9" event={"ID":"7ef2d9f9-34f2-48a6-83eb-689c0fdcac66","Type":"ContainerStarted","Data":"d7df83fe690c7504f718b1b7a49e7fb1d729ec2cbbbbbf7011a63fe6ad057d6b"} Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.947691 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-tv8j9" event={"ID":"7ef2d9f9-34f2-48a6-83eb-689c0fdcac66","Type":"ContainerStarted","Data":"391beeaa328a3138fd621ffd774f8909b7ab17f8cb49f6d3a5048995abcee98c"} Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.948867 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-tv8j9" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.953105 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hc9h5" event={"ID":"75c3ba8d-4548-4407-9188-a785ef05da2c","Type":"ContainerStarted","Data":"7511c197cf19c9bdfdd3db34b5c3ac3859b3a69eee3be1ed5e0a4ecc5cbfc156"} Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.958319 4962 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-szbwm 
container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.958342 4962 patch_prober.go:28] interesting pod/downloads-7954f5f757-tv8j9 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.958380 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-tv8j9" podUID="7ef2d9f9-34f2-48a6-83eb-689c0fdcac66" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.958380 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-szbwm" podUID="6adbe475-48f9-4ba3-82bd-b36bcd939168" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.959030 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-tcwqj" Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.959614 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dc74p" event={"ID":"474b1e5d-9a6f-4931-be66-8fb20c82ac60","Type":"ContainerStarted","Data":"9828c0062499d2fa3278ca8dc82309caab70298604132cea50fbc62b70815ba1"} Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.961608 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-tp9zq"] Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.966753 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-wndb7" event={"ID":"77ff4d6a-8c1e-440f-a78c-900c09587848","Type":"ContainerStarted","Data":"efff1e8c2910c51e6d22e0d46f64557daaaa8ef4ff3bf6b1f9553b0d1f8ca4b1"} Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.968149 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hqbh2" event={"ID":"ac7d606a-36a8-4608-918c-ed88eaf93a6d","Type":"ContainerStarted","Data":"661bc4f85372d8f949cd023db00242c6c96a2e76d1b45cee6f14cb90bb9f7255"} Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.976550 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-758rq" event={"ID":"8da2028c-f296-4f44-b010-b3abec9f6b98","Type":"ContainerStarted","Data":"c9ca7261143890db86b7247b8197f46263302fc4c677314a7e1a1eadf9f9acf2"} Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.989225 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:57:30 crc 
kubenswrapper[4962]: E0220 09:57:30.991046 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:57:31.491027642 +0000 UTC m=+143.073499488 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:30 crc kubenswrapper[4962]: I0220 09:57:30.996327 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-965lm" Feb 20 09:57:31 crc kubenswrapper[4962]: I0220 09:57:31.007790 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-5lf26" event={"ID":"34f42578-fcc9-4539-add3-bca8deb6927b","Type":"ContainerStarted","Data":"266c0ffa64080b0de26767431ef3a6eaf68c965c0e6155bbfba299065fa5d499"} Feb 20 09:57:31 crc kubenswrapper[4962]: I0220 09:57:31.039115 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-jtftl" event={"ID":"37e7b911-da73-4f82-ad0c-d8707547b7a7","Type":"ContainerStarted","Data":"7015c1e84ed6e18e3b8cde213cbeb55a16a919347d04f6b8478889b3d4e6940a"} Feb 20 09:57:31 crc kubenswrapper[4962]: I0220 09:57:31.053139 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-k85np" event={"ID":"dce6ddda-3fcf-40bd-a085-a09f0bb811bf","Type":"ContainerStarted","Data":"3cf4d667ac59419a36246906693804cbe14bf8d01c86c599bfb479efa95801d7"} Feb 20 09:57:31 crc kubenswrapper[4962]: I0220 09:57:31.055056 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-txw2z" event={"ID":"e60f8ca8-5b2f-4b5c-930f-19caf45014ba","Type":"ContainerStarted","Data":"10f5a9c41b0882c8fded0f55c672a5d8b02eef3255b1b1f47cce019ae1469341"} Feb 20 09:57:31 crc kubenswrapper[4962]: I0220 09:57:31.058501 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-lbvml" Feb 20 09:57:31 crc kubenswrapper[4962]: I0220 09:57:31.080605 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bmp44"] Feb 20 09:57:31 crc kubenswrapper[4962]: I0220 09:57:31.091819 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:31 crc kubenswrapper[4962]: E0220 09:57:31.092474 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-20 09:57:31.592455965 +0000 UTC m=+143.174927811 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:31 crc kubenswrapper[4962]: I0220 09:57:31.106644 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gqbxv" event={"ID":"225d1d1d-8168-4489-af91-6a87f28c39ed","Type":"ContainerStarted","Data":"eb1d22c22a294b7deb745d7b54826225299ad5427a1bbcf15e543edc5f3dcc2d"} Feb 20 09:57:31 crc kubenswrapper[4962]: I0220 09:57:31.106679 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gqbxv" event={"ID":"225d1d1d-8168-4489-af91-6a87f28c39ed","Type":"ContainerStarted","Data":"979988254c99b59b69c15f468591a0b229b094bd2ae9aa467a9aa5dbc5efbaaa"} Feb 20 09:57:31 crc kubenswrapper[4962]: I0220 09:57:31.108694 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gqpsl" event={"ID":"1d0fd4e8-ba15-4d2f-9602-e887819ea423","Type":"ContainerStarted","Data":"7978d084b8b8a0d04cf89ebef1ecc3a95cb54ae2d704f90d8c3142f909490fc2"} Feb 20 09:57:31 crc kubenswrapper[4962]: I0220 09:57:31.111553 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" event={"ID":"f8161f87-3814-4d02-84ff-b94b8b05c59e","Type":"ContainerStarted","Data":"5c096e9566721f19e8e59886a3dcebbecb0051a2d044d1f9485cf0be8b3c8877"} Feb 20 09:57:31 crc kubenswrapper[4962]: I0220 09:57:31.114972 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vp5tl" event={"ID":"cba11394-4e55-4edc-beec-750bddabc1d0","Type":"ContainerStarted","Data":"018e8f198a3dd5320e311eef6f8370fe38bcba79bb6f1a512897862b6b92b75d"} Feb 20 09:57:31 crc kubenswrapper[4962]: I0220 09:57:31.169494 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8t82g" event={"ID":"68c1fde1-72ce-4ce0-ade8-9c8e7016464c","Type":"ContainerStarted","Data":"84c2c333f020a07a49798fe3eb6487df75a4cccc90dae7ecdb0347cd6f11a48f"} Feb 20 09:57:31 crc kubenswrapper[4962]: I0220 09:57:31.169551 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8t82g" event={"ID":"68c1fde1-72ce-4ce0-ade8-9c8e7016464c","Type":"ContainerStarted","Data":"e8d37a6a57fda0f8fde14fa0441e67ccd1acf7446ab5259542d4c716ddc9be67"} Feb 20 09:57:31 crc kubenswrapper[4962]: I0220 09:57:31.169577 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4k8pc" event={"ID":"d962f6fe-d955-483d-b149-976a11dd4922","Type":"ContainerStarted","Data":"aae3072d24cd492e29e371f694952c0b1ae60073bfa2edb5ff693cf52c8c575b"} Feb 20 09:57:31 crc kubenswrapper[4962]: I0220 09:57:31.169627 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4k8pc" event={"ID":"d962f6fe-d955-483d-b149-976a11dd4922","Type":"ContainerStarted","Data":"9b0c121e4cc8abab256f58ac54535860c0c99b918fb4ea374e13156c3f11b3ae"} Feb 20 09:57:31 crc kubenswrapper[4962]: I0220 09:57:31.197315 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:57:31 crc kubenswrapper[4962]: E0220 09:57:31.197520 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:57:31.697476542 +0000 UTC m=+143.279948388 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:31 crc kubenswrapper[4962]: I0220 09:57:31.198311 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:31 crc kubenswrapper[4962]: E0220 09:57:31.200240 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 09:57:31.700219049 +0000 UTC m=+143.282690885 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:31 crc kubenswrapper[4962]: I0220 09:57:31.221504 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-nwfk6"] Feb 20 09:57:31 crc kubenswrapper[4962]: I0220 09:57:31.311944 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:57:31 crc kubenswrapper[4962]: E0220 09:57:31.312340 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:57:31.812322221 +0000 UTC m=+143.394794067 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:31 crc kubenswrapper[4962]: I0220 09:57:31.374661 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-wd68v"] Feb 20 09:57:31 crc kubenswrapper[4962]: I0220 09:57:31.436059 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:31 crc kubenswrapper[4962]: E0220 09:57:31.436540 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 09:57:31.936520875 +0000 UTC m=+143.518992721 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:31 crc kubenswrapper[4962]: I0220 09:57:31.540442 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:57:31 crc kubenswrapper[4962]: E0220 09:57:31.540609 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:57:32.040568581 +0000 UTC m=+143.623040427 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:31 crc kubenswrapper[4962]: I0220 09:57:31.540880 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:31 crc kubenswrapper[4962]: E0220 09:57:31.541197 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 09:57:32.041188581 +0000 UTC m=+143.623660427 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:31 crc kubenswrapper[4962]: W0220 09:57:31.607034 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3721fc4d_6f04_458e_a74c_0fe816908414.slice/crio-5f5a9d6123b955c36501b6acb39515b93e5b361ffb3adc8ed4ec1d86b16f075f WatchSource:0}: Error finding container 5f5a9d6123b955c36501b6acb39515b93e5b361ffb3adc8ed4ec1d86b16f075f: Status 404 returned error can't find the container with id 5f5a9d6123b955c36501b6acb39515b93e5b361ffb3adc8ed4ec1d86b16f075f Feb 20 09:57:31 crc kubenswrapper[4962]: I0220 09:57:31.647639 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:57:31 crc kubenswrapper[4962]: E0220 09:57:31.648077 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:57:32.148061437 +0000 UTC m=+143.730533283 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:31 crc kubenswrapper[4962]: I0220 09:57:31.693544 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-tv8j9" podStartSLOduration=122.693529528 podStartE2EDuration="2m2.693529528s" podCreationTimestamp="2026-02-20 09:55:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:57:31.652636682 +0000 UTC m=+143.235108528" watchObservedRunningTime="2026-02-20 09:57:31.693529528 +0000 UTC m=+143.276001374" Feb 20 09:57:31 crc kubenswrapper[4962]: I0220 09:57:31.697436 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xxdr9"] Feb 20 09:57:31 crc kubenswrapper[4962]: I0220 09:57:31.748974 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:31 crc kubenswrapper[4962]: E0220 09:57:31.750296 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 09:57:32.250283005 +0000 UTC m=+143.832754851 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:31 crc kubenswrapper[4962]: I0220 09:57:31.821991 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-v2nvr"] Feb 20 09:57:31 crc kubenswrapper[4962]: I0220 09:57:31.826488 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-xkmtn"] Feb 20 09:57:31 crc kubenswrapper[4962]: I0220 09:57:31.851194 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:57:31 crc kubenswrapper[4962]: E0220 09:57:31.851611 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:57:32.351582794 +0000 UTC m=+143.934054640 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:31 crc kubenswrapper[4962]: I0220 09:57:31.952662 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:31 crc kubenswrapper[4962]: E0220 09:57:31.953515 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 09:57:32.453496663 +0000 UTC m=+144.035968519 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:31 crc kubenswrapper[4962]: W0220 09:57:31.960412 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda90b20e7_a8bc_4b8d_b407_f4f31fc96528.slice/crio-61d226a59e4a534c419d5ea1f4e66a3ee74e9b9f143032d000d2437cc6618af5 WatchSource:0}: Error finding container 61d226a59e4a534c419d5ea1f4e66a3ee74e9b9f143032d000d2437cc6618af5: Status 404 returned error can't find the container with id 61d226a59e4a534c419d5ea1f4e66a3ee74e9b9f143032d000d2437cc6618af5 Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.054614 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:57:32 crc kubenswrapper[4962]: E0220 09:57:32.054762 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:57:32.554734591 +0000 UTC m=+144.137206437 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.054992 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:32 crc kubenswrapper[4962]: E0220 09:57:32.055343 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 09:57:32.555326029 +0000 UTC m=+144.137797875 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.156716 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:57:32 crc kubenswrapper[4962]: E0220 09:57:32.157815 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:57:32.657795536 +0000 UTC m=+144.240267382 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.200492 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hc9h5" event={"ID":"75c3ba8d-4548-4407-9188-a785ef05da2c","Type":"ContainerStarted","Data":"1825648d2577639b0eef3e6e3d827d473411acb221b78700035f5d33dd279261"} Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.204132 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8t82g" event={"ID":"68c1fde1-72ce-4ce0-ade8-9c8e7016464c","Type":"ContainerStarted","Data":"d30ea5fb486c0fc821ae6a1b6ba524bb01c9c8f724e754fe5a584a0d1c8fe783"} Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.210564 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dc74p" event={"ID":"474b1e5d-9a6f-4931-be66-8fb20c82ac60","Type":"ContainerStarted","Data":"8804485d7428611684c404d2b65ab54f06c494819854af7f13ec9e737899d467"} Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.211954 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-k85np" event={"ID":"dce6ddda-3fcf-40bd-a085-a09f0bb811bf","Type":"ContainerStarted","Data":"1a56689b77c2b8e93534ee778609bb99633a7ca90b3cee7a24f40467bc915ef2"} Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.212731 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-k85np" Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.215288 4962 patch_prober.go:28] interesting pod/console-operator-58897d9998-k85np container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get 
\"https://10.217.0.26:8443/readyz\": dial tcp 10.217.0.26:8443: connect: connection refused" start-of-body= Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.215395 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-k85np" podUID="dce6ddda-3fcf-40bd-a085-a09f0bb811bf" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.26:8443/readyz\": dial tcp 10.217.0.26:8443: connect: connection refused" Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.231054 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8jt7t" event={"ID":"b55a13cf-03c6-46d9-b286-960a839b1558","Type":"ContainerStarted","Data":"3119b7ebea6326ada31bb463c372a5d9f7a6209ab5ce0c48cc94342a9d2942e5"} Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.253294 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-szbwm" podStartSLOduration=123.253275631 podStartE2EDuration="2m3.253275631s" podCreationTimestamp="2026-02-20 09:55:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:57:32.242834949 +0000 UTC m=+143.825306795" watchObservedRunningTime="2026-02-20 09:57:32.253275631 +0000 UTC m=+143.835747477" Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.258814 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:32 crc kubenswrapper[4962]: E0220 09:57:32.259064 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 09:57:32.759053374 +0000 UTC m=+144.341525210 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.275143 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8t82g" podStartSLOduration=122.275128223 podStartE2EDuration="2m2.275128223s" podCreationTimestamp="2026-02-20 09:55:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:57:32.274058049 +0000 UTC m=+143.856529895" watchObservedRunningTime="2026-02-20 09:57:32.275128223 +0000 UTC m=+143.857600069" Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.277097 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-jtftl" event={"ID":"37e7b911-da73-4f82-ad0c-d8707547b7a7","Type":"ContainerStarted","Data":"cd56a03ac3d90ac4add9269473ce5a604989c53b0a0255f1569b9bc0f7ebe2fb"} Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.299445 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xxdr9" event={"ID":"7c85c4ba-4bcb-4449-bd63-320f2ff6a116","Type":"ContainerStarted","Data":"e8bddac0932763828dcb0af23a368dc6c942d28566917752b05ca0c3dddcdfdc"} Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.302049 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hqbh2" event={"ID":"ac7d606a-36a8-4608-918c-ed88eaf93a6d","Type":"ContainerStarted","Data":"ccd0adf9d96a51dc5147431e994c9a007279956cfb3aa6b6ea7ae8ddbd115871"} Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.303499 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-tcwqj" event={"ID":"7e5e4942-63be-4811-8aaa-d6b53a427541","Type":"ContainerStarted","Data":"a375dfaf77c6ef6694cbb1091ae581f3820131e5754051883f0ac4cf32012273"} Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.306206 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-l92fq" event={"ID":"32025b2b-9232-449f-b7bc-582d81d76430","Type":"ContainerStarted","Data":"8d13176361d41e90849a4d2ef515174dffaf593bc753444cb3080b0a89087860"} Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.327835 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gqbxv" podStartSLOduration=123.327806902 podStartE2EDuration="2m3.327806902s" podCreationTimestamp="2026-02-20 09:55:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:57:32.311171515 +0000 UTC m=+143.893643351" watchObservedRunningTime="2026-02-20 09:57:32.327806902 +0000 UTC m=+143.910278748" Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.332431 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mxcd2"] Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.336224 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-xkmtn" event={"ID":"0e4e18be-a43b-492a-981e-b4f9aebff1ab","Type":"ContainerStarted","Data":"0a30e34f3229c122e5231b38f019a2b8de7af2ab36e53adcd1f2873936254083"} Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.350619 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bmp44" event={"ID":"cb32a62f-c8c5-40a6-9c7f-e456c68bf7c1","Type":"ContainerStarted","Data":"d5487650c2e8f26da51d7259619288f7bbcb2d90de5bc8676f9c2455cb9b2574"} Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.354733 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-tp9zq" event={"ID":"9e86193f-b3bb-42a8-bccb-00e0cbcbf432","Type":"ContainerStarted","Data":"303f7374a29dc9fa9bf4ee1d09da3f40eae761f7dca4a2d598623fbb79e737a0"} Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.360250 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526345-4v6dw"] Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.361188 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:57:32 crc kubenswrapper[4962]: E0220 09:57:32.361439 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:57:32.861418456 +0000 UTC m=+144.443890302 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.361992 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:32 crc kubenswrapper[4962]: E0220 09:57:32.364072 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 09:57:32.864038989 +0000 UTC m=+144.446519366 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.365946 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wd68v" event={"ID":"3721fc4d-6f04-458e-a74c-0fe816908414","Type":"ContainerStarted","Data":"5f5a9d6123b955c36501b6acb39515b93e5b361ffb3adc8ed4ec1d86b16f075f"} Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.367639 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4k8pc" podStartSLOduration=123.367626313 podStartE2EDuration="2m3.367626313s" podCreationTimestamp="2026-02-20 09:55:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:57:32.366855519 +0000 UTC m=+143.949327365" watchObservedRunningTime="2026-02-20 09:57:32.367626313 +0000 UTC m=+143.950098159" Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.383950 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-758rq" event={"ID":"8da2028c-f296-4f44-b010-b3abec9f6b98","Type":"ContainerStarted","Data":"1ad20ff5602887957aea361e6c7e57f3e5d8544a1efbf9eeddb8e6ace236468c"} Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.384834 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-758rq" Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.386077 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-nwfk6" event={"ID":"09cfdba9-bfda-455d-b13e-58a6ea5a7d5a","Type":"ContainerStarted","Data":"7a175f5752b9da8dd07abe01e0077ca08911cfa3fb3fa2f627ad42bdc14904eb"} Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.393309 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-5lf26" event={"ID":"34f42578-fcc9-4539-add3-bca8deb6927b","Type":"ContainerStarted","Data":"c85de57c4db8642fcb0546ac917d9542ad8dd88da3285417835c7294f2fdcb4b"} Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.394644 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-txw2z" event={"ID":"e60f8ca8-5b2f-4b5c-930f-19caf45014ba","Type":"ContainerStarted","Data":"62421acd7645aecb362856d5c4c72323a0863ab4a460f226e65cec6ba0c4b0ed"} Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.400708 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-v2nvr" event={"ID":"a90b20e7-a8bc-4b8d-b407-f4f31fc96528","Type":"ContainerStarted","Data":"61d226a59e4a534c419d5ea1f4e66a3ee74e9b9f143032d000d2437cc6618af5"} Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.404522 4962 patch_prober.go:28] interesting pod/downloads-7954f5f757-tv8j9 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get 
\"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.404580 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-tv8j9" podUID="7ef2d9f9-34f2-48a6-83eb-689c0fdcac66" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.409623 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-5nqkf"] Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.462737 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:57:32 crc kubenswrapper[4962]: E0220 09:57:32.463299 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:57:32.963220542 +0000 UTC m=+144.545692388 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.464029 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:32 crc kubenswrapper[4962]: E0220 09:57:32.467566 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 09:57:32.967549909 +0000 UTC m=+144.550021755 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.497642 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-szbwm" Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.523058 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-k85np" podStartSLOduration=123.523040427 podStartE2EDuration="2m3.523040427s" podCreationTimestamp="2026-02-20 09:55:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:57:32.507555476 +0000 UTC m=+144.090027322" watchObservedRunningTime="2026-02-20 09:57:32.523040427 +0000 UTC m=+144.105512263" Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.565543 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:57:32 crc kubenswrapper[4962]: E0220 09:57:32.575904 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:57:33.07586286 +0000 UTC m=+144.658334706 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.601203 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-758rq" Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.659533 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-jtftl" podStartSLOduration=123.659513611 podStartE2EDuration="2m3.659513611s" podCreationTimestamp="2026-02-20 09:55:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:57:32.659311804 +0000 UTC m=+144.241783650" watchObservedRunningTime="2026-02-20 09:57:32.659513611 +0000 UTC m=+144.241985457" Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.671111 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:32 crc kubenswrapper[4962]: E0220 09:57:32.671522 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 09:57:33.17150619 +0000 UTC m=+144.753978026 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.742561 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hc9h5" podStartSLOduration=122.742545041 podStartE2EDuration="2m2.742545041s" podCreationTimestamp="2026-02-20 09:55:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:57:32.738872834 +0000 UTC m=+144.321344680" watchObservedRunningTime="2026-02-20 09:57:32.742545041 +0000 UTC m=+144.325016887" Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.776537 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:57:32 crc kubenswrapper[4962]: E0220 09:57:32.777011 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:57:33.276988002 +0000 UTC m=+144.859459848 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.881116 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:32 crc kubenswrapper[4962]: E0220 09:57:32.881826 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 09:57:33.381809023 +0000 UTC m=+144.964280859 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.983193 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:57:32 crc kubenswrapper[4962]: E0220 09:57:32.983838 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:57:33.483815345 +0000 UTC m=+145.066287191 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:32 crc kubenswrapper[4962]: I0220 09:57:32.997875 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-7tj4j"] Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.015459 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hqbh2" podStartSLOduration=123.015432176 podStartE2EDuration="2m3.015432176s" podCreationTimestamp="2026-02-20 09:55:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:57:33.013793444 +0000 UTC m=+144.596265290" watchObservedRunningTime="2026-02-20 09:57:33.015432176 +0000 UTC m=+144.597904022" Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.037351 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g6nc2"] Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.060185 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-4mw9f"] Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.073247 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-7nh4t"] Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.081756 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-758rq" podStartSLOduration=123.081730457 podStartE2EDuration="2m3.081730457s" podCreationTimestamp="2026-02-20 09:55:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:57:33.077225853 +0000 UTC 
m=+144.659697699" watchObservedRunningTime="2026-02-20 09:57:33.081730457 +0000 UTC m=+144.664202303" Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.099469 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:33 crc kubenswrapper[4962]: E0220 09:57:33.100145 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 09:57:33.600126349 +0000 UTC m=+145.182598195 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.105995 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-7q8sx"] Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.109925 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-lbvml"] Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.109971 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-m7z5r"] Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.112835 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rsf8j"] Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.117438 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-txw2z" podStartSLOduration=124.117420987 podStartE2EDuration="2m4.117420987s" podCreationTimestamp="2026-02-20 09:55:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:57:33.11496364 +0000 UTC m=+144.697435486" watchObservedRunningTime="2026-02-20 09:57:33.117420987 +0000 UTC m=+144.699892833" Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.121578 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6dwbb"] Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.165244 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-ckmh2"] Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.200151 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:57:33 crc kubenswrapper[4962]: E0220 
09:57:33.201527 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:57:33.701497131 +0000 UTC m=+145.283968977 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.201631 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:33 crc kubenswrapper[4962]: E0220 09:57:33.202098 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 09:57:33.702077569 +0000 UTC m=+145.284549415 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.207088 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-965lm"] Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.303249 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:57:33 crc kubenswrapper[4962]: E0220 09:57:33.303489 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:57:33.803462282 +0000 UTC m=+145.385934128 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.303574 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:33 crc kubenswrapper[4962]: E0220 09:57:33.303978 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 09:57:33.803965367 +0000 UTC m=+145.386437213 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.404791 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:57:33 crc kubenswrapper[4962]: E0220 09:57:33.405201 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:57:33.905156953 +0000 UTC m=+145.487628799 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.405490 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:33 crc kubenswrapper[4962]: E0220 09:57:33.406937 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 09:57:33.906928349 +0000 UTC m=+145.489400195 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.426343 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g6nc2" event={"ID":"bfc99ac4-00ed-48bf-b95e-fcdd4e6e0800","Type":"ContainerStarted","Data":"443b3cac7afd3608f314f45094307c5aed4aec3bb9bcf251c449de2e6b14bd36"} Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.430752 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5nqkf" event={"ID":"c3febff6-f15f-4ce8-825c-37d86b13c56d","Type":"ContainerStarted","Data":"f2f1692cd20ad2c1f8b5bc92645335660fd9045760c4ee971c36e503d0c5cc41"} Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.435735 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-ckmh2" event={"ID":"a7a9fa76-da75-4847-a539-d1e6bb57da98","Type":"ContainerStarted","Data":"1e420be70bfd095b03c94c8eabb6da93d1778cb968c541a4d24f98c60cf0a857"} Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.440036 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-7nh4t" event={"ID":"8cb06d17-6188-4cca-84b7-f3d03abb20e8","Type":"ContainerStarted","Data":"890ef7d22eed2deafad0de9e66c59e509bb9078677f393ebf15cc4bb08f0bf11"} Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.441306 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-7q8sx" event={"ID":"0240e440-4be2-4607-99c4-636b65e78081","Type":"ContainerStarted","Data":"258bdae986515b2708b3f343587de68829ab40bdb5d44138b42c82fc1fad25f3"} Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.447579 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/marketplace-operator-79b997595-m7z5r" event={"ID":"bfd57a5c-0892-46a0-8005-0a8f70c146fd","Type":"ContainerStarted","Data":"c81440f2bd45daadf6efa1fe9a3de8fa8cfa794ff12c8106c2aad73b69faa130"} Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.465994 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gqpsl" event={"ID":"1d0fd4e8-ba15-4d2f-9602-e887819ea423","Type":"ContainerStarted","Data":"653c2aac4092a0a1999cbb7055660e2eec23fd820e1d9df6c3ac9693bb4d11cc"} Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.466938 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rsf8j" event={"ID":"daef1622-b612-4661-bb6a-63c5997d9a07","Type":"ContainerStarted","Data":"6a1d062b6193d4c5c8f9d78717ac691b2c631795050bcf3ba212d233beabe607"} Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.468080 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6dwbb" event={"ID":"674f40ed-74ed-48c2-8036-087ce9e16c94","Type":"ContainerStarted","Data":"139c754b94229153f4b84281a788404da840094bb1d0a105262d877cf8cb5a0c"} Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.475864 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-tcwqj" event={"ID":"7e5e4942-63be-4811-8aaa-d6b53a427541","Type":"ContainerStarted","Data":"d3889e2e711be3549254cc518553b9bfd0b57367648743f0a855f2983147e2f3"} Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.478306 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-965lm" event={"ID":"5df91f4a-70e8-4036-8ab1-d917af6c8aa4","Type":"ContainerStarted","Data":"ddded4e705c2b3fd75ddf8541ef962e4a3ecd989a60a84d48a40e7b20b9a8421"} Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.492489 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4mw9f" event={"ID":"b9e0a083-b7e8-4b81-ad1a-03f587f2f46c","Type":"ContainerStarted","Data":"8b93339f1d51a9d0cb8f225cdc9c09e5eb5a504bc88354f5124e186878dacb81"} Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.499571 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-jtftl" Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.499648 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-jtftl" Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.506669 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:57:33 crc kubenswrapper[4962]: E0220 09:57:33.507227 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:57:34.007207316 +0000 UTC m=+145.589679162 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.508477 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-tcwqj" podStartSLOduration=124.508455295 podStartE2EDuration="2m4.508455295s" podCreationTimestamp="2026-02-20 09:55:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:57:33.506604927 +0000 UTC m=+145.089076793" watchObservedRunningTime="2026-02-20 09:57:33.508455295 +0000 UTC m=+145.090927141" Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.509350 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526345-4v6dw" event={"ID":"a3652dbd-dae4-462b-be88-b8a782de8a1c","Type":"ContainerStarted","Data":"9cb82e35bd3f3d4fb0aaea07b6c95b315a4035aa82123595066c03e2ec02bbc3"} Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.515721 4962 patch_prober.go:28] interesting pod/apiserver-76f77b778f-jtftl container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Feb 20 09:57:33 crc kubenswrapper[4962]: [+]log ok Feb 20 09:57:33 crc kubenswrapper[4962]: [+]etcd ok Feb 20 09:57:33 crc kubenswrapper[4962]: [+]poststarthook/start-apiserver-admission-initializer ok Feb 20 09:57:33 crc kubenswrapper[4962]: [+]poststarthook/generic-apiserver-start-informers ok Feb 20 09:57:33 crc kubenswrapper[4962]: [+]poststarthook/max-in-flight-filter ok Feb 20 09:57:33 crc kubenswrapper[4962]: [+]poststarthook/storage-object-count-tracker-hook ok Feb 20 09:57:33 crc kubenswrapper[4962]: [+]poststarthook/image.openshift.io-apiserver-caches ok Feb 20 09:57:33 crc kubenswrapper[4962]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Feb 20 09:57:33 crc kubenswrapper[4962]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Feb 20 09:57:33 crc kubenswrapper[4962]: [+]poststarthook/project.openshift.io-projectcache ok Feb 20 09:57:33 crc kubenswrapper[4962]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Feb 20 09:57:33 crc kubenswrapper[4962]: [+]poststarthook/openshift.io-startinformers ok Feb 20 09:57:33 crc kubenswrapper[4962]: [+]poststarthook/openshift.io-restmapperupdater ok Feb 20 09:57:33 crc kubenswrapper[4962]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Feb 20 09:57:33 crc kubenswrapper[4962]: livez check failed Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.515775 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-jtftl" podUID="37e7b911-da73-4f82-ad0c-d8707547b7a7" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.517376 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-lbvml" 
event={"ID":"28c85fa3-b7b5-4fbc-9afb-fa3eb6b8a75b","Type":"ContainerStarted","Data":"cbec25d696c6d5ff8e5be0198c3b52568e7c4618accc34f427c471862e581419"} Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.521879 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mxcd2" event={"ID":"9c993e86-3068-4d07-84b3-655f8308b7ed","Type":"ContainerStarted","Data":"b980330296027c540ad89facdfc9f0e13573c200a01374ad863e804b57231972"} Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.522643 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-7tj4j" event={"ID":"319cf696-9a12-40dc-9f4a-d80fab9a97f8","Type":"ContainerStarted","Data":"d2d3b1d0890dcecf34dcf952d3f23eda2c6d26d1faf52514c85aa14b5ab28740"} Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.528982 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8jt7t" event={"ID":"b55a13cf-03c6-46d9-b286-960a839b1558","Type":"ContainerStarted","Data":"e3edf1e340c556e0c6a8d4c9a2675f28f20603127816955312d7521dd57cf9a4"} Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.533040 4962 patch_prober.go:28] interesting pod/downloads-7954f5f757-tv8j9 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.533094 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-tv8j9" podUID="7ef2d9f9-34f2-48a6-83eb-689c0fdcac66" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.535004 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-tp9zq" event={"ID":"9e86193f-b3bb-42a8-bccb-00e0cbcbf432","Type":"ContainerStarted","Data":"08df1ea0ff1879cfe156ee1a1acd0b7b9313f8fdf8c1bda0d53f7d3ee0a797f1"} Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.554308 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-tp9zq" podStartSLOduration=124.554285897 podStartE2EDuration="2m4.554285897s" podCreationTimestamp="2026-02-20 09:55:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:57:33.552922534 +0000 UTC m=+145.135394380" watchObservedRunningTime="2026-02-20 09:57:33.554285897 +0000 UTC m=+145.136757753" Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.609543 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:33 crc kubenswrapper[4962]: E0220 09:57:33.611388 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-20 09:57:34.111361996 +0000 UTC m=+145.693833842 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.711763 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:57:33 crc kubenswrapper[4962]: E0220 09:57:33.712127 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:57:34.212113068 +0000 UTC m=+145.794584914 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.848988 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:33 crc kubenswrapper[4962]: E0220 09:57:33.850216 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 09:57:34.350198352 +0000 UTC m=+145.932670188 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.953723 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:57:33 crc kubenswrapper[4962]: E0220 09:57:33.955434 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:57:34.455415616 +0000 UTC m=+146.037887462 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.961338 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-tcwqj" Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.963174 4962 patch_prober.go:28] interesting pod/router-default-5444994796-tcwqj container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Feb 20 09:57:33 crc kubenswrapper[4962]: I0220 09:57:33.963247 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tcwqj" podUID="7e5e4942-63be-4811-8aaa-d6b53a427541" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.058299 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:34 crc kubenswrapper[4962]: E0220 09:57:34.059133 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 09:57:34.559117691 +0000 UTC m=+146.141589537 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.159572 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:57:34 crc kubenswrapper[4962]: E0220 09:57:34.160428 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:57:34.660386469 +0000 UTC m=+146.242858315 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.160633 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:34 crc kubenswrapper[4962]: E0220 09:57:34.160966 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 09:57:34.660951797 +0000 UTC m=+146.243423643 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.262552 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:57:34 crc kubenswrapper[4962]: E0220 09:57:34.263825 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:57:34.763805036 +0000 UTC m=+146.346276882 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.365363 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:34 crc kubenswrapper[4962]: E0220 09:57:34.365683 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 09:57:34.865669893 +0000 UTC m=+146.448141729 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.471097 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:57:34 crc kubenswrapper[4962]: E0220 09:57:34.471377 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:57:34.97132795 +0000 UTC m=+146.553799796 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.471857 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:34 crc kubenswrapper[4962]: E0220 09:57:34.472171 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 09:57:34.972156776 +0000 UTC m=+146.554628622 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.484733 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-k85np" Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.577008 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:57:34 crc kubenswrapper[4962]: E0220 09:57:34.577740 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:57:35.07771632 +0000 UTC m=+146.660188166 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.582353 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526345-4v6dw" event={"ID":"a3652dbd-dae4-462b-be88-b8a782de8a1c","Type":"ContainerStarted","Data":"e8531bc42f535f5fdb200b255f9b4197b26b17fb00311859ea4571ac343f8767"} Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.618624 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-wndb7" event={"ID":"77ff4d6a-8c1e-440f-a78c-900c09587848","Type":"ContainerStarted","Data":"5c1664ea6d1342cde3ee939a67b0d6fab903a6ff0752970679026e989c5b22da"} Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.630347 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29526345-4v6dw" podStartSLOduration=125.630329278 podStartE2EDuration="2m5.630329278s" podCreationTimestamp="2026-02-20 09:55:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:57:34.628728496 +0000 UTC m=+146.211200342" watchObservedRunningTime="2026-02-20 09:57:34.630329278 +0000 UTC m=+146.212801114" Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.658827 4962 generic.go:334] "Generic (PLEG): container finished" podID="0e4e18be-a43b-492a-981e-b4f9aebff1ab" containerID="30029afbf222fa2dec68b5070c70cd75e24e973f608bf3f26517175df7acbb18" exitCode=0 Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.658903 4962 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-xkmtn" event={"ID":"0e4e18be-a43b-492a-981e-b4f9aebff1ab","Type":"ContainerDied","Data":"30029afbf222fa2dec68b5070c70cd75e24e973f608bf3f26517175df7acbb18"} Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.685447 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:34 crc kubenswrapper[4962]: E0220 09:57:34.685766 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 09:57:35.185754063 +0000 UTC m=+146.768225909 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.704192 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wd68v" event={"ID":"3721fc4d-6f04-458e-a74c-0fe816908414","Type":"ContainerStarted","Data":"2bdfa323009c4f4a85d15bb613412a29a700169e00232fe659c46b40638f9a93"} Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.725163 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-7tj4j" event={"ID":"319cf696-9a12-40dc-9f4a-d80fab9a97f8","Type":"ContainerStarted","Data":"815f1756cdaba49e706bd4304aad70e20e24ade6e7f969af3b41c9f3d5767389"} Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.731944 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-nwfk6" event={"ID":"09cfdba9-bfda-455d-b13e-58a6ea5a7d5a","Type":"ContainerStarted","Data":"8efe068ae3db37985f3f075f7b5d35cc12007aa816e53b833f3b7fb4a6ba9127"} Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.772634 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g6nc2" event={"ID":"bfc99ac4-00ed-48bf-b95e-fcdd4e6e0800","Type":"ContainerStarted","Data":"6847c2056385d4b2dd02a42a9a9d446c50c08eafa2f1e6cc3d0c3558968bff47"} Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.772997 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g6nc2" Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.783109 4962 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-g6nc2 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:8443/healthz\": dial tcp 10.217.0.36:8443: connect: connection refused" start-of-body= Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.783169 4962 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g6nc2" podUID="bfc99ac4-00ed-48bf-b95e-fcdd4e6e0800" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.36:8443/healthz\": dial tcp 10.217.0.36:8443: connect: connection refused" Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.786667 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:57:34 crc kubenswrapper[4962]: E0220 09:57:34.787315 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:57:35.287294561 +0000 UTC m=+146.869766407 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.788384 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dc74p" event={"ID":"474b1e5d-9a6f-4931-be66-8fb20c82ac60","Type":"ContainerStarted","Data":"fd8fc214b400152a5477878bb64f315d0e48dee95f1111925643f740c4d287b9"} Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.790795 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8jt7t" event={"ID":"b55a13cf-03c6-46d9-b286-960a839b1558","Type":"ContainerStarted","Data":"a2337c63c0d070eb3f42eff63f1912b6c6c0c51643c6c305ca770b392ac7df8a"} Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.792068 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5nqkf" event={"ID":"c3febff6-f15f-4ce8-825c-37d86b13c56d","Type":"ContainerStarted","Data":"57b3e3f27f28e048385a642c6e636092605d8051a0ecf5d01b844ae14fcbcd98"} Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.793013 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rsf8j" event={"ID":"daef1622-b612-4661-bb6a-63c5997d9a07","Type":"ContainerStarted","Data":"623f5a83713eae9ae66915a5f02b66fb1c906e36ace1bfe89b27459ea8063b58"} Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.793635 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rsf8j" Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.803715 4962 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-rsf8j container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.22:8443/healthz\": dial tcp 10.217.0.22:8443: connect: connection refused" start-of-body= Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.803774 4962 prober.go:107] "Probe 
failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rsf8j" podUID="daef1622-b612-4661-bb6a-63c5997d9a07" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.22:8443/healthz\": dial tcp 10.217.0.22:8443: connect: connection refused" Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.822817 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-7q8sx" event={"ID":"0240e440-4be2-4607-99c4-636b65e78081","Type":"ContainerStarted","Data":"2fd6ed07b48189abc8523e249d3b188bdfe101528afc68686b75589efb8e5e19"} Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.835402 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-m7z5r" event={"ID":"bfd57a5c-0892-46a0-8005-0a8f70c146fd","Type":"ContainerStarted","Data":"3f40bf15a59c7c1070b5bd6c7194b1d261fd486ea26ee5fe5a2136eea6b42104"} Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.836258 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-m7z5r" Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.844694 4962 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-m7z5r container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" start-of-body= Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.844725 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-m7z5r" podUID="bfd57a5c-0892-46a0-8005-0a8f70c146fd" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.846781 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-v2nvr" event={"ID":"a90b20e7-a8bc-4b8d-b407-f4f31fc96528","Type":"ContainerStarted","Data":"a99251a9a36d63a1de6dacdab4d78da084782809d03e524d2845e8f4dddfee56"} Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.865417 4962 generic.go:334] "Generic (PLEG): container finished" podID="cba11394-4e55-4edc-beec-750bddabc1d0" containerID="0bca8520e1be4b3ef76bf6c2a482d5c0ab095f3e2903c5da2f7a07119447c61a" exitCode=0 Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.865510 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vp5tl" event={"ID":"cba11394-4e55-4edc-beec-750bddabc1d0","Type":"ContainerDied","Data":"0bca8520e1be4b3ef76bf6c2a482d5c0ab095f3e2903c5da2f7a07119447c61a"} Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.887840 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-nwfk6" podStartSLOduration=125.887818945 podStartE2EDuration="2m5.887818945s" podCreationTimestamp="2026-02-20 09:55:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:57:34.881799164 +0000 UTC m=+146.464271010" watchObservedRunningTime="2026-02-20 09:57:34.887818945 +0000 UTC m=+146.470290791" Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.888044 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wd68v" podStartSLOduration=124.888039612 podStartE2EDuration="2m4.888039612s" podCreationTimestamp="2026-02-20 09:55:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:57:34.768380271 +0000 UTC m=+146.350852117" watchObservedRunningTime="2026-02-20 09:57:34.888039612 +0000 UTC m=+146.470511458" Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.888701 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.890563 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-5lf26" event={"ID":"34f42578-fcc9-4539-add3-bca8deb6927b","Type":"ContainerStarted","Data":"9723cce7b81e12ca5ae6a0cf324a742ef329141e85ac2bbb0e1234b2b48301c9"} Feb 20 09:57:34 crc kubenswrapper[4962]: E0220 09:57:34.891856 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 09:57:35.391842662 +0000 UTC m=+146.974314508 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.914235 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-7tj4j" podStartSLOduration=7.91420019 podStartE2EDuration="7.91420019s" podCreationTimestamp="2026-02-20 09:57:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:57:34.912745305 +0000 UTC m=+146.495217151" watchObservedRunningTime="2026-02-20 09:57:34.91420019 +0000 UTC m=+146.496672036" Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.920571 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xxdr9" event={"ID":"7c85c4ba-4bcb-4449-bd63-320f2ff6a116","Type":"ContainerStarted","Data":"44b500957f3dd40a8c062d7793bb38ee7f4e90c96eec1518e1be022170a3ed13"} Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.939807 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" event={"ID":"f8161f87-3814-4d02-84ff-b94b8b05c59e","Type":"ContainerStarted","Data":"7cda19df238dbbc1faa67326833d89d8ee6218b1c4e8291b2668f05c3b4f21bf"} Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.940831 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" Feb 20 09:57:34 crc 
kubenswrapper[4962]: I0220 09:57:34.947948 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mxcd2" event={"ID":"9c993e86-3068-4d07-84b3-655f8308b7ed","Type":"ContainerStarted","Data":"5e4ea3b700baf309053aa9ba3f2593cde7c92ddcbb20f78cab4d07dec31a031e"} Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.949027 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bmp44" event={"ID":"cb32a62f-c8c5-40a6-9c7f-e456c68bf7c1","Type":"ContainerStarted","Data":"fde666550b488c9bfb71c2f2d3ec18c4adb04ae0eed789c39e4bfeac488c04fb"} Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.951043 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-l92fq" event={"ID":"32025b2b-9232-449f-b7bc-582d81d76430","Type":"ContainerStarted","Data":"782c84f5d1345ec0f04dc5e87e15446209ef1f59e146f06988c620f04d7c356e"} Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.954261 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-5lf26" podStartSLOduration=125.954243319 podStartE2EDuration="2m5.954243319s" podCreationTimestamp="2026-02-20 09:55:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:57:34.946007108 +0000 UTC m=+146.528478954" watchObservedRunningTime="2026-02-20 09:57:34.954243319 +0000 UTC m=+146.536715165" Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.961378 4962 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-mrzbm container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.11:6443/healthz\": dial tcp 10.217.0.11:6443: connect: connection refused" start-of-body= Feb 20 09:57:34 crc kubenswrapper[4962]: I0220 09:57:34.961436 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" podUID="f8161f87-3814-4d02-84ff-b94b8b05c59e" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.11:6443/healthz\": dial tcp 10.217.0.11:6443: connect: connection refused" Feb 20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.004787 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.017701 4962 patch_prober.go:28] interesting pod/router-default-5444994796-tcwqj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 20 09:57:35 crc kubenswrapper[4962]: [-]has-synced failed: reason withheld Feb 20 09:57:35 crc kubenswrapper[4962]: [+]process-running ok Feb 20 09:57:35 crc kubenswrapper[4962]: healthz check failed Feb 20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.017785 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tcwqj" podUID="7e5e4942-63be-4811-8aaa-d6b53a427541" containerName="router" probeResult="failure" 
output="HTTP probe failed with statuscode: 500" Feb 20 09:57:35 crc kubenswrapper[4962]: E0220 09:57:35.018792 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:57:35.518774304 +0000 UTC m=+147.101246150 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.051255 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dc74p" podStartSLOduration=125.051229821 podStartE2EDuration="2m5.051229821s" podCreationTimestamp="2026-02-20 09:55:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:57:35.003617273 +0000 UTC m=+146.586089119" watchObservedRunningTime="2026-02-20 09:57:35.051229821 +0000 UTC m=+146.633701667" Feb 20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.079265 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g6nc2" podStartSLOduration=125.079228948 podStartE2EDuration="2m5.079228948s" podCreationTimestamp="2026-02-20 09:55:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:57:35.049519338 +0000 UTC m=+146.631991194" watchObservedRunningTime="2026-02-20 09:57:35.079228948 +0000 UTC m=+146.661700794" Feb 20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.102723 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-7q8sx" podStartSLOduration=125.102699472 podStartE2EDuration="2m5.102699472s" podCreationTimestamp="2026-02-20 09:55:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:57:35.101021989 +0000 UTC m=+146.683493835" watchObservedRunningTime="2026-02-20 09:57:35.102699472 +0000 UTC m=+146.685171318" Feb 20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.132246 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:35 crc kubenswrapper[4962]: E0220 09:57:35.138047 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 09:57:35.638023691 +0000 UTC m=+147.220495757 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.151843 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8jt7t" podStartSLOduration=126.151802828 podStartE2EDuration="2m6.151802828s" podCreationTimestamp="2026-02-20 09:55:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:57:35.149005289 +0000 UTC m=+146.731477135" watchObservedRunningTime="2026-02-20 09:57:35.151802828 +0000 UTC m=+146.734274674" Feb 20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.194562 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rsf8j" podStartSLOduration=125.194538402 podStartE2EDuration="2m5.194538402s" podCreationTimestamp="2026-02-20 09:55:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:57:35.186360413 +0000 UTC m=+146.768832259" watchObservedRunningTime="2026-02-20 09:57:35.194538402 +0000 UTC m=+146.777010248" Feb 20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.199658 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.200460 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.200662 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.204007 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Feb 20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.204043 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Feb 20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.223079 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-m7z5r" podStartSLOduration=125.223035075 podStartE2EDuration="2m5.223035075s" podCreationTimestamp="2026-02-20 09:55:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:57:35.219452381 +0000 UTC m=+146.801924227" watchObservedRunningTime="2026-02-20 09:57:35.223035075 +0000 UTC m=+146.805506921" Feb 20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.233261 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:57:35 crc kubenswrapper[4962]: E0220 09:57:35.233686 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:57:35.733669821 +0000 UTC m=+147.316141667 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.333450 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-v2nvr" podStartSLOduration=125.333421051 podStartE2EDuration="2m5.333421051s" podCreationTimestamp="2026-02-20 09:55:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:57:35.279970869 +0000 UTC m=+146.862442715" watchObservedRunningTime="2026-02-20 09:57:35.333421051 +0000 UTC m=+146.915892897" Feb 20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.336502 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a07ea40e-b9be-4e90-bf7a-293fa009e7d2-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"a07ea40e-b9be-4e90-bf7a-293fa009e7d2\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.336538 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a07ea40e-b9be-4e90-bf7a-293fa009e7d2-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"a07ea40e-b9be-4e90-bf7a-293fa009e7d2\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.336613 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:35 crc kubenswrapper[4962]: E0220 09:57:35.337633 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 09:57:35.837616725 +0000 UTC m=+147.420088571 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.383300 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xxdr9" podStartSLOduration=126.383275001 podStartE2EDuration="2m6.383275001s" podCreationTimestamp="2026-02-20 09:55:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:57:35.33809508 +0000 UTC m=+146.920566926" watchObservedRunningTime="2026-02-20 09:57:35.383275001 +0000 UTC m=+146.965746847" Feb 20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.409939 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" podStartSLOduration=126.409914655 podStartE2EDuration="2m6.409914655s" podCreationTimestamp="2026-02-20 09:55:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:57:35.385163761 +0000 UTC m=+146.967635607" watchObservedRunningTime="2026-02-20 09:57:35.409914655 +0000 UTC m=+146.992386501" Feb 20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.443164 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bmp44" podStartSLOduration=126.443145808 podStartE2EDuration="2m6.443145808s" podCreationTimestamp="2026-02-20 09:55:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:57:35.441910269 +0000 UTC m=+147.024382115" watchObservedRunningTime="2026-02-20 09:57:35.443145808 +0000 UTC m=+147.025617654" Feb 20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.443308 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.443518 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a07ea40e-b9be-4e90-bf7a-293fa009e7d2-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"a07ea40e-b9be-4e90-bf7a-293fa009e7d2\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.443548 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a07ea40e-b9be-4e90-bf7a-293fa009e7d2-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"a07ea40e-b9be-4e90-bf7a-293fa009e7d2\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.443766 4962 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a07ea40e-b9be-4e90-bf7a-293fa009e7d2-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"a07ea40e-b9be-4e90-bf7a-293fa009e7d2\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 20 09:57:35 crc kubenswrapper[4962]: E0220 09:57:35.443857 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:57:35.94383886 +0000 UTC m=+147.526310706 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.471654 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a07ea40e-b9be-4e90-bf7a-293fa009e7d2-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"a07ea40e-b9be-4e90-bf7a-293fa009e7d2\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.471928 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-l92fq" podStartSLOduration=8.471902919 podStartE2EDuration="8.471902919s" podCreationTimestamp="2026-02-20 09:57:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:57:35.466312272 +0000 UTC m=+147.048784118" watchObservedRunningTime="2026-02-20 09:57:35.471902919 +0000 UTC m=+147.054374765" Feb 20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.542174 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.545367 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:35 crc kubenswrapper[4962]: E0220 09:57:35.545850 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 09:57:36.045826701 +0000 UTC m=+147.628298717 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.646702 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:57:35 crc kubenswrapper[4962]: E0220 09:57:35.647008 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:57:36.146957324 +0000 UTC m=+147.729429170 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.647850 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:35 crc kubenswrapper[4962]: E0220 09:57:35.648283 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 09:57:36.148263166 +0000 UTC m=+147.730735012 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.748803 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:57:35 crc kubenswrapper[4962]: E0220 09:57:35.749059 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:57:36.249012168 +0000 UTC m=+147.831484014 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.749153 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:35 crc kubenswrapper[4962]: E0220 09:57:35.749687 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 09:57:36.249671469 +0000 UTC m=+147.832143315 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.850633 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:57:35 crc kubenswrapper[4962]: E0220 09:57:35.851069 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:57:36.351047081 +0000 UTC m=+147.933518927 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.952280 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:35 crc kubenswrapper[4962]: E0220 09:57:35.952793 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 09:57:36.452768233 +0000 UTC m=+148.035240069 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.965888 4962 patch_prober.go:28] interesting pod/router-default-5444994796-tcwqj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 20 09:57:35 crc kubenswrapper[4962]: [-]has-synced failed: reason withheld Feb 20 09:57:35 crc kubenswrapper[4962]: [+]process-running ok Feb 20 09:57:35 crc kubenswrapper[4962]: healthz check failed Feb 20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.965985 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tcwqj" podUID="7e5e4942-63be-4811-8aaa-d6b53a427541" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.978541 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.979254 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-7nh4t" event={"ID":"8cb06d17-6188-4cca-84b7-f3d03abb20e8","Type":"ContainerStarted","Data":"2d835a5c085a62d70bc8de6c2cbbf96b10458ff46ab6c5eb78e948e87cbfd48a"} Feb 20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.983332 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wd68v" event={"ID":"3721fc4d-6f04-458e-a74c-0fe816908414","Type":"ContainerStarted","Data":"fc0a665ba0187f4519015f863c75c4e7b1fc1d7e9d8e5acbdc182b234426711b"} Feb 20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.986008 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gqpsl" event={"ID":"1d0fd4e8-ba15-4d2f-9602-e887819ea423","Type":"ContainerStarted","Data":"f52f197f1e495c3db1e3057530f87d97247fe176ad48fc7d93be58c3826b76cc"} Feb 20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.990821 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-965lm" event={"ID":"5df91f4a-70e8-4036-8ab1-d917af6c8aa4","Type":"ContainerStarted","Data":"8507e85e72f1b098bf2eb294ee7cb575a3e07472243bb92e708a75b27f238ef6"} Feb 20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.991668 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-965lm" Feb 20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.994981 4962 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-965lm container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.39:5443/healthz\": dial tcp 10.217.0.39:5443: connect: connection refused" start-of-body= Feb 20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.995030 4962 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-965lm" podUID="5df91f4a-70e8-4036-8ab1-d917af6c8aa4" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.39:5443/healthz\": dial tcp 10.217.0.39:5443: connect: connection refused" Feb 20 09:57:35 crc kubenswrapper[4962]: I0220 09:57:35.998970 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vp5tl" event={"ID":"cba11394-4e55-4edc-beec-750bddabc1d0","Type":"ContainerStarted","Data":"bf64eb1191de2992ecf1fe7024d9c2fb8f001c7596f96f6b6660c08cf585d672"} Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.008333 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4mw9f" event={"ID":"b9e0a083-b7e8-4b81-ad1a-03f587f2f46c","Type":"ContainerStarted","Data":"9c23099cf62e1856118f9e256d60bc14a4eaabb786daaaefefdfbc09d7272bed"} Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.008374 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4mw9f" event={"ID":"b9e0a083-b7e8-4b81-ad1a-03f587f2f46c","Type":"ContainerStarted","Data":"c8b290a446b1a0ff68897fb714b64c2c886fd7c064948346afa4d53169d5090b"} Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.008777 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-4mw9f" Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.013162 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gqpsl" podStartSLOduration=127.013142316 podStartE2EDuration="2m7.013142316s" podCreationTimestamp="2026-02-20 09:55:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:57:36.012374261 +0000 UTC m=+147.594846117" watchObservedRunningTime="2026-02-20 09:57:36.013142316 +0000 UTC m=+147.595614162" Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.015430 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5nqkf" event={"ID":"c3febff6-f15f-4ce8-825c-37d86b13c56d","Type":"ContainerStarted","Data":"26c1d96111b69416936f5ec08aca7b6b3d82f231dcc1fb41a30595d176a2b78a"} Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.030520 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-ckmh2" event={"ID":"a7a9fa76-da75-4847-a539-d1e6bb57da98","Type":"ContainerStarted","Data":"1a98b0ae0814bfea7cc97db524565b02c5a34c29f92af1c6afdbfecf632676a2"} Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.030621 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-ckmh2" event={"ID":"a7a9fa76-da75-4847-a539-d1e6bb57da98","Type":"ContainerStarted","Data":"869b047e6d8c83815379af5cc852e044730609a76341d85f68e1eaacce78d4fd"} Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.035535 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-965lm" podStartSLOduration=126.035519245 podStartE2EDuration="2m6.035519245s" podCreationTimestamp="2026-02-20 09:55:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:57:36.034814972 +0000 UTC m=+147.617286818" 
watchObservedRunningTime="2026-02-20 09:57:36.035519245 +0000 UTC m=+147.617991091" Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.039967 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6dwbb" event={"ID":"674f40ed-74ed-48c2-8036-087ce9e16c94","Type":"ContainerStarted","Data":"9cffb7dc2af2b1a98935ee59a85ac9e57c1c54d7fb4bb9a7c80dd05dece660ce"} Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.043639 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-wndb7" event={"ID":"77ff4d6a-8c1e-440f-a78c-900c09587848","Type":"ContainerStarted","Data":"90810cb9fd7bdb3cb8ae0a1a0443445cc9e7ecae14c9e37dea5779b65a7767d0"} Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.054153 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:57:36 crc kubenswrapper[4962]: E0220 09:57:36.054378 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:57:36.554337611 +0000 UTC m=+148.136809457 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.054459 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.054631 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.054861 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.054902 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:57:36 crc kubenswrapper[4962]: E0220 09:57:36.057926 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 09:57:36.557908984 +0000 UTC m=+148.140380830 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.059273 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.063767 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mxcd2" event={"ID":"9c993e86-3068-4d07-84b3-655f8308b7ed","Type":"ContainerStarted","Data":"51741528a20919ae4db90e7b195e0c13fb3633bd047b993855bd1644447145bc"} Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.064852 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.064880 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mxcd2" Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.065161 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.070747 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.071236 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-4mw9f" podStartSLOduration=9.071222576 podStartE2EDuration="9.071222576s" podCreationTimestamp="2026-02-20 09:57:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:57:36.07072203 +0000 UTC m=+147.653193876" watchObservedRunningTime="2026-02-20 09:57:36.071222576 +0000 UTC m=+147.653694422" Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.079357 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-lbvml" event={"ID":"28c85fa3-b7b5-4fbc-9afb-fa3eb6b8a75b","Type":"ContainerStarted","Data":"6df0b4009565b100e9117e0b6f2259dcd8593ad7d6d9ab74f016f684565c1aac"} Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.099760 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-xkmtn" event={"ID":"0e4e18be-a43b-492a-981e-b4f9aebff1ab","Type":"ContainerStarted","Data":"9bc76df3e93c4ee55a39198dbe10bbace1254f7e66e141cd72eacdb0c8f0ebe4"} Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.102329 4962 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-rsf8j container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.22:8443/healthz\": dial tcp 10.217.0.22:8443: connect: connection refused" start-of-body= Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.102382 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rsf8j" podUID="daef1622-b612-4661-bb6a-63c5997d9a07" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.22:8443/healthz\": dial tcp 10.217.0.22:8443: connect: connection refused" Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.102638 4962 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-m7z5r container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" start-of-body= Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.102721 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-m7z5r" podUID="bfd57a5c-0892-46a0-8005-0a8f70c146fd" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.103696 4962 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-g6nc2 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:8443/healthz\": dial tcp 10.217.0.36:8443: connect: connection refused" start-of-body= Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.103763 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g6nc2" podUID="bfc99ac4-00ed-48bf-b95e-fcdd4e6e0800" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.36:8443/healthz\": 
dial tcp 10.217.0.36:8443: connect: connection refused" Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.114574 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vp5tl" podStartSLOduration=126.114544458 podStartE2EDuration="2m6.114544458s" podCreationTimestamp="2026-02-20 09:55:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:57:36.112207234 +0000 UTC m=+147.694679090" watchObservedRunningTime="2026-02-20 09:57:36.114544458 +0000 UTC m=+147.697016304" Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.156044 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.156531 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:57:36 crc kubenswrapper[4962]: E0220 09:57:36.160611 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:57:36.660573817 +0000 UTC m=+148.243045663 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.187498 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-wndb7" podStartSLOduration=126.187480279 podStartE2EDuration="2m6.187480279s" podCreationTimestamp="2026-02-20 09:55:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:57:36.171305296 +0000 UTC m=+147.753777142" watchObservedRunningTime="2026-02-20 09:57:36.187480279 +0000 UTC m=+147.769952125" Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.188183 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.200783 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5nqkf" podStartSLOduration=127.200710258 podStartE2EDuration="2m7.200710258s" podCreationTimestamp="2026-02-20 09:55:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:57:36.13921106 +0000 UTC m=+147.721682906" watchObservedRunningTime="2026-02-20 09:57:36.200710258 +0000 UTC m=+147.783182114" Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.230209 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mxcd2" podStartSLOduration=126.230190962 podStartE2EDuration="2m6.230190962s" podCreationTimestamp="2026-02-20 09:55:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:57:36.230017276 +0000 UTC m=+147.812489122" watchObservedRunningTime="2026-02-20 09:57:36.230190962 +0000 UTC m=+147.812662808" Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.253332 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-ckmh2" podStartSLOduration=126.253311235 podStartE2EDuration="2m6.253311235s" podCreationTimestamp="2026-02-20 09:55:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:57:36.250041821 +0000 UTC m=+147.832513667" watchObservedRunningTime="2026-02-20 09:57:36.253311235 +0000 UTC m=+147.835783081" Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.259224 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:36 crc kubenswrapper[4962]: E0220 09:57:36.266258 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 09:57:36.766240284 +0000 UTC m=+148.348712130 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.281304 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-xkmtn" podStartSLOduration=127.28127136 podStartE2EDuration="2m7.28127136s" podCreationTimestamp="2026-02-20 09:55:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:57:36.28126306 +0000 UTC m=+147.863734906" watchObservedRunningTime="2026-02-20 09:57:36.28127136 +0000 UTC m=+147.863743206" Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.393697 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.400182 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.400376 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:57:36 crc kubenswrapper[4962]: E0220 09:57:36.400816 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:57:36.900794487 +0000 UTC m=+148.483266333 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.460064 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6dwbb" podStartSLOduration=126.460047484 podStartE2EDuration="2m6.460047484s" podCreationTimestamp="2026-02-20 09:55:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:57:36.332002558 +0000 UTC m=+147.914474404" watchObservedRunningTime="2026-02-20 09:57:36.460047484 +0000 UTC m=+148.042519330" Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.477063 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-xkmtn" Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.502368 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:36 crc kubenswrapper[4962]: E0220 09:57:36.502885 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 09:57:37.002868641 +0000 UTC m=+148.585340487 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.580027 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.604578 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:57:36 crc kubenswrapper[4962]: E0220 09:57:36.604784 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:57:37.104742008 +0000 UTC m=+148.687213854 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.604902 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:36 crc kubenswrapper[4962]: E0220 09:57:36.605225 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 09:57:37.105190642 +0000 UTC m=+148.687662488 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.649268 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-lbvml" podStartSLOduration=127.649249298 podStartE2EDuration="2m7.649249298s" podCreationTimestamp="2026-02-20 09:55:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:57:36.463313458 +0000 UTC m=+148.045785304" watchObservedRunningTime="2026-02-20 09:57:36.649249298 +0000 UTC m=+148.231721144" Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.706431 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:57:36 crc kubenswrapper[4962]: E0220 09:57:36.706835 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:57:37.206820452 +0000 UTC m=+148.789292298 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.808162 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:36 crc kubenswrapper[4962]: E0220 09:57:36.808622 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 09:57:37.308586767 +0000 UTC m=+148.891058613 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.908939 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:57:36 crc kubenswrapper[4962]: E0220 09:57:36.909129 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:57:37.409101651 +0000 UTC m=+148.991573497 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.909796 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:36 crc kubenswrapper[4962]: E0220 09:57:36.910190 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 09:57:37.410181685 +0000 UTC m=+148.992653591 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.970287 4962 patch_prober.go:28] interesting pod/router-default-5444994796-tcwqj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 20 09:57:36 crc kubenswrapper[4962]: [-]has-synced failed: reason withheld Feb 20 09:57:36 crc kubenswrapper[4962]: [+]process-running ok Feb 20 09:57:36 crc kubenswrapper[4962]: healthz check failed Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.970343 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tcwqj" podUID="7e5e4942-63be-4811-8aaa-d6b53a427541" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 20 09:57:36 crc kubenswrapper[4962]: I0220 09:57:36.997297 4962 csr.go:261] certificate signing request csr-pmnxg is approved, waiting to be issued Feb 20 09:57:37 crc kubenswrapper[4962]: I0220 09:57:37.011644 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:57:37 crc kubenswrapper[4962]: E0220 09:57:37.012136 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:57:37.512089393 +0000 UTC m=+149.094561239 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:37 crc kubenswrapper[4962]: I0220 09:57:37.032555 4962 csr.go:257] certificate signing request csr-pmnxg is issued Feb 20 09:57:37 crc kubenswrapper[4962]: I0220 09:57:37.127444 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:37 crc kubenswrapper[4962]: E0220 09:57:37.128173 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 09:57:37.62815989 +0000 UTC m=+149.210631736 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:37 crc kubenswrapper[4962]: I0220 09:57:37.230000 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:57:37 crc kubenswrapper[4962]: I0220 09:57:37.230186 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"9c04c443e027d8924547d85ee062566809140b12e02a27ec77d3e52915a7c22e"} Feb 20 09:57:37 crc kubenswrapper[4962]: E0220 09:57:37.231314 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:57:37.731289748 +0000 UTC m=+149.313761594 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:37 crc kubenswrapper[4962]: I0220 09:57:37.281200 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"a07ea40e-b9be-4e90-bf7a-293fa009e7d2","Type":"ContainerStarted","Data":"22f0d08c036c0314545671011b2622d5765c7a6621317839ad590b2d4b0ed9d6"} Feb 20 09:57:37 crc kubenswrapper[4962]: I0220 09:57:37.281263 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"a07ea40e-b9be-4e90-bf7a-293fa009e7d2","Type":"ContainerStarted","Data":"bd7d4127579c5aa2ce40f267a04e89a7814800df62a8bbe88b691f551a516f6e"} Feb 20 09:57:37 crc kubenswrapper[4962]: I0220 09:57:37.331224 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:37 crc kubenswrapper[4962]: E0220 09:57:37.331581 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 09:57:37.831568945 +0000 UTC m=+149.414040791 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:37 crc kubenswrapper[4962]: I0220 09:57:37.355700 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-7nh4t" event={"ID":"8cb06d17-6188-4cca-84b7-f3d03abb20e8","Type":"ContainerStarted","Data":"d55b0414e4e28d7057524b2c28fe19f6a5e9804ff735c31b1c129a4e9cad1b7a"} Feb 20 09:57:37 crc kubenswrapper[4962]: I0220 09:57:37.363706 4962 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-m7z5r container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" start-of-body= Feb 20 09:57:37 crc kubenswrapper[4962]: I0220 09:57:37.363752 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-m7z5r" podUID="bfd57a5c-0892-46a0-8005-0a8f70c146fd" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" Feb 20 09:57:37 crc kubenswrapper[4962]: I0220 09:57:37.438876 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:57:37 crc kubenswrapper[4962]: E0220 09:57:37.439453 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:57:37.939426751 +0000 UTC m=+149.521898597 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:37 crc kubenswrapper[4962]: I0220 09:57:37.461738 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rsf8j" Feb 20 09:57:37 crc kubenswrapper[4962]: W0220 09:57:37.485764 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-23bfa3bade7d80bd6fd7307b1c8515ad0829a51431bd2f2e7505b2fb4105c306 WatchSource:0}: Error finding container 23bfa3bade7d80bd6fd7307b1c8515ad0829a51431bd2f2e7505b2fb4105c306: Status 404 returned error can't find the container with id 23bfa3bade7d80bd6fd7307b1c8515ad0829a51431bd2f2e7505b2fb4105c306 Feb 20 09:57:37 crc kubenswrapper[4962]: I0220 09:57:37.551949 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:37 crc kubenswrapper[4962]: E0220 09:57:37.557180 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 09:57:38.057163181 +0000 UTC m=+149.639635027 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:37 crc kubenswrapper[4962]: I0220 09:57:37.641481 4962 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Feb 20 09:57:37 crc kubenswrapper[4962]: I0220 09:57:37.668217 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:57:37 crc kubenswrapper[4962]: E0220 09:57:37.668509 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:57:38.168492798 +0000 UTC m=+149.750964644 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:37 crc kubenswrapper[4962]: I0220 09:57:37.770531 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:37 crc kubenswrapper[4962]: E0220 09:57:37.771672 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 09:57:38.271646947 +0000 UTC m=+149.854118793 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:37 crc kubenswrapper[4962]: I0220 09:57:37.872574 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:57:37 crc kubenswrapper[4962]: E0220 09:57:37.872852 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:57:38.372838442 +0000 UTC m=+149.955310288 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:37 crc kubenswrapper[4962]: I0220 09:57:37.963603 4962 patch_prober.go:28] interesting pod/router-default-5444994796-tcwqj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 20 09:57:37 crc kubenswrapper[4962]: [-]has-synced failed: reason withheld Feb 20 09:57:37 crc kubenswrapper[4962]: [+]process-running ok Feb 20 09:57:37 crc kubenswrapper[4962]: healthz check failed Feb 20 09:57:37 crc kubenswrapper[4962]: I0220 09:57:37.963672 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tcwqj" podUID="7e5e4942-63be-4811-8aaa-d6b53a427541" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 20 09:57:37 crc kubenswrapper[4962]: I0220 09:57:37.973825 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:37 crc kubenswrapper[4962]: E0220 09:57:37.974148 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 09:57:38.474136741 +0000 UTC m=+150.056608587 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.034558 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-20 09:52:36 +0000 UTC, rotation deadline is 2026-12-05 20:03:26.332715871 +0000 UTC Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.034631 4962 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6922h5m48.298088693s for next certificate rotation Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.075077 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:57:38 crc kubenswrapper[4962]: E0220 09:57:38.075265 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:57:38.575234834 +0000 UTC m=+150.157706680 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.075444 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:38 crc kubenswrapper[4962]: E0220 09:57:38.075893 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 09:57:38.575876075 +0000 UTC m=+150.158347921 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.176148 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-965lm" Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.176643 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:57:38 crc kubenswrapper[4962]: E0220 09:57:38.177099 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 09:57:38.677075651 +0000 UTC m=+150.259547497 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.258368 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-sxxjg"] Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.259402 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sxxjg" Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.265231 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.272930 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sxxjg"] Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.278168 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqtxr\" (UniqueName: \"kubernetes.io/projected/ee660135-f5e2-420e-a242-440471e57da2-kube-api-access-hqtxr\") pod \"certified-operators-sxxjg\" (UID: \"ee660135-f5e2-420e-a242-440471e57da2\") " pod="openshift-marketplace/certified-operators-sxxjg" Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.278315 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.278412 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee660135-f5e2-420e-a242-440471e57da2-utilities\") pod \"certified-operators-sxxjg\" (UID: \"ee660135-f5e2-420e-a242-440471e57da2\") " pod="openshift-marketplace/certified-operators-sxxjg" Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.278577 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee660135-f5e2-420e-a242-440471e57da2-catalog-content\") pod \"certified-operators-sxxjg\" (UID: \"ee660135-f5e2-420e-a242-440471e57da2\") " pod="openshift-marketplace/certified-operators-sxxjg" Feb 20 09:57:38 crc kubenswrapper[4962]: E0220 09:57:38.278703 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 09:57:38.77868617 +0000 UTC m=+150.361158016 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8pks8" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.321696 4962 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-02-20T09:57:37.641513674Z","Handler":null,"Name":""} Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.329747 4962 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.329813 4962 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.371865 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"cef9b30fae3cc65222f2b69910474ee9d3e9c6671f9d9b0373e397e9a6c203c7"} Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.371924 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"23bfa3bade7d80bd6fd7307b1c8515ad0829a51431bd2f2e7505b2fb4105c306"} Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.372329 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.373737 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"5077da1ec905ac4d8b00fc750f998dc9b6753887e8cab94c7a76cf4dafa7d4c2"} Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.375365 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"1f363d461cf8561be841943af5ab4b48d9c3b3fed020a6cc20ea47dad7c65227"} Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.375427 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"382532b5ac93bf8cd62633a3dac7df6ef3102e6029d7bb44b9438f0797fd2596"} Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.376995 4962 generic.go:334] "Generic (PLEG): container finished" podID="a07ea40e-b9be-4e90-bf7a-293fa009e7d2" containerID="22f0d08c036c0314545671011b2622d5765c7a6621317839ad590b2d4b0ed9d6" exitCode=0 Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.377337 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"a07ea40e-b9be-4e90-bf7a-293fa009e7d2","Type":"ContainerDied","Data":"22f0d08c036c0314545671011b2622d5765c7a6621317839ad590b2d4b0ed9d6"} Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.379348 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.379771 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee660135-f5e2-420e-a242-440471e57da2-catalog-content\") pod \"certified-operators-sxxjg\" (UID: \"ee660135-f5e2-420e-a242-440471e57da2\") " pod="openshift-marketplace/certified-operators-sxxjg" Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.379910 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqtxr\" (UniqueName: \"kubernetes.io/projected/ee660135-f5e2-420e-a242-440471e57da2-kube-api-access-hqtxr\") pod \"certified-operators-sxxjg\" (UID: \"ee660135-f5e2-420e-a242-440471e57da2\") " pod="openshift-marketplace/certified-operators-sxxjg" Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.380268 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee660135-f5e2-420e-a242-440471e57da2-catalog-content\") pod \"certified-operators-sxxjg\" (UID: \"ee660135-f5e2-420e-a242-440471e57da2\") " pod="openshift-marketplace/certified-operators-sxxjg" Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.380423 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee660135-f5e2-420e-a242-440471e57da2-utilities\") pod \"certified-operators-sxxjg\" (UID: \"ee660135-f5e2-420e-a242-440471e57da2\") " pod="openshift-marketplace/certified-operators-sxxjg" Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.380732 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee660135-f5e2-420e-a242-440471e57da2-utilities\") pod \"certified-operators-sxxjg\" (UID: \"ee660135-f5e2-420e-a242-440471e57da2\") " pod="openshift-marketplace/certified-operators-sxxjg" Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.381155 4962 generic.go:334] "Generic (PLEG): container finished" podID="a3652dbd-dae4-462b-be88-b8a782de8a1c" containerID="e8531bc42f535f5fdb200b255f9b4197b26b17fb00311859ea4571ac343f8767" exitCode=0 Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.381219 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526345-4v6dw" event={"ID":"a3652dbd-dae4-462b-be88-b8a782de8a1c","Type":"ContainerDied","Data":"e8531bc42f535f5fdb200b255f9b4197b26b17fb00311859ea4571ac343f8767"} Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.384875 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). 
InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.388196 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-7nh4t" event={"ID":"8cb06d17-6188-4cca-84b7-f3d03abb20e8","Type":"ContainerStarted","Data":"03dc8ce942a5ba94e89de4d324b99e17a76a86be71aae629cd6916c027f856c5"} Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.388323 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-7nh4t" event={"ID":"8cb06d17-6188-4cca-84b7-f3d03abb20e8","Type":"ContainerStarted","Data":"b37495346e5f2a55c17afe8dc8f3791bc5cb2e2082f2276176b213cf87971aa8"} Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.397665 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-xkmtn" Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.402233 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqtxr\" (UniqueName: \"kubernetes.io/projected/ee660135-f5e2-420e-a242-440471e57da2-kube-api-access-hqtxr\") pod \"certified-operators-sxxjg\" (UID: \"ee660135-f5e2-420e-a242-440471e57da2\") " pod="openshift-marketplace/certified-operators-sxxjg" Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.461771 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6q5bk"] Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.462808 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6q5bk" Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.465890 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.476829 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6q5bk"] Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.481916 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b-catalog-content\") pod \"community-operators-6q5bk\" (UID: \"e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b\") " pod="openshift-marketplace/community-operators-6q5bk" Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.482032 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.482227 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vspvg\" (UniqueName: \"kubernetes.io/projected/e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b-kube-api-access-vspvg\") pod \"community-operators-6q5bk\" (UID: \"e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b\") " pod="openshift-marketplace/community-operators-6q5bk" Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.482284 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b-utilities\") pod \"community-operators-6q5bk\" (UID: \"e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b\") " pod="openshift-marketplace/community-operators-6q5bk" Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.505642 4962 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.505724 4962 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.511705 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-jtftl" Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.522025 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-jtftl" Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.532314 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-7nh4t" podStartSLOduration=11.532281494 podStartE2EDuration="11.532281494s" podCreationTimestamp="2026-02-20 09:57:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:57:38.528233015 +0000 UTC m=+150.110704851" watchObservedRunningTime="2026-02-20 09:57:38.532281494 +0000 UTC m=+150.114753340" Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.579178 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sxxjg" Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.584526 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vspvg\" (UniqueName: \"kubernetes.io/projected/e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b-kube-api-access-vspvg\") pod \"community-operators-6q5bk\" (UID: \"e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b\") " pod="openshift-marketplace/community-operators-6q5bk" Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.584582 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b-utilities\") pod \"community-operators-6q5bk\" (UID: \"e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b\") " pod="openshift-marketplace/community-operators-6q5bk" Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.584722 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b-catalog-content\") pod \"community-operators-6q5bk\" (UID: \"e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b\") " pod="openshift-marketplace/community-operators-6q5bk" Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.586924 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b-utilities\") pod \"community-operators-6q5bk\" (UID: \"e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b\") " pod="openshift-marketplace/community-operators-6q5bk" Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.587494 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b-catalog-content\") pod \"community-operators-6q5bk\" (UID: \"e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b\") " pod="openshift-marketplace/community-operators-6q5bk" Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.634467 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vspvg\" (UniqueName: \"kubernetes.io/projected/e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b-kube-api-access-vspvg\") pod \"community-operators-6q5bk\" (UID: \"e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b\") " pod="openshift-marketplace/community-operators-6q5bk" Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.636850 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8pks8\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.681741 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5qdmt"] Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.682985 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5qdmt" Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.721775 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5qdmt"] Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.729047 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.779867 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6q5bk" Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.793558 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/805f4075-7fda-4a54-882f-c4fd160148a4-utilities\") pod \"certified-operators-5qdmt\" (UID: \"805f4075-7fda-4a54-882f-c4fd160148a4\") " pod="openshift-marketplace/certified-operators-5qdmt" Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.793604 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dszmr\" (UniqueName: \"kubernetes.io/projected/805f4075-7fda-4a54-882f-c4fd160148a4-kube-api-access-dszmr\") pod \"certified-operators-5qdmt\" (UID: \"805f4075-7fda-4a54-882f-c4fd160148a4\") " pod="openshift-marketplace/certified-operators-5qdmt" Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.793670 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/805f4075-7fda-4a54-882f-c4fd160148a4-catalog-content\") pod \"certified-operators-5qdmt\" (UID: \"805f4075-7fda-4a54-882f-c4fd160148a4\") " pod="openshift-marketplace/certified-operators-5qdmt" Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.864715 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-d4pzn"] Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.865975 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-d4pzn" Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.891363 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-d4pzn"] Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.895196 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c487f78-6735-4114-a45a-6c60ccef5983-utilities\") pod \"community-operators-d4pzn\" (UID: \"1c487f78-6735-4114-a45a-6c60ccef5983\") " pod="openshift-marketplace/community-operators-d4pzn" Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.895256 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58dmf\" (UniqueName: \"kubernetes.io/projected/1c487f78-6735-4114-a45a-6c60ccef5983-kube-api-access-58dmf\") pod \"community-operators-d4pzn\" (UID: \"1c487f78-6735-4114-a45a-6c60ccef5983\") " pod="openshift-marketplace/community-operators-d4pzn" Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.895312 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c487f78-6735-4114-a45a-6c60ccef5983-catalog-content\") pod \"community-operators-d4pzn\" (UID: \"1c487f78-6735-4114-a45a-6c60ccef5983\") " pod="openshift-marketplace/community-operators-d4pzn" Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.895341 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/805f4075-7fda-4a54-882f-c4fd160148a4-utilities\") pod \"certified-operators-5qdmt\" (UID: \"805f4075-7fda-4a54-882f-c4fd160148a4\") " pod="openshift-marketplace/certified-operators-5qdmt" Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.895360 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dszmr\" (UniqueName: \"kubernetes.io/projected/805f4075-7fda-4a54-882f-c4fd160148a4-kube-api-access-dszmr\") pod \"certified-operators-5qdmt\" (UID: \"805f4075-7fda-4a54-882f-c4fd160148a4\") " pod="openshift-marketplace/certified-operators-5qdmt" Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.895393 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/805f4075-7fda-4a54-882f-c4fd160148a4-catalog-content\") pod \"certified-operators-5qdmt\" (UID: \"805f4075-7fda-4a54-882f-c4fd160148a4\") " pod="openshift-marketplace/certified-operators-5qdmt" Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.896231 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/805f4075-7fda-4a54-882f-c4fd160148a4-catalog-content\") pod \"certified-operators-5qdmt\" (UID: \"805f4075-7fda-4a54-882f-c4fd160148a4\") " pod="openshift-marketplace/certified-operators-5qdmt" Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.896469 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/805f4075-7fda-4a54-882f-c4fd160148a4-utilities\") pod \"certified-operators-5qdmt\" (UID: \"805f4075-7fda-4a54-882f-c4fd160148a4\") " pod="openshift-marketplace/certified-operators-5qdmt" Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.934567 4962 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-dszmr\" (UniqueName: \"kubernetes.io/projected/805f4075-7fda-4a54-882f-c4fd160148a4-kube-api-access-dszmr\") pod \"certified-operators-5qdmt\" (UID: \"805f4075-7fda-4a54-882f-c4fd160148a4\") " pod="openshift-marketplace/certified-operators-5qdmt" Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.973843 4962 patch_prober.go:28] interesting pod/router-default-5444994796-tcwqj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 20 09:57:38 crc kubenswrapper[4962]: [-]has-synced failed: reason withheld Feb 20 09:57:38 crc kubenswrapper[4962]: [+]process-running ok Feb 20 09:57:38 crc kubenswrapper[4962]: healthz check failed Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.973894 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tcwqj" podUID="7e5e4942-63be-4811-8aaa-d6b53a427541" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.996705 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c487f78-6735-4114-a45a-6c60ccef5983-utilities\") pod \"community-operators-d4pzn\" (UID: \"1c487f78-6735-4114-a45a-6c60ccef5983\") " pod="openshift-marketplace/community-operators-d4pzn" Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.996778 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58dmf\" (UniqueName: \"kubernetes.io/projected/1c487f78-6735-4114-a45a-6c60ccef5983-kube-api-access-58dmf\") pod \"community-operators-d4pzn\" (UID: \"1c487f78-6735-4114-a45a-6c60ccef5983\") " pod="openshift-marketplace/community-operators-d4pzn" Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.996827 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c487f78-6735-4114-a45a-6c60ccef5983-catalog-content\") pod \"community-operators-d4pzn\" (UID: \"1c487f78-6735-4114-a45a-6c60ccef5983\") " pod="openshift-marketplace/community-operators-d4pzn" Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.997535 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c487f78-6735-4114-a45a-6c60ccef5983-catalog-content\") pod \"community-operators-d4pzn\" (UID: \"1c487f78-6735-4114-a45a-6c60ccef5983\") " pod="openshift-marketplace/community-operators-d4pzn" Feb 20 09:57:38 crc kubenswrapper[4962]: I0220 09:57:38.997792 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c487f78-6735-4114-a45a-6c60ccef5983-utilities\") pod \"community-operators-d4pzn\" (UID: \"1c487f78-6735-4114-a45a-6c60ccef5983\") " pod="openshift-marketplace/community-operators-d4pzn" Feb 20 09:57:39 crc kubenswrapper[4962]: I0220 09:57:39.000254 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 20 09:57:39 crc kubenswrapper[4962]: I0220 09:57:39.017599 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58dmf\" (UniqueName: \"kubernetes.io/projected/1c487f78-6735-4114-a45a-6c60ccef5983-kube-api-access-58dmf\") pod \"community-operators-d4pzn\" (UID: \"1c487f78-6735-4114-a45a-6c60ccef5983\") " pod="openshift-marketplace/community-operators-d4pzn" Feb 20 09:57:39 crc kubenswrapper[4962]: I0220 09:57:39.023700 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5qdmt" Feb 20 09:57:39 crc kubenswrapper[4962]: I0220 09:57:39.097493 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a07ea40e-b9be-4e90-bf7a-293fa009e7d2-kube-api-access\") pod \"a07ea40e-b9be-4e90-bf7a-293fa009e7d2\" (UID: \"a07ea40e-b9be-4e90-bf7a-293fa009e7d2\") " Feb 20 09:57:39 crc kubenswrapper[4962]: I0220 09:57:39.097924 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a07ea40e-b9be-4e90-bf7a-293fa009e7d2-kubelet-dir\") pod \"a07ea40e-b9be-4e90-bf7a-293fa009e7d2\" (UID: \"a07ea40e-b9be-4e90-bf7a-293fa009e7d2\") " Feb 20 09:57:39 crc kubenswrapper[4962]: I0220 09:57:39.099175 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a07ea40e-b9be-4e90-bf7a-293fa009e7d2-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "a07ea40e-b9be-4e90-bf7a-293fa009e7d2" (UID: "a07ea40e-b9be-4e90-bf7a-293fa009e7d2"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 09:57:39 crc kubenswrapper[4962]: I0220 09:57:39.105665 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a07ea40e-b9be-4e90-bf7a-293fa009e7d2-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "a07ea40e-b9be-4e90-bf7a-293fa009e7d2" (UID: "a07ea40e-b9be-4e90-bf7a-293fa009e7d2"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:57:39 crc kubenswrapper[4962]: I0220 09:57:39.139606 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sxxjg"] Feb 20 09:57:39 crc kubenswrapper[4962]: W0220 09:57:39.170150 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee660135_f5e2_420e_a242_440471e57da2.slice/crio-ca9165de15eb6be88321d9382abfd900310af2761fe4b1a318a48f4e2a654377 WatchSource:0}: Error finding container ca9165de15eb6be88321d9382abfd900310af2761fe4b1a318a48f4e2a654377: Status 404 returned error can't find the container with id ca9165de15eb6be88321d9382abfd900310af2761fe4b1a318a48f4e2a654377 Feb 20 09:57:39 crc kubenswrapper[4962]: I0220 09:57:39.172773 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Feb 20 09:57:39 crc kubenswrapper[4962]: I0220 09:57:39.200499 4962 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a07ea40e-b9be-4e90-bf7a-293fa009e7d2-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 20 09:57:39 crc kubenswrapper[4962]: I0220 09:57:39.200537 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a07ea40e-b9be-4e90-bf7a-293fa009e7d2-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 20 09:57:39 crc kubenswrapper[4962]: I0220 09:57:39.265446 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6q5bk"] Feb 20 09:57:39 crc kubenswrapper[4962]: I0220 09:57:39.295003 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-8pks8"] Feb 20 09:57:39 crc kubenswrapper[4962]: I0220 09:57:39.313177 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d4pzn" Feb 20 09:57:39 crc kubenswrapper[4962]: I0220 09:57:39.406461 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5qdmt"] Feb 20 09:57:39 crc kubenswrapper[4962]: I0220 09:57:39.422872 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" event={"ID":"b4ad1819-20e1-406b-8499-5a73780c0a0c","Type":"ContainerStarted","Data":"5004d974da71f7174ba7d6f42652143c4f7cb0b752e3647e653cb9e55b56d9b3"} Feb 20 09:57:39 crc kubenswrapper[4962]: I0220 09:57:39.428161 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 20 09:57:39 crc kubenswrapper[4962]: I0220 09:57:39.428295 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"a07ea40e-b9be-4e90-bf7a-293fa009e7d2","Type":"ContainerDied","Data":"bd7d4127579c5aa2ce40f267a04e89a7814800df62a8bbe88b691f551a516f6e"} Feb 20 09:57:39 crc kubenswrapper[4962]: I0220 09:57:39.428367 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd7d4127579c5aa2ce40f267a04e89a7814800df62a8bbe88b691f551a516f6e" Feb 20 09:57:39 crc kubenswrapper[4962]: I0220 09:57:39.432871 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6q5bk" event={"ID":"e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b","Type":"ContainerStarted","Data":"dd70ef6c640a62edc318879e7e0b88b18026337e7b55ef136a0601bdad9e609c"} Feb 20 09:57:39 crc kubenswrapper[4962]: I0220 09:57:39.435926 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sxxjg" event={"ID":"ee660135-f5e2-420e-a242-440471e57da2","Type":"ContainerStarted","Data":"eb59bf233daf0a387876d536e9cf576dcc0d473830269c4e813b1ac561a0017c"} Feb 20 09:57:39 crc kubenswrapper[4962]: I0220 09:57:39.435969 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sxxjg" event={"ID":"ee660135-f5e2-420e-a242-440471e57da2","Type":"ContainerStarted","Data":"ca9165de15eb6be88321d9382abfd900310af2761fe4b1a318a48f4e2a654377"} Feb 20 09:57:39 crc kubenswrapper[4962]: I0220 09:57:39.600743 4962 patch_prober.go:28] interesting pod/downloads-7954f5f757-tv8j9 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Feb 20 09:57:39 crc kubenswrapper[4962]: I0220 09:57:39.600875 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-tv8j9" podUID="7ef2d9f9-34f2-48a6-83eb-689c0fdcac66" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Feb 20 09:57:39 crc kubenswrapper[4962]: I0220 09:57:39.604730 4962 patch_prober.go:28] interesting pod/downloads-7954f5f757-tv8j9 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Feb 20 09:57:39 crc kubenswrapper[4962]: I0220 09:57:39.604757 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-tv8j9" podUID="7ef2d9f9-34f2-48a6-83eb-689c0fdcac66" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Feb 20 09:57:39 crc kubenswrapper[4962]: I0220 09:57:39.690766 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-d4pzn"] Feb 20 09:57:39 crc kubenswrapper[4962]: I0220 09:57:39.694792 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526345-4v6dw" Feb 20 09:57:39 crc kubenswrapper[4962]: I0220 09:57:39.713519 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a3652dbd-dae4-462b-be88-b8a782de8a1c-config-volume\") pod \"a3652dbd-dae4-462b-be88-b8a782de8a1c\" (UID: \"a3652dbd-dae4-462b-be88-b8a782de8a1c\") " Feb 20 09:57:39 crc kubenswrapper[4962]: I0220 09:57:39.713624 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hcnb5\" (UniqueName: \"kubernetes.io/projected/a3652dbd-dae4-462b-be88-b8a782de8a1c-kube-api-access-hcnb5\") pod \"a3652dbd-dae4-462b-be88-b8a782de8a1c\" (UID: \"a3652dbd-dae4-462b-be88-b8a782de8a1c\") " Feb 20 09:57:39 crc kubenswrapper[4962]: I0220 09:57:39.713651 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a3652dbd-dae4-462b-be88-b8a782de8a1c-secret-volume\") pod \"a3652dbd-dae4-462b-be88-b8a782de8a1c\" (UID: \"a3652dbd-dae4-462b-be88-b8a782de8a1c\") " Feb 20 09:57:39 crc kubenswrapper[4962]: I0220 09:57:39.716271 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3652dbd-dae4-462b-be88-b8a782de8a1c-config-volume" (OuterVolumeSpecName: "config-volume") pod "a3652dbd-dae4-462b-be88-b8a782de8a1c" (UID: "a3652dbd-dae4-462b-be88-b8a782de8a1c"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:57:39 crc kubenswrapper[4962]: I0220 09:57:39.723622 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3652dbd-dae4-462b-be88-b8a782de8a1c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a3652dbd-dae4-462b-be88-b8a782de8a1c" (UID: "a3652dbd-dae4-462b-be88-b8a782de8a1c"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:57:39 crc kubenswrapper[4962]: I0220 09:57:39.724794 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3652dbd-dae4-462b-be88-b8a782de8a1c-kube-api-access-hcnb5" (OuterVolumeSpecName: "kube-api-access-hcnb5") pod "a3652dbd-dae4-462b-be88-b8a782de8a1c" (UID: "a3652dbd-dae4-462b-be88-b8a782de8a1c"). InnerVolumeSpecName "kube-api-access-hcnb5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:57:39 crc kubenswrapper[4962]: I0220 09:57:39.817328 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hcnb5\" (UniqueName: \"kubernetes.io/projected/a3652dbd-dae4-462b-be88-b8a782de8a1c-kube-api-access-hcnb5\") on node \"crc\" DevicePath \"\"" Feb 20 09:57:39 crc kubenswrapper[4962]: I0220 09:57:39.817375 4962 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a3652dbd-dae4-462b-be88-b8a782de8a1c-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 20 09:57:39 crc kubenswrapper[4962]: I0220 09:57:39.817390 4962 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a3652dbd-dae4-462b-be88-b8a782de8a1c-config-volume\") on node \"crc\" DevicePath \"\"" Feb 20 09:57:39 crc kubenswrapper[4962]: I0220 09:57:39.962241 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vp5tl" Feb 20 09:57:39 crc kubenswrapper[4962]: I0220 09:57:39.962302 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vp5tl" Feb 20 09:57:39 crc kubenswrapper[4962]: I0220 09:57:39.964039 4962 patch_prober.go:28] interesting pod/router-default-5444994796-tcwqj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 20 09:57:39 crc kubenswrapper[4962]: [-]has-synced failed: reason withheld Feb 20 09:57:39 crc kubenswrapper[4962]: [+]process-running ok Feb 20 09:57:39 crc kubenswrapper[4962]: healthz check failed Feb 20 09:57:39 crc kubenswrapper[4962]: I0220 09:57:39.964085 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tcwqj" podUID="7e5e4942-63be-4811-8aaa-d6b53a427541" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 20 09:57:39 crc kubenswrapper[4962]: I0220 09:57:39.969420 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vp5tl" Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.261663 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-x4hxs"] Feb 20 09:57:40 crc kubenswrapper[4962]: E0220 09:57:40.262023 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3652dbd-dae4-462b-be88-b8a782de8a1c" containerName="collect-profiles" Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.262043 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3652dbd-dae4-462b-be88-b8a782de8a1c" containerName="collect-profiles" Feb 20 09:57:40 crc kubenswrapper[4962]: E0220 09:57:40.262058 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a07ea40e-b9be-4e90-bf7a-293fa009e7d2" containerName="pruner" Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.262070 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="a07ea40e-b9be-4e90-bf7a-293fa009e7d2" containerName="pruner" Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.262230 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="a07ea40e-b9be-4e90-bf7a-293fa009e7d2" containerName="pruner" Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.262249 4962 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="a3652dbd-dae4-462b-be88-b8a782de8a1c" containerName="collect-profiles" Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.263436 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x4hxs" Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.269692 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.277570 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x4hxs"] Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.424259 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f414667-865d-4c89-b470-50f61a11b60e-catalog-content\") pod \"redhat-marketplace-x4hxs\" (UID: \"2f414667-865d-4c89-b470-50f61a11b60e\") " pod="openshift-marketplace/redhat-marketplace-x4hxs" Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.424334 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dxf6\" (UniqueName: \"kubernetes.io/projected/2f414667-865d-4c89-b470-50f61a11b60e-kube-api-access-4dxf6\") pod \"redhat-marketplace-x4hxs\" (UID: \"2f414667-865d-4c89-b470-50f61a11b60e\") " pod="openshift-marketplace/redhat-marketplace-x4hxs" Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.424359 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f414667-865d-4c89-b470-50f61a11b60e-utilities\") pod \"redhat-marketplace-x4hxs\" (UID: \"2f414667-865d-4c89-b470-50f61a11b60e\") " pod="openshift-marketplace/redhat-marketplace-x4hxs" Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.443046 4962 generic.go:334] "Generic (PLEG): container finished" podID="805f4075-7fda-4a54-882f-c4fd160148a4" containerID="d4d98f774860fdf5382198ce0438bc9131dc921e84f72e5b2cde6967daa0a4d4" exitCode=0 Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.443096 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5qdmt" event={"ID":"805f4075-7fda-4a54-882f-c4fd160148a4","Type":"ContainerDied","Data":"d4d98f774860fdf5382198ce0438bc9131dc921e84f72e5b2cde6967daa0a4d4"} Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.443153 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5qdmt" event={"ID":"805f4075-7fda-4a54-882f-c4fd160148a4","Type":"ContainerStarted","Data":"75a64fa0799e34c46374b381b4c7c1a53295cecd1a95e7229c96eb57af23d670"} Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.445204 4962 generic.go:334] "Generic (PLEG): container finished" podID="e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b" containerID="9dde5da077306b26cf751b46204885c0add8e973f2983ddbba0abd21dba3f82c" exitCode=0 Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.445255 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6q5bk" event={"ID":"e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b","Type":"ContainerDied","Data":"9dde5da077306b26cf751b46204885c0add8e973f2983ddbba0abd21dba3f82c"} Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.445750 4962 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 20 09:57:40 crc 
kubenswrapper[4962]: I0220 09:57:40.446971 4962 generic.go:334] "Generic (PLEG): container finished" podID="1c487f78-6735-4114-a45a-6c60ccef5983" containerID="43fe772c14da6a92b151d20212f73effab905ae6a03f2da84a27e6c3cf3f6b25" exitCode=0 Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.447062 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d4pzn" event={"ID":"1c487f78-6735-4114-a45a-6c60ccef5983","Type":"ContainerDied","Data":"43fe772c14da6a92b151d20212f73effab905ae6a03f2da84a27e6c3cf3f6b25"} Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.447510 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d4pzn" event={"ID":"1c487f78-6735-4114-a45a-6c60ccef5983","Type":"ContainerStarted","Data":"7fcd5743db242f51c1ff9cb31c18720e064f45718743b615c36cf8bb2d39f79d"} Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.451105 4962 generic.go:334] "Generic (PLEG): container finished" podID="ee660135-f5e2-420e-a242-440471e57da2" containerID="eb59bf233daf0a387876d536e9cf576dcc0d473830269c4e813b1ac561a0017c" exitCode=0 Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.451203 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sxxjg" event={"ID":"ee660135-f5e2-420e-a242-440471e57da2","Type":"ContainerDied","Data":"eb59bf233daf0a387876d536e9cf576dcc0d473830269c4e813b1ac561a0017c"} Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.459709 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526345-4v6dw" event={"ID":"a3652dbd-dae4-462b-be88-b8a782de8a1c","Type":"ContainerDied","Data":"9cb82e35bd3f3d4fb0aaea07b6c95b315a4035aa82123595066c03e2ec02bbc3"} Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.459761 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9cb82e35bd3f3d4fb0aaea07b6c95b315a4035aa82123595066c03e2ec02bbc3" Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.459844 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526345-4v6dw" Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.462935 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" event={"ID":"b4ad1819-20e1-406b-8499-5a73780c0a0c","Type":"ContainerStarted","Data":"c741eb823ccf8c4784adbc060b958f08884c73108a1831a19813a3f4b3898e46"} Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.463113 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.470019 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vp5tl" Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.497470 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" podStartSLOduration=131.497441552 podStartE2EDuration="2m11.497441552s" podCreationTimestamp="2026-02-20 09:55:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:57:40.484386819 +0000 UTC m=+152.066858665" watchObservedRunningTime="2026-02-20 09:57:40.497441552 +0000 UTC m=+152.079913408" Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.525596 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f414667-865d-4c89-b470-50f61a11b60e-catalog-content\") pod \"redhat-marketplace-x4hxs\" (UID: \"2f414667-865d-4c89-b470-50f61a11b60e\") " pod="openshift-marketplace/redhat-marketplace-x4hxs" Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.525704 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dxf6\" (UniqueName: \"kubernetes.io/projected/2f414667-865d-4c89-b470-50f61a11b60e-kube-api-access-4dxf6\") pod \"redhat-marketplace-x4hxs\" (UID: \"2f414667-865d-4c89-b470-50f61a11b60e\") " pod="openshift-marketplace/redhat-marketplace-x4hxs" Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.525804 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f414667-865d-4c89-b470-50f61a11b60e-utilities\") pod \"redhat-marketplace-x4hxs\" (UID: \"2f414667-865d-4c89-b470-50f61a11b60e\") " pod="openshift-marketplace/redhat-marketplace-x4hxs" Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.526360 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f414667-865d-4c89-b470-50f61a11b60e-catalog-content\") pod \"redhat-marketplace-x4hxs\" (UID: \"2f414667-865d-4c89-b470-50f61a11b60e\") " pod="openshift-marketplace/redhat-marketplace-x4hxs" Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.526429 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f414667-865d-4c89-b470-50f61a11b60e-utilities\") pod \"redhat-marketplace-x4hxs\" (UID: \"2f414667-865d-4c89-b470-50f61a11b60e\") " pod="openshift-marketplace/redhat-marketplace-x4hxs" Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.544443 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dxf6\" (UniqueName: 
\"kubernetes.io/projected/2f414667-865d-4c89-b470-50f61a11b60e-kube-api-access-4dxf6\") pod \"redhat-marketplace-x4hxs\" (UID: \"2f414667-865d-4c89-b470-50f61a11b60e\") " pod="openshift-marketplace/redhat-marketplace-x4hxs" Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.579300 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x4hxs" Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.583665 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-nwfk6" Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.583716 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-nwfk6" Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.585954 4962 patch_prober.go:28] interesting pod/console-f9d7485db-nwfk6 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.13:8443/health\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.586000 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-nwfk6" podUID="09cfdba9-bfda-455d-b13e-58a6ea5a7d5a" containerName="console" probeResult="failure" output="Get \"https://10.217.0.13:8443/health\": dial tcp 10.217.0.13:8443: connect: connection refused" Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.651596 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2gxn5"] Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.653356 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2gxn5" Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.666297 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2gxn5"] Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.716921 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g6nc2" Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.729410 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-m7z5r" Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.840252 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvgwq\" (UniqueName: \"kubernetes.io/projected/a313fb19-8615-43b7-a19a-df83e50410ba-kube-api-access-vvgwq\") pod \"redhat-marketplace-2gxn5\" (UID: \"a313fb19-8615-43b7-a19a-df83e50410ba\") " pod="openshift-marketplace/redhat-marketplace-2gxn5" Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.840627 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a313fb19-8615-43b7-a19a-df83e50410ba-utilities\") pod \"redhat-marketplace-2gxn5\" (UID: \"a313fb19-8615-43b7-a19a-df83e50410ba\") " pod="openshift-marketplace/redhat-marketplace-2gxn5" Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.840715 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a313fb19-8615-43b7-a19a-df83e50410ba-catalog-content\") pod 
\"redhat-marketplace-2gxn5\" (UID: \"a313fb19-8615-43b7-a19a-df83e50410ba\") " pod="openshift-marketplace/redhat-marketplace-2gxn5" Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.941623 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a313fb19-8615-43b7-a19a-df83e50410ba-catalog-content\") pod \"redhat-marketplace-2gxn5\" (UID: \"a313fb19-8615-43b7-a19a-df83e50410ba\") " pod="openshift-marketplace/redhat-marketplace-2gxn5" Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.941733 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvgwq\" (UniqueName: \"kubernetes.io/projected/a313fb19-8615-43b7-a19a-df83e50410ba-kube-api-access-vvgwq\") pod \"redhat-marketplace-2gxn5\" (UID: \"a313fb19-8615-43b7-a19a-df83e50410ba\") " pod="openshift-marketplace/redhat-marketplace-2gxn5" Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.941755 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a313fb19-8615-43b7-a19a-df83e50410ba-utilities\") pod \"redhat-marketplace-2gxn5\" (UID: \"a313fb19-8615-43b7-a19a-df83e50410ba\") " pod="openshift-marketplace/redhat-marketplace-2gxn5" Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.942208 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a313fb19-8615-43b7-a19a-df83e50410ba-utilities\") pod \"redhat-marketplace-2gxn5\" (UID: \"a313fb19-8615-43b7-a19a-df83e50410ba\") " pod="openshift-marketplace/redhat-marketplace-2gxn5" Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.942623 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a313fb19-8615-43b7-a19a-df83e50410ba-catalog-content\") pod \"redhat-marketplace-2gxn5\" (UID: \"a313fb19-8615-43b7-a19a-df83e50410ba\") " pod="openshift-marketplace/redhat-marketplace-2gxn5" Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.960600 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-tcwqj" Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.970856 4962 patch_prober.go:28] interesting pod/router-default-5444994796-tcwqj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 20 09:57:40 crc kubenswrapper[4962]: [-]has-synced failed: reason withheld Feb 20 09:57:40 crc kubenswrapper[4962]: [+]process-running ok Feb 20 09:57:40 crc kubenswrapper[4962]: healthz check failed Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.970948 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tcwqj" podUID="7e5e4942-63be-4811-8aaa-d6b53a427541" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 20 09:57:40 crc kubenswrapper[4962]: I0220 09:57:40.992833 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvgwq\" (UniqueName: \"kubernetes.io/projected/a313fb19-8615-43b7-a19a-df83e50410ba-kube-api-access-vvgwq\") pod \"redhat-marketplace-2gxn5\" (UID: \"a313fb19-8615-43b7-a19a-df83e50410ba\") " pod="openshift-marketplace/redhat-marketplace-2gxn5" Feb 20 09:57:41 crc kubenswrapper[4962]: I0220 
09:57:41.045080 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x4hxs"] Feb 20 09:57:41 crc kubenswrapper[4962]: I0220 09:57:41.275436 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2gxn5" Feb 20 09:57:41 crc kubenswrapper[4962]: I0220 09:57:41.456187 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zwkjb"] Feb 20 09:57:41 crc kubenswrapper[4962]: I0220 09:57:41.457847 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zwkjb" Feb 20 09:57:41 crc kubenswrapper[4962]: I0220 09:57:41.460565 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 20 09:57:41 crc kubenswrapper[4962]: I0220 09:57:41.473536 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zwkjb"] Feb 20 09:57:41 crc kubenswrapper[4962]: I0220 09:57:41.484764 4962 generic.go:334] "Generic (PLEG): container finished" podID="2f414667-865d-4c89-b470-50f61a11b60e" containerID="4e161f889069c1127c3dd292fd14a5054462adfcd79d93a357d1211fdfa99ddf" exitCode=0 Feb 20 09:57:41 crc kubenswrapper[4962]: I0220 09:57:41.486664 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x4hxs" event={"ID":"2f414667-865d-4c89-b470-50f61a11b60e","Type":"ContainerDied","Data":"4e161f889069c1127c3dd292fd14a5054462adfcd79d93a357d1211fdfa99ddf"} Feb 20 09:57:41 crc kubenswrapper[4962]: I0220 09:57:41.486696 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x4hxs" event={"ID":"2f414667-865d-4c89-b470-50f61a11b60e","Type":"ContainerStarted","Data":"4ba52fa324168e2ee08b42cdefbfc041b14744aa5c09a51cbc5628b6f08e9f57"} Feb 20 09:57:41 crc kubenswrapper[4962]: I0220 09:57:41.516971 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 09:57:41 crc kubenswrapper[4962]: I0220 09:57:41.517021 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 09:57:41 crc kubenswrapper[4962]: I0220 09:57:41.657209 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpnrc\" (UniqueName: \"kubernetes.io/projected/77564a1c-aefc-4caf-86d9-55c2ef795bb7-kube-api-access-wpnrc\") pod \"redhat-operators-zwkjb\" (UID: \"77564a1c-aefc-4caf-86d9-55c2ef795bb7\") " pod="openshift-marketplace/redhat-operators-zwkjb" Feb 20 09:57:41 crc kubenswrapper[4962]: I0220 09:57:41.658163 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77564a1c-aefc-4caf-86d9-55c2ef795bb7-utilities\") pod \"redhat-operators-zwkjb\" (UID: \"77564a1c-aefc-4caf-86d9-55c2ef795bb7\") " pod="openshift-marketplace/redhat-operators-zwkjb" Feb 20 09:57:41 crc kubenswrapper[4962]: I0220 
09:57:41.658249 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77564a1c-aefc-4caf-86d9-55c2ef795bb7-catalog-content\") pod \"redhat-operators-zwkjb\" (UID: \"77564a1c-aefc-4caf-86d9-55c2ef795bb7\") " pod="openshift-marketplace/redhat-operators-zwkjb" Feb 20 09:57:41 crc kubenswrapper[4962]: I0220 09:57:41.760111 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpnrc\" (UniqueName: \"kubernetes.io/projected/77564a1c-aefc-4caf-86d9-55c2ef795bb7-kube-api-access-wpnrc\") pod \"redhat-operators-zwkjb\" (UID: \"77564a1c-aefc-4caf-86d9-55c2ef795bb7\") " pod="openshift-marketplace/redhat-operators-zwkjb" Feb 20 09:57:41 crc kubenswrapper[4962]: I0220 09:57:41.760202 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77564a1c-aefc-4caf-86d9-55c2ef795bb7-utilities\") pod \"redhat-operators-zwkjb\" (UID: \"77564a1c-aefc-4caf-86d9-55c2ef795bb7\") " pod="openshift-marketplace/redhat-operators-zwkjb" Feb 20 09:57:41 crc kubenswrapper[4962]: I0220 09:57:41.760242 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77564a1c-aefc-4caf-86d9-55c2ef795bb7-catalog-content\") pod \"redhat-operators-zwkjb\" (UID: \"77564a1c-aefc-4caf-86d9-55c2ef795bb7\") " pod="openshift-marketplace/redhat-operators-zwkjb" Feb 20 09:57:41 crc kubenswrapper[4962]: I0220 09:57:41.760999 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77564a1c-aefc-4caf-86d9-55c2ef795bb7-catalog-content\") pod \"redhat-operators-zwkjb\" (UID: \"77564a1c-aefc-4caf-86d9-55c2ef795bb7\") " pod="openshift-marketplace/redhat-operators-zwkjb" Feb 20 09:57:41 crc kubenswrapper[4962]: I0220 09:57:41.761761 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77564a1c-aefc-4caf-86d9-55c2ef795bb7-utilities\") pod \"redhat-operators-zwkjb\" (UID: \"77564a1c-aefc-4caf-86d9-55c2ef795bb7\") " pod="openshift-marketplace/redhat-operators-zwkjb" Feb 20 09:57:41 crc kubenswrapper[4962]: I0220 09:57:41.781690 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpnrc\" (UniqueName: \"kubernetes.io/projected/77564a1c-aefc-4caf-86d9-55c2ef795bb7-kube-api-access-wpnrc\") pod \"redhat-operators-zwkjb\" (UID: \"77564a1c-aefc-4caf-86d9-55c2ef795bb7\") " pod="openshift-marketplace/redhat-operators-zwkjb" Feb 20 09:57:41 crc kubenswrapper[4962]: I0220 09:57:41.864831 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-grl4h"] Feb 20 09:57:41 crc kubenswrapper[4962]: I0220 09:57:41.867501 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-grl4h" Feb 20 09:57:41 crc kubenswrapper[4962]: I0220 09:57:41.872899 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zwkjb" Feb 20 09:57:41 crc kubenswrapper[4962]: I0220 09:57:41.885734 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-grl4h"] Feb 20 09:57:41 crc kubenswrapper[4962]: I0220 09:57:41.958105 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2gxn5"] Feb 20 09:57:41 crc kubenswrapper[4962]: I0220 09:57:41.963508 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w94dl\" (UniqueName: \"kubernetes.io/projected/4065ac08-9c62-48db-bbfe-9e53ab7d5463-kube-api-access-w94dl\") pod \"redhat-operators-grl4h\" (UID: \"4065ac08-9c62-48db-bbfe-9e53ab7d5463\") " pod="openshift-marketplace/redhat-operators-grl4h" Feb 20 09:57:41 crc kubenswrapper[4962]: I0220 09:57:41.963636 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4065ac08-9c62-48db-bbfe-9e53ab7d5463-catalog-content\") pod \"redhat-operators-grl4h\" (UID: \"4065ac08-9c62-48db-bbfe-9e53ab7d5463\") " pod="openshift-marketplace/redhat-operators-grl4h" Feb 20 09:57:41 crc kubenswrapper[4962]: I0220 09:57:41.963769 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4065ac08-9c62-48db-bbfe-9e53ab7d5463-utilities\") pod \"redhat-operators-grl4h\" (UID: \"4065ac08-9c62-48db-bbfe-9e53ab7d5463\") " pod="openshift-marketplace/redhat-operators-grl4h" Feb 20 09:57:41 crc kubenswrapper[4962]: I0220 09:57:41.965342 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-tcwqj" Feb 20 09:57:41 crc kubenswrapper[4962]: I0220 09:57:41.968051 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-tcwqj" Feb 20 09:57:42 crc kubenswrapper[4962]: I0220 09:57:42.065498 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4065ac08-9c62-48db-bbfe-9e53ab7d5463-utilities\") pod \"redhat-operators-grl4h\" (UID: \"4065ac08-9c62-48db-bbfe-9e53ab7d5463\") " pod="openshift-marketplace/redhat-operators-grl4h" Feb 20 09:57:42 crc kubenswrapper[4962]: I0220 09:57:42.065568 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w94dl\" (UniqueName: \"kubernetes.io/projected/4065ac08-9c62-48db-bbfe-9e53ab7d5463-kube-api-access-w94dl\") pod \"redhat-operators-grl4h\" (UID: \"4065ac08-9c62-48db-bbfe-9e53ab7d5463\") " pod="openshift-marketplace/redhat-operators-grl4h" Feb 20 09:57:42 crc kubenswrapper[4962]: I0220 09:57:42.065588 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4065ac08-9c62-48db-bbfe-9e53ab7d5463-catalog-content\") pod \"redhat-operators-grl4h\" (UID: \"4065ac08-9c62-48db-bbfe-9e53ab7d5463\") " pod="openshift-marketplace/redhat-operators-grl4h" Feb 20 09:57:42 crc kubenswrapper[4962]: I0220 09:57:42.066020 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4065ac08-9c62-48db-bbfe-9e53ab7d5463-utilities\") pod \"redhat-operators-grl4h\" (UID: \"4065ac08-9c62-48db-bbfe-9e53ab7d5463\") " 
pod="openshift-marketplace/redhat-operators-grl4h" Feb 20 09:57:42 crc kubenswrapper[4962]: I0220 09:57:42.066271 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4065ac08-9c62-48db-bbfe-9e53ab7d5463-catalog-content\") pod \"redhat-operators-grl4h\" (UID: \"4065ac08-9c62-48db-bbfe-9e53ab7d5463\") " pod="openshift-marketplace/redhat-operators-grl4h" Feb 20 09:57:42 crc kubenswrapper[4962]: I0220 09:57:42.087994 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w94dl\" (UniqueName: \"kubernetes.io/projected/4065ac08-9c62-48db-bbfe-9e53ab7d5463-kube-api-access-w94dl\") pod \"redhat-operators-grl4h\" (UID: \"4065ac08-9c62-48db-bbfe-9e53ab7d5463\") " pod="openshift-marketplace/redhat-operators-grl4h" Feb 20 09:57:42 crc kubenswrapper[4962]: I0220 09:57:42.195914 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-grl4h" Feb 20 09:57:42 crc kubenswrapper[4962]: I0220 09:57:42.259665 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zwkjb"] Feb 20 09:57:42 crc kubenswrapper[4962]: I0220 09:57:42.485017 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-grl4h"] Feb 20 09:57:42 crc kubenswrapper[4962]: I0220 09:57:42.494406 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2gxn5" event={"ID":"a313fb19-8615-43b7-a19a-df83e50410ba","Type":"ContainerStarted","Data":"1d342c2b45fca81302a1d91ac4539bc994004d9b4834928e5a7a0d50c28cc22c"} Feb 20 09:57:42 crc kubenswrapper[4962]: I0220 09:57:42.496699 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zwkjb" event={"ID":"77564a1c-aefc-4caf-86d9-55c2ef795bb7","Type":"ContainerStarted","Data":"962bcd42638f9814f3627f1f0129094057257d45837d02365bb3acaa7e0e1287"} Feb 20 09:57:42 crc kubenswrapper[4962]: W0220 09:57:42.499498 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4065ac08_9c62_48db_bbfe_9e53ab7d5463.slice/crio-50d62aa547d31448aec488d9bebb813e5df408a7d3796523fe1332f39c82e601 WatchSource:0}: Error finding container 50d62aa547d31448aec488d9bebb813e5df408a7d3796523fe1332f39c82e601: Status 404 returned error can't find the container with id 50d62aa547d31448aec488d9bebb813e5df408a7d3796523fe1332f39c82e601 Feb 20 09:57:43 crc kubenswrapper[4962]: I0220 09:57:43.176078 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 20 09:57:43 crc kubenswrapper[4962]: I0220 09:57:43.177180 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 20 09:57:43 crc kubenswrapper[4962]: I0220 09:57:43.181496 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 20 09:57:43 crc kubenswrapper[4962]: I0220 09:57:43.181921 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 20 09:57:43 crc kubenswrapper[4962]: I0220 09:57:43.186762 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 20 09:57:43 crc kubenswrapper[4962]: I0220 09:57:43.289819 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4d9655a2-2b08-4827-8126-160f62910b6f-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"4d9655a2-2b08-4827-8126-160f62910b6f\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 20 09:57:43 crc kubenswrapper[4962]: I0220 09:57:43.289898 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4d9655a2-2b08-4827-8126-160f62910b6f-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"4d9655a2-2b08-4827-8126-160f62910b6f\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 20 09:57:43 crc kubenswrapper[4962]: I0220 09:57:43.391546 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4d9655a2-2b08-4827-8126-160f62910b6f-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"4d9655a2-2b08-4827-8126-160f62910b6f\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 20 09:57:43 crc kubenswrapper[4962]: I0220 09:57:43.391648 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4d9655a2-2b08-4827-8126-160f62910b6f-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"4d9655a2-2b08-4827-8126-160f62910b6f\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 20 09:57:43 crc kubenswrapper[4962]: I0220 09:57:43.391783 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4d9655a2-2b08-4827-8126-160f62910b6f-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"4d9655a2-2b08-4827-8126-160f62910b6f\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 20 09:57:43 crc kubenswrapper[4962]: I0220 09:57:43.413584 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4d9655a2-2b08-4827-8126-160f62910b6f-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"4d9655a2-2b08-4827-8126-160f62910b6f\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 20 09:57:43 crc kubenswrapper[4962]: I0220 09:57:43.514549 4962 generic.go:334] "Generic (PLEG): container finished" podID="77564a1c-aefc-4caf-86d9-55c2ef795bb7" containerID="7bd10cd083ce022229fb704a11e95a4c9966b71b647ed8df13a05a150919c6d1" exitCode=0 Feb 20 09:57:43 crc kubenswrapper[4962]: I0220 09:57:43.514645 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zwkjb" event={"ID":"77564a1c-aefc-4caf-86d9-55c2ef795bb7","Type":"ContainerDied","Data":"7bd10cd083ce022229fb704a11e95a4c9966b71b647ed8df13a05a150919c6d1"} Feb 20 
09:57:43 crc kubenswrapper[4962]: I0220 09:57:43.522163 4962 generic.go:334] "Generic (PLEG): container finished" podID="4065ac08-9c62-48db-bbfe-9e53ab7d5463" containerID="8158f54e7f263dc473b3aa7c7a0204ebc0dcdf73eaeb229d1a5efae6eb63d973" exitCode=0 Feb 20 09:57:43 crc kubenswrapper[4962]: I0220 09:57:43.522228 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-grl4h" event={"ID":"4065ac08-9c62-48db-bbfe-9e53ab7d5463","Type":"ContainerDied","Data":"8158f54e7f263dc473b3aa7c7a0204ebc0dcdf73eaeb229d1a5efae6eb63d973"} Feb 20 09:57:43 crc kubenswrapper[4962]: I0220 09:57:43.522254 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-grl4h" event={"ID":"4065ac08-9c62-48db-bbfe-9e53ab7d5463","Type":"ContainerStarted","Data":"50d62aa547d31448aec488d9bebb813e5df408a7d3796523fe1332f39c82e601"} Feb 20 09:57:43 crc kubenswrapper[4962]: I0220 09:57:43.526350 4962 generic.go:334] "Generic (PLEG): container finished" podID="a313fb19-8615-43b7-a19a-df83e50410ba" containerID="67d79612188b2d4ac629f49494e406f3c95c85018ab3966df8e32115cfac1740" exitCode=0 Feb 20 09:57:43 crc kubenswrapper[4962]: I0220 09:57:43.526394 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2gxn5" event={"ID":"a313fb19-8615-43b7-a19a-df83e50410ba","Type":"ContainerDied","Data":"67d79612188b2d4ac629f49494e406f3c95c85018ab3966df8e32115cfac1740"} Feb 20 09:57:43 crc kubenswrapper[4962]: I0220 09:57:43.538306 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 20 09:57:43 crc kubenswrapper[4962]: I0220 09:57:43.885148 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 20 09:57:43 crc kubenswrapper[4962]: W0220 09:57:43.905975 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod4d9655a2_2b08_4827_8126_160f62910b6f.slice/crio-4d2c913b4ac0fae2cf204a2ad08d4804962ccd049601f64fecbf3078b5e329aa WatchSource:0}: Error finding container 4d2c913b4ac0fae2cf204a2ad08d4804962ccd049601f64fecbf3078b5e329aa: Status 404 returned error can't find the container with id 4d2c913b4ac0fae2cf204a2ad08d4804962ccd049601f64fecbf3078b5e329aa Feb 20 09:57:44 crc kubenswrapper[4962]: I0220 09:57:44.543487 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"4d9655a2-2b08-4827-8126-160f62910b6f","Type":"ContainerStarted","Data":"4d2c913b4ac0fae2cf204a2ad08d4804962ccd049601f64fecbf3078b5e329aa"} Feb 20 09:57:45 crc kubenswrapper[4962]: I0220 09:57:45.558010 4962 generic.go:334] "Generic (PLEG): container finished" podID="4d9655a2-2b08-4827-8126-160f62910b6f" containerID="919e1eeced82919b575c925e5979fd4022daa4fe766b1bf035955e1bec3ef962" exitCode=0 Feb 20 09:57:45 crc kubenswrapper[4962]: I0220 09:57:45.558112 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"4d9655a2-2b08-4827-8126-160f62910b6f","Type":"ContainerDied","Data":"919e1eeced82919b575c925e5979fd4022daa4fe766b1bf035955e1bec3ef962"} Feb 20 09:57:45 crc kubenswrapper[4962]: I0220 09:57:45.768127 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-4mw9f" Feb 20 09:57:49 crc kubenswrapper[4962]: I0220 09:57:49.609104 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-console/downloads-7954f5f757-tv8j9" Feb 20 09:57:50 crc kubenswrapper[4962]: I0220 09:57:50.618750 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-nwfk6" Feb 20 09:57:50 crc kubenswrapper[4962]: I0220 09:57:50.627375 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-nwfk6" Feb 20 09:57:51 crc kubenswrapper[4962]: I0220 09:57:51.759469 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d590527b-ed56-4fb4-a712-b09781618a76-metrics-certs\") pod \"network-metrics-daemon-5bwk2\" (UID: \"d590527b-ed56-4fb4-a712-b09781618a76\") " pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:57:51 crc kubenswrapper[4962]: I0220 09:57:51.767392 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d590527b-ed56-4fb4-a712-b09781618a76-metrics-certs\") pod \"network-metrics-daemon-5bwk2\" (UID: \"d590527b-ed56-4fb4-a712-b09781618a76\") " pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:57:52 crc kubenswrapper[4962]: I0220 09:57:52.058939 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bwk2" Feb 20 09:57:58 crc kubenswrapper[4962]: I0220 09:57:58.740565 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 09:57:58 crc kubenswrapper[4962]: I0220 09:57:58.992673 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 20 09:57:59 crc kubenswrapper[4962]: I0220 09:57:59.076365 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4d9655a2-2b08-4827-8126-160f62910b6f-kube-api-access\") pod \"4d9655a2-2b08-4827-8126-160f62910b6f\" (UID: \"4d9655a2-2b08-4827-8126-160f62910b6f\") " Feb 20 09:57:59 crc kubenswrapper[4962]: I0220 09:57:59.076476 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4d9655a2-2b08-4827-8126-160f62910b6f-kubelet-dir\") pod \"4d9655a2-2b08-4827-8126-160f62910b6f\" (UID: \"4d9655a2-2b08-4827-8126-160f62910b6f\") " Feb 20 09:57:59 crc kubenswrapper[4962]: I0220 09:57:59.076663 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4d9655a2-2b08-4827-8126-160f62910b6f-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "4d9655a2-2b08-4827-8126-160f62910b6f" (UID: "4d9655a2-2b08-4827-8126-160f62910b6f"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 09:57:59 crc kubenswrapper[4962]: I0220 09:57:59.076920 4962 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4d9655a2-2b08-4827-8126-160f62910b6f-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 20 09:57:59 crc kubenswrapper[4962]: I0220 09:57:59.081839 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d9655a2-2b08-4827-8126-160f62910b6f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "4d9655a2-2b08-4827-8126-160f62910b6f" (UID: "4d9655a2-2b08-4827-8126-160f62910b6f"). 
InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:57:59 crc kubenswrapper[4962]: I0220 09:57:59.178637 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4d9655a2-2b08-4827-8126-160f62910b6f-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 20 09:57:59 crc kubenswrapper[4962]: I0220 09:57:59.715917 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"4d9655a2-2b08-4827-8126-160f62910b6f","Type":"ContainerDied","Data":"4d2c913b4ac0fae2cf204a2ad08d4804962ccd049601f64fecbf3078b5e329aa"} Feb 20 09:57:59 crc kubenswrapper[4962]: I0220 09:57:59.715971 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d2c913b4ac0fae2cf204a2ad08d4804962ccd049601f64fecbf3078b5e329aa" Feb 20 09:57:59 crc kubenswrapper[4962]: I0220 09:57:59.716001 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 20 09:58:10 crc kubenswrapper[4962]: I0220 09:58:10.707975 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mxcd2" Feb 20 09:58:11 crc kubenswrapper[4962]: I0220 09:58:11.508644 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 09:58:11 crc kubenswrapper[4962]: I0220 09:58:11.508719 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 09:58:12 crc kubenswrapper[4962]: E0220 09:58:12.025039 4962 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Feb 20 09:58:12 crc kubenswrapper[4962]: E0220 09:58:12.025754 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-58dmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-d4pzn_openshift-marketplace(1c487f78-6735-4114-a45a-6c60ccef5983): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 20 09:58:12 crc kubenswrapper[4962]: E0220 09:58:12.027216 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-d4pzn" podUID="1c487f78-6735-4114-a45a-6c60ccef5983" Feb 20 09:58:12 crc kubenswrapper[4962]: E0220 09:58:12.028311 4962 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Feb 20 09:58:12 crc kubenswrapper[4962]: E0220 09:58:12.028529 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vspvg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-6q5bk_openshift-marketplace(e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 20 09:58:12 crc kubenswrapper[4962]: E0220 09:58:12.029748 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-6q5bk" podUID="e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b" Feb 20 09:58:12 crc kubenswrapper[4962]: E0220 09:58:12.178672 4962 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Feb 20 09:58:12 crc kubenswrapper[4962]: E0220 09:58:12.180008 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dszmr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-5qdmt_openshift-marketplace(805f4075-7fda-4a54-882f-c4fd160148a4): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 20 09:58:12 crc kubenswrapper[4962]: E0220 09:58:12.181131 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-5qdmt" podUID="805f4075-7fda-4a54-882f-c4fd160148a4" Feb 20 09:58:12 crc kubenswrapper[4962]: I0220 09:58:12.323224 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-5bwk2"] Feb 20 09:58:12 crc kubenswrapper[4962]: W0220 09:58:12.332901 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd590527b_ed56_4fb4_a712_b09781618a76.slice/crio-23638ae3ed1265eaf40439d0fec35a2389a58acb09ed9f8dbb4e361c39e741c9 WatchSource:0}: Error finding container 23638ae3ed1265eaf40439d0fec35a2389a58acb09ed9f8dbb4e361c39e741c9: Status 404 returned error can't find the container with id 23638ae3ed1265eaf40439d0fec35a2389a58acb09ed9f8dbb4e361c39e741c9 Feb 20 09:58:12 crc kubenswrapper[4962]: I0220 09:58:12.807560 4962 generic.go:334] "Generic (PLEG): container finished" podID="ee660135-f5e2-420e-a242-440471e57da2" containerID="8dd9df9f917ccc178d3800d351c7071cb535c428d75ebffc1bc016597dd217d8" exitCode=0 Feb 20 09:58:12 crc kubenswrapper[4962]: I0220 09:58:12.807651 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sxxjg" event={"ID":"ee660135-f5e2-420e-a242-440471e57da2","Type":"ContainerDied","Data":"8dd9df9f917ccc178d3800d351c7071cb535c428d75ebffc1bc016597dd217d8"} Feb 20 09:58:12 crc kubenswrapper[4962]: I0220 09:58:12.810690 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-grl4h" 
event={"ID":"4065ac08-9c62-48db-bbfe-9e53ab7d5463","Type":"ContainerStarted","Data":"077fd78654c04cefa692b832e5d39f0ad8125a113696468af2f8fd05f987f05c"} Feb 20 09:58:12 crc kubenswrapper[4962]: I0220 09:58:12.814910 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-5bwk2" event={"ID":"d590527b-ed56-4fb4-a712-b09781618a76","Type":"ContainerStarted","Data":"c2573cf807293be7622d57a2e5ba8fcbb04759292a41e15cc267e2a1185db35c"} Feb 20 09:58:12 crc kubenswrapper[4962]: I0220 09:58:12.815058 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-5bwk2" event={"ID":"d590527b-ed56-4fb4-a712-b09781618a76","Type":"ContainerStarted","Data":"23638ae3ed1265eaf40439d0fec35a2389a58acb09ed9f8dbb4e361c39e741c9"} Feb 20 09:58:12 crc kubenswrapper[4962]: I0220 09:58:12.818001 4962 generic.go:334] "Generic (PLEG): container finished" podID="a313fb19-8615-43b7-a19a-df83e50410ba" containerID="0524942f9aa37d9ad7a2475893e19f232848fd053a188018fd7e95cce3d53a2e" exitCode=0 Feb 20 09:58:12 crc kubenswrapper[4962]: I0220 09:58:12.818103 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2gxn5" event={"ID":"a313fb19-8615-43b7-a19a-df83e50410ba","Type":"ContainerDied","Data":"0524942f9aa37d9ad7a2475893e19f232848fd053a188018fd7e95cce3d53a2e"} Feb 20 09:58:12 crc kubenswrapper[4962]: I0220 09:58:12.821552 4962 generic.go:334] "Generic (PLEG): container finished" podID="2f414667-865d-4c89-b470-50f61a11b60e" containerID="cb6843df783564d40f39b05096d7a2c2fbad16d3934305a8cb72a8d5cbd3114a" exitCode=0 Feb 20 09:58:12 crc kubenswrapper[4962]: I0220 09:58:12.821615 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x4hxs" event={"ID":"2f414667-865d-4c89-b470-50f61a11b60e","Type":"ContainerDied","Data":"cb6843df783564d40f39b05096d7a2c2fbad16d3934305a8cb72a8d5cbd3114a"} Feb 20 09:58:12 crc kubenswrapper[4962]: I0220 09:58:12.824907 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zwkjb" event={"ID":"77564a1c-aefc-4caf-86d9-55c2ef795bb7","Type":"ContainerStarted","Data":"1a60d207578330039d031b53f48d4fc073b3f6459c7aa8e38e584d0a445150eb"} Feb 20 09:58:12 crc kubenswrapper[4962]: E0220 09:58:12.829522 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-d4pzn" podUID="1c487f78-6735-4114-a45a-6c60ccef5983" Feb 20 09:58:12 crc kubenswrapper[4962]: E0220 09:58:12.830077 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-6q5bk" podUID="e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b" Feb 20 09:58:12 crc kubenswrapper[4962]: E0220 09:58:12.836378 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-5qdmt" podUID="805f4075-7fda-4a54-882f-c4fd160148a4" Feb 20 09:58:13 crc kubenswrapper[4962]: I0220 09:58:13.833080 4962 generic.go:334] "Generic (PLEG): 
container finished" podID="77564a1c-aefc-4caf-86d9-55c2ef795bb7" containerID="1a60d207578330039d031b53f48d4fc073b3f6459c7aa8e38e584d0a445150eb" exitCode=0 Feb 20 09:58:13 crc kubenswrapper[4962]: I0220 09:58:13.833129 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zwkjb" event={"ID":"77564a1c-aefc-4caf-86d9-55c2ef795bb7","Type":"ContainerDied","Data":"1a60d207578330039d031b53f48d4fc073b3f6459c7aa8e38e584d0a445150eb"} Feb 20 09:58:13 crc kubenswrapper[4962]: I0220 09:58:13.835677 4962 generic.go:334] "Generic (PLEG): container finished" podID="4065ac08-9c62-48db-bbfe-9e53ab7d5463" containerID="077fd78654c04cefa692b832e5d39f0ad8125a113696468af2f8fd05f987f05c" exitCode=0 Feb 20 09:58:13 crc kubenswrapper[4962]: I0220 09:58:13.835736 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-grl4h" event={"ID":"4065ac08-9c62-48db-bbfe-9e53ab7d5463","Type":"ContainerDied","Data":"077fd78654c04cefa692b832e5d39f0ad8125a113696468af2f8fd05f987f05c"} Feb 20 09:58:13 crc kubenswrapper[4962]: I0220 09:58:13.839305 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-5bwk2" event={"ID":"d590527b-ed56-4fb4-a712-b09781618a76","Type":"ContainerStarted","Data":"f9c20c25cd1b447d8c70fd86425c66912c97841c988de55739723e06f486c18f"} Feb 20 09:58:13 crc kubenswrapper[4962]: I0220 09:58:13.888826 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-5bwk2" podStartSLOduration=164.888804492 podStartE2EDuration="2m44.888804492s" podCreationTimestamp="2026-02-20 09:55:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:58:13.885819627 +0000 UTC m=+185.468291473" watchObservedRunningTime="2026-02-20 09:58:13.888804492 +0000 UTC m=+185.471276338" Feb 20 09:58:15 crc kubenswrapper[4962]: I0220 09:58:15.859650 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sxxjg" event={"ID":"ee660135-f5e2-420e-a242-440471e57da2","Type":"ContainerStarted","Data":"c35279cd4b22f47dcf9a72e50cad12450bbbfefd1eb056f72fee9a7b914b6849"} Feb 20 09:58:15 crc kubenswrapper[4962]: I0220 09:58:15.863876 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-grl4h" event={"ID":"4065ac08-9c62-48db-bbfe-9e53ab7d5463","Type":"ContainerStarted","Data":"e4ecdf4a3c339870dec24d15d23037eaffd696da078f222cbf4a6c2e3f4d4b42"} Feb 20 09:58:15 crc kubenswrapper[4962]: I0220 09:58:15.866407 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2gxn5" event={"ID":"a313fb19-8615-43b7-a19a-df83e50410ba","Type":"ContainerStarted","Data":"86f269adf873c3d131ecc0aa7aa18aecc38f46b209e217489bedfd8601293824"} Feb 20 09:58:15 crc kubenswrapper[4962]: I0220 09:58:15.872640 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x4hxs" event={"ID":"2f414667-865d-4c89-b470-50f61a11b60e","Type":"ContainerStarted","Data":"4ce202551bb2eddc69f068b448a322418266ac5a3082344b5fcff013fbcae786"} Feb 20 09:58:15 crc kubenswrapper[4962]: I0220 09:58:15.874753 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zwkjb" 
event={"ID":"77564a1c-aefc-4caf-86d9-55c2ef795bb7","Type":"ContainerStarted","Data":"24817e71e406b99f371ea185e4598cf17fd11efcec8292441f76250e784ed315"} Feb 20 09:58:15 crc kubenswrapper[4962]: I0220 09:58:15.899129 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-sxxjg" podStartSLOduration=3.062735422 podStartE2EDuration="37.8991146s" podCreationTimestamp="2026-02-20 09:57:38 +0000 UTC" firstStartedPulling="2026-02-20 09:57:40.455767942 +0000 UTC m=+152.038239778" lastFinishedPulling="2026-02-20 09:58:15.29214706 +0000 UTC m=+186.874618956" observedRunningTime="2026-02-20 09:58:15.884331731 +0000 UTC m=+187.466803577" watchObservedRunningTime="2026-02-20 09:58:15.8991146 +0000 UTC m=+187.481586446" Feb 20 09:58:15 crc kubenswrapper[4962]: I0220 09:58:15.926927 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-grl4h" podStartSLOduration=3.019085772 podStartE2EDuration="34.9268875s" podCreationTimestamp="2026-02-20 09:57:41 +0000 UTC" firstStartedPulling="2026-02-20 09:57:43.524816422 +0000 UTC m=+155.107288268" lastFinishedPulling="2026-02-20 09:58:15.43261815 +0000 UTC m=+187.015089996" observedRunningTime="2026-02-20 09:58:15.92218267 +0000 UTC m=+187.504654526" watchObservedRunningTime="2026-02-20 09:58:15.9268875 +0000 UTC m=+187.509359346" Feb 20 09:58:15 crc kubenswrapper[4962]: I0220 09:58:15.927436 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zwkjb" podStartSLOduration=3.039981371 podStartE2EDuration="34.927429916s" podCreationTimestamp="2026-02-20 09:57:41 +0000 UTC" firstStartedPulling="2026-02-20 09:57:43.518437539 +0000 UTC m=+155.100909385" lastFinishedPulling="2026-02-20 09:58:15.405886084 +0000 UTC m=+186.988357930" observedRunningTime="2026-02-20 09:58:15.903693135 +0000 UTC m=+187.486164981" watchObservedRunningTime="2026-02-20 09:58:15.927429916 +0000 UTC m=+187.509901762" Feb 20 09:58:15 crc kubenswrapper[4962]: I0220 09:58:15.940144 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2gxn5" podStartSLOduration=4.81796757 podStartE2EDuration="35.940129029s" podCreationTimestamp="2026-02-20 09:57:40 +0000 UTC" firstStartedPulling="2026-02-20 09:57:43.529038195 +0000 UTC m=+155.111510041" lastFinishedPulling="2026-02-20 09:58:14.651199654 +0000 UTC m=+186.233671500" observedRunningTime="2026-02-20 09:58:15.940000005 +0000 UTC m=+187.522471861" watchObservedRunningTime="2026-02-20 09:58:15.940129029 +0000 UTC m=+187.522600875" Feb 20 09:58:15 crc kubenswrapper[4962]: I0220 09:58:15.962086 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-x4hxs" podStartSLOduration=2.06995262 podStartE2EDuration="35.962071064s" podCreationTimestamp="2026-02-20 09:57:40 +0000 UTC" firstStartedPulling="2026-02-20 09:57:41.487406294 +0000 UTC m=+153.069878140" lastFinishedPulling="2026-02-20 09:58:15.379524738 +0000 UTC m=+186.961996584" observedRunningTime="2026-02-20 09:58:15.960398681 +0000 UTC m=+187.542870527" watchObservedRunningTime="2026-02-20 09:58:15.962071064 +0000 UTC m=+187.544542900" Feb 20 09:58:16 crc kubenswrapper[4962]: I0220 09:58:16.405149 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 09:58:18 crc kubenswrapper[4962]: I0220 09:58:18.580454 
4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-sxxjg" Feb 20 09:58:18 crc kubenswrapper[4962]: I0220 09:58:18.580567 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-sxxjg" Feb 20 09:58:18 crc kubenswrapper[4962]: I0220 09:58:18.717302 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-sxxjg" Feb 20 09:58:18 crc kubenswrapper[4962]: I0220 09:58:18.829685 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-mrzbm"] Feb 20 09:58:20 crc kubenswrapper[4962]: I0220 09:58:20.358815 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 20 09:58:20 crc kubenswrapper[4962]: E0220 09:58:20.359118 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d9655a2-2b08-4827-8126-160f62910b6f" containerName="pruner" Feb 20 09:58:20 crc kubenswrapper[4962]: I0220 09:58:20.359133 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d9655a2-2b08-4827-8126-160f62910b6f" containerName="pruner" Feb 20 09:58:20 crc kubenswrapper[4962]: I0220 09:58:20.359341 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d9655a2-2b08-4827-8126-160f62910b6f" containerName="pruner" Feb 20 09:58:20 crc kubenswrapper[4962]: I0220 09:58:20.359888 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 20 09:58:20 crc kubenswrapper[4962]: I0220 09:58:20.364885 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 20 09:58:20 crc kubenswrapper[4962]: I0220 09:58:20.365288 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 20 09:58:20 crc kubenswrapper[4962]: I0220 09:58:20.365414 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 20 09:58:20 crc kubenswrapper[4962]: I0220 09:58:20.510526 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5d2fb8bc-5778-47f1-8bad-d7ba5a08848e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"5d2fb8bc-5778-47f1-8bad-d7ba5a08848e\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 20 09:58:20 crc kubenswrapper[4962]: I0220 09:58:20.510577 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5d2fb8bc-5778-47f1-8bad-d7ba5a08848e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"5d2fb8bc-5778-47f1-8bad-d7ba5a08848e\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 20 09:58:20 crc kubenswrapper[4962]: I0220 09:58:20.579995 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-x4hxs" Feb 20 09:58:20 crc kubenswrapper[4962]: I0220 09:58:20.580634 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-x4hxs" Feb 20 09:58:20 crc kubenswrapper[4962]: I0220 09:58:20.611270 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/5d2fb8bc-5778-47f1-8bad-d7ba5a08848e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"5d2fb8bc-5778-47f1-8bad-d7ba5a08848e\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 20 09:58:20 crc kubenswrapper[4962]: I0220 09:58:20.611320 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5d2fb8bc-5778-47f1-8bad-d7ba5a08848e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"5d2fb8bc-5778-47f1-8bad-d7ba5a08848e\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 20 09:58:20 crc kubenswrapper[4962]: I0220 09:58:20.611385 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5d2fb8bc-5778-47f1-8bad-d7ba5a08848e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"5d2fb8bc-5778-47f1-8bad-d7ba5a08848e\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 20 09:58:20 crc kubenswrapper[4962]: I0220 09:58:20.646818 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-x4hxs" Feb 20 09:58:20 crc kubenswrapper[4962]: I0220 09:58:20.721682 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5d2fb8bc-5778-47f1-8bad-d7ba5a08848e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"5d2fb8bc-5778-47f1-8bad-d7ba5a08848e\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 20 09:58:20 crc kubenswrapper[4962]: I0220 09:58:20.944106 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-x4hxs" Feb 20 09:58:20 crc kubenswrapper[4962]: I0220 09:58:20.978265 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 20 09:58:21 crc kubenswrapper[4962]: I0220 09:58:21.246031 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 20 09:58:21 crc kubenswrapper[4962]: I0220 09:58:21.276740 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2gxn5" Feb 20 09:58:21 crc kubenswrapper[4962]: I0220 09:58:21.277018 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2gxn5" Feb 20 09:58:21 crc kubenswrapper[4962]: I0220 09:58:21.328639 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2gxn5" Feb 20 09:58:21 crc kubenswrapper[4962]: I0220 09:58:21.874185 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zwkjb" Feb 20 09:58:21 crc kubenswrapper[4962]: I0220 09:58:21.874561 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zwkjb" Feb 20 09:58:21 crc kubenswrapper[4962]: I0220 09:58:21.907321 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"5d2fb8bc-5778-47f1-8bad-d7ba5a08848e","Type":"ContainerStarted","Data":"b56472da857691a5b95ad0cc186ba642ef9479130c88e2dcbaa014b7f15e8952"} Feb 20 09:58:21 crc kubenswrapper[4962]: I0220 09:58:21.955124 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2gxn5" Feb 20 09:58:22 crc kubenswrapper[4962]: I0220 09:58:22.197221 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-grl4h" Feb 20 09:58:22 crc kubenswrapper[4962]: I0220 09:58:22.197274 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-grl4h" Feb 20 09:58:22 crc kubenswrapper[4962]: I0220 09:58:22.234926 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-grl4h" Feb 20 09:58:22 crc kubenswrapper[4962]: I0220 09:58:22.910659 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zwkjb" podUID="77564a1c-aefc-4caf-86d9-55c2ef795bb7" containerName="registry-server" probeResult="failure" output=< Feb 20 09:58:22 crc kubenswrapper[4962]: timeout: failed to connect service ":50051" within 1s Feb 20 09:58:22 crc kubenswrapper[4962]: > Feb 20 09:58:22 crc kubenswrapper[4962]: I0220 09:58:22.914433 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"5d2fb8bc-5778-47f1-8bad-d7ba5a08848e","Type":"ContainerStarted","Data":"c37f32ab40c4dd0e0fc921352e227a03bbcfb1ba144d4b58dbdfb5d40638c3a6"} Feb 20 09:58:22 crc kubenswrapper[4962]: I0220 09:58:22.930279 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=2.930254821 podStartE2EDuration="2.930254821s" podCreationTimestamp="2026-02-20 09:58:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:58:22.927924357 +0000 UTC m=+194.510396203" watchObservedRunningTime="2026-02-20 09:58:22.930254821 
+0000 UTC m=+194.512726667" Feb 20 09:58:22 crc kubenswrapper[4962]: I0220 09:58:22.956185 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-grl4h" Feb 20 09:58:23 crc kubenswrapper[4962]: I0220 09:58:23.654920 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2gxn5"] Feb 20 09:58:23 crc kubenswrapper[4962]: I0220 09:58:23.920643 4962 generic.go:334] "Generic (PLEG): container finished" podID="5d2fb8bc-5778-47f1-8bad-d7ba5a08848e" containerID="c37f32ab40c4dd0e0fc921352e227a03bbcfb1ba144d4b58dbdfb5d40638c3a6" exitCode=0 Feb 20 09:58:23 crc kubenswrapper[4962]: I0220 09:58:23.921838 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"5d2fb8bc-5778-47f1-8bad-d7ba5a08848e","Type":"ContainerDied","Data":"c37f32ab40c4dd0e0fc921352e227a03bbcfb1ba144d4b58dbdfb5d40638c3a6"} Feb 20 09:58:23 crc kubenswrapper[4962]: I0220 09:58:23.922022 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2gxn5" podUID="a313fb19-8615-43b7-a19a-df83e50410ba" containerName="registry-server" containerID="cri-o://86f269adf873c3d131ecc0aa7aa18aecc38f46b209e217489bedfd8601293824" gracePeriod=2 Feb 20 09:58:24 crc kubenswrapper[4962]: I0220 09:58:24.654699 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-grl4h"] Feb 20 09:58:24 crc kubenswrapper[4962]: I0220 09:58:24.927421 4962 generic.go:334] "Generic (PLEG): container finished" podID="a313fb19-8615-43b7-a19a-df83e50410ba" containerID="86f269adf873c3d131ecc0aa7aa18aecc38f46b209e217489bedfd8601293824" exitCode=0 Feb 20 09:58:24 crc kubenswrapper[4962]: I0220 09:58:24.927506 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2gxn5" event={"ID":"a313fb19-8615-43b7-a19a-df83e50410ba","Type":"ContainerDied","Data":"86f269adf873c3d131ecc0aa7aa18aecc38f46b209e217489bedfd8601293824"} Feb 20 09:58:24 crc kubenswrapper[4962]: I0220 09:58:24.927674 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-grl4h" podUID="4065ac08-9c62-48db-bbfe-9e53ab7d5463" containerName="registry-server" containerID="cri-o://e4ecdf4a3c339870dec24d15d23037eaffd696da078f222cbf4a6c2e3f4d4b42" gracePeriod=2 Feb 20 09:58:25 crc kubenswrapper[4962]: I0220 09:58:25.231292 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 20 09:58:25 crc kubenswrapper[4962]: I0220 09:58:25.257976 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2gxn5" Feb 20 09:58:25 crc kubenswrapper[4962]: I0220 09:58:25.384774 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5d2fb8bc-5778-47f1-8bad-d7ba5a08848e-kubelet-dir\") pod \"5d2fb8bc-5778-47f1-8bad-d7ba5a08848e\" (UID: \"5d2fb8bc-5778-47f1-8bad-d7ba5a08848e\") " Feb 20 09:58:25 crc kubenswrapper[4962]: I0220 09:58:25.384861 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a313fb19-8615-43b7-a19a-df83e50410ba-catalog-content\") pod \"a313fb19-8615-43b7-a19a-df83e50410ba\" (UID: \"a313fb19-8615-43b7-a19a-df83e50410ba\") " Feb 20 09:58:25 crc kubenswrapper[4962]: I0220 09:58:25.384886 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5d2fb8bc-5778-47f1-8bad-d7ba5a08848e-kube-api-access\") pod \"5d2fb8bc-5778-47f1-8bad-d7ba5a08848e\" (UID: \"5d2fb8bc-5778-47f1-8bad-d7ba5a08848e\") " Feb 20 09:58:25 crc kubenswrapper[4962]: I0220 09:58:25.384902 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5d2fb8bc-5778-47f1-8bad-d7ba5a08848e-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "5d2fb8bc-5778-47f1-8bad-d7ba5a08848e" (UID: "5d2fb8bc-5778-47f1-8bad-d7ba5a08848e"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 09:58:25 crc kubenswrapper[4962]: I0220 09:58:25.384948 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a313fb19-8615-43b7-a19a-df83e50410ba-utilities\") pod \"a313fb19-8615-43b7-a19a-df83e50410ba\" (UID: \"a313fb19-8615-43b7-a19a-df83e50410ba\") " Feb 20 09:58:25 crc kubenswrapper[4962]: I0220 09:58:25.385012 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvgwq\" (UniqueName: \"kubernetes.io/projected/a313fb19-8615-43b7-a19a-df83e50410ba-kube-api-access-vvgwq\") pod \"a313fb19-8615-43b7-a19a-df83e50410ba\" (UID: \"a313fb19-8615-43b7-a19a-df83e50410ba\") " Feb 20 09:58:25 crc kubenswrapper[4962]: I0220 09:58:25.385275 4962 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5d2fb8bc-5778-47f1-8bad-d7ba5a08848e-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 20 09:58:25 crc kubenswrapper[4962]: I0220 09:58:25.385914 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a313fb19-8615-43b7-a19a-df83e50410ba-utilities" (OuterVolumeSpecName: "utilities") pod "a313fb19-8615-43b7-a19a-df83e50410ba" (UID: "a313fb19-8615-43b7-a19a-df83e50410ba"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:58:25 crc kubenswrapper[4962]: I0220 09:58:25.391442 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d2fb8bc-5778-47f1-8bad-d7ba5a08848e-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "5d2fb8bc-5778-47f1-8bad-d7ba5a08848e" (UID: "5d2fb8bc-5778-47f1-8bad-d7ba5a08848e"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:58:25 crc kubenswrapper[4962]: I0220 09:58:25.391556 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a313fb19-8615-43b7-a19a-df83e50410ba-kube-api-access-vvgwq" (OuterVolumeSpecName: "kube-api-access-vvgwq") pod "a313fb19-8615-43b7-a19a-df83e50410ba" (UID: "a313fb19-8615-43b7-a19a-df83e50410ba"). InnerVolumeSpecName "kube-api-access-vvgwq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:58:25 crc kubenswrapper[4962]: I0220 09:58:25.409787 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a313fb19-8615-43b7-a19a-df83e50410ba-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a313fb19-8615-43b7-a19a-df83e50410ba" (UID: "a313fb19-8615-43b7-a19a-df83e50410ba"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:58:25 crc kubenswrapper[4962]: I0220 09:58:25.486221 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvgwq\" (UniqueName: \"kubernetes.io/projected/a313fb19-8615-43b7-a19a-df83e50410ba-kube-api-access-vvgwq\") on node \"crc\" DevicePath \"\"" Feb 20 09:58:25 crc kubenswrapper[4962]: I0220 09:58:25.486260 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a313fb19-8615-43b7-a19a-df83e50410ba-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 09:58:25 crc kubenswrapper[4962]: I0220 09:58:25.486273 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5d2fb8bc-5778-47f1-8bad-d7ba5a08848e-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 20 09:58:25 crc kubenswrapper[4962]: I0220 09:58:25.486283 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a313fb19-8615-43b7-a19a-df83e50410ba-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 09:58:25 crc kubenswrapper[4962]: I0220 09:58:25.934368 4962 generic.go:334] "Generic (PLEG): container finished" podID="4065ac08-9c62-48db-bbfe-9e53ab7d5463" containerID="e4ecdf4a3c339870dec24d15d23037eaffd696da078f222cbf4a6c2e3f4d4b42" exitCode=0 Feb 20 09:58:25 crc kubenswrapper[4962]: I0220 09:58:25.934449 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-grl4h" event={"ID":"4065ac08-9c62-48db-bbfe-9e53ab7d5463","Type":"ContainerDied","Data":"e4ecdf4a3c339870dec24d15d23037eaffd696da078f222cbf4a6c2e3f4d4b42"} Feb 20 09:58:25 crc kubenswrapper[4962]: I0220 09:58:25.937021 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2gxn5" event={"ID":"a313fb19-8615-43b7-a19a-df83e50410ba","Type":"ContainerDied","Data":"1d342c2b45fca81302a1d91ac4539bc994004d9b4834928e5a7a0d50c28cc22c"} Feb 20 09:58:25 crc kubenswrapper[4962]: I0220 09:58:25.937072 4962 scope.go:117] "RemoveContainer" containerID="86f269adf873c3d131ecc0aa7aa18aecc38f46b209e217489bedfd8601293824" Feb 20 09:58:25 crc kubenswrapper[4962]: I0220 09:58:25.937239 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2gxn5" Feb 20 09:58:25 crc kubenswrapper[4962]: I0220 09:58:25.939247 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"5d2fb8bc-5778-47f1-8bad-d7ba5a08848e","Type":"ContainerDied","Data":"b56472da857691a5b95ad0cc186ba642ef9479130c88e2dcbaa014b7f15e8952"} Feb 20 09:58:25 crc kubenswrapper[4962]: I0220 09:58:25.939275 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b56472da857691a5b95ad0cc186ba642ef9479130c88e2dcbaa014b7f15e8952" Feb 20 09:58:25 crc kubenswrapper[4962]: I0220 09:58:25.939838 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 20 09:58:25 crc kubenswrapper[4962]: I0220 09:58:25.957520 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 20 09:58:25 crc kubenswrapper[4962]: E0220 09:58:25.957884 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a313fb19-8615-43b7-a19a-df83e50410ba" containerName="extract-utilities" Feb 20 09:58:25 crc kubenswrapper[4962]: I0220 09:58:25.957896 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="a313fb19-8615-43b7-a19a-df83e50410ba" containerName="extract-utilities" Feb 20 09:58:25 crc kubenswrapper[4962]: E0220 09:58:25.957923 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d2fb8bc-5778-47f1-8bad-d7ba5a08848e" containerName="pruner" Feb 20 09:58:25 crc kubenswrapper[4962]: I0220 09:58:25.957930 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d2fb8bc-5778-47f1-8bad-d7ba5a08848e" containerName="pruner" Feb 20 09:58:25 crc kubenswrapper[4962]: E0220 09:58:25.957938 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a313fb19-8615-43b7-a19a-df83e50410ba" containerName="extract-content" Feb 20 09:58:25 crc kubenswrapper[4962]: I0220 09:58:25.957944 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="a313fb19-8615-43b7-a19a-df83e50410ba" containerName="extract-content" Feb 20 09:58:25 crc kubenswrapper[4962]: E0220 09:58:25.957953 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a313fb19-8615-43b7-a19a-df83e50410ba" containerName="registry-server" Feb 20 09:58:25 crc kubenswrapper[4962]: I0220 09:58:25.957958 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="a313fb19-8615-43b7-a19a-df83e50410ba" containerName="registry-server" Feb 20 09:58:25 crc kubenswrapper[4962]: I0220 09:58:25.958050 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d2fb8bc-5778-47f1-8bad-d7ba5a08848e" containerName="pruner" Feb 20 09:58:25 crc kubenswrapper[4962]: I0220 09:58:25.958058 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="a313fb19-8615-43b7-a19a-df83e50410ba" containerName="registry-server" Feb 20 09:58:25 crc kubenswrapper[4962]: I0220 09:58:25.960101 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 20 09:58:25 crc kubenswrapper[4962]: I0220 09:58:25.963316 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 20 09:58:25 crc kubenswrapper[4962]: I0220 09:58:25.964071 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 20 09:58:25 crc kubenswrapper[4962]: I0220 09:58:25.970027 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 20 09:58:25 crc kubenswrapper[4962]: I0220 09:58:25.996208 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2gxn5"] Feb 20 09:58:25 crc kubenswrapper[4962]: I0220 09:58:25.999214 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2gxn5"] Feb 20 09:58:26 crc kubenswrapper[4962]: I0220 09:58:26.092174 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/55ba5b4b-9a58-40e7-a3a3-00764477f5a9-var-lock\") pod \"installer-9-crc\" (UID: \"55ba5b4b-9a58-40e7-a3a3-00764477f5a9\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 20 09:58:26 crc kubenswrapper[4962]: I0220 09:58:26.092261 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/55ba5b4b-9a58-40e7-a3a3-00764477f5a9-kube-api-access\") pod \"installer-9-crc\" (UID: \"55ba5b4b-9a58-40e7-a3a3-00764477f5a9\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 20 09:58:26 crc kubenswrapper[4962]: I0220 09:58:26.092283 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/55ba5b4b-9a58-40e7-a3a3-00764477f5a9-kubelet-dir\") pod \"installer-9-crc\" (UID: \"55ba5b4b-9a58-40e7-a3a3-00764477f5a9\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 20 09:58:26 crc kubenswrapper[4962]: I0220 09:58:26.193066 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/55ba5b4b-9a58-40e7-a3a3-00764477f5a9-kube-api-access\") pod \"installer-9-crc\" (UID: \"55ba5b4b-9a58-40e7-a3a3-00764477f5a9\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 20 09:58:26 crc kubenswrapper[4962]: I0220 09:58:26.193119 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/55ba5b4b-9a58-40e7-a3a3-00764477f5a9-kubelet-dir\") pod \"installer-9-crc\" (UID: \"55ba5b4b-9a58-40e7-a3a3-00764477f5a9\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 20 09:58:26 crc kubenswrapper[4962]: I0220 09:58:26.193192 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/55ba5b4b-9a58-40e7-a3a3-00764477f5a9-var-lock\") pod \"installer-9-crc\" (UID: \"55ba5b4b-9a58-40e7-a3a3-00764477f5a9\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 20 09:58:26 crc kubenswrapper[4962]: I0220 09:58:26.193263 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/55ba5b4b-9a58-40e7-a3a3-00764477f5a9-var-lock\") pod \"installer-9-crc\" (UID: \"55ba5b4b-9a58-40e7-a3a3-00764477f5a9\") " 
pod="openshift-kube-apiserver/installer-9-crc" Feb 20 09:58:26 crc kubenswrapper[4962]: I0220 09:58:26.193353 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/55ba5b4b-9a58-40e7-a3a3-00764477f5a9-kubelet-dir\") pod \"installer-9-crc\" (UID: \"55ba5b4b-9a58-40e7-a3a3-00764477f5a9\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 20 09:58:26 crc kubenswrapper[4962]: I0220 09:58:26.209210 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/55ba5b4b-9a58-40e7-a3a3-00764477f5a9-kube-api-access\") pod \"installer-9-crc\" (UID: \"55ba5b4b-9a58-40e7-a3a3-00764477f5a9\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 20 09:58:26 crc kubenswrapper[4962]: I0220 09:58:26.292825 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 20 09:58:27 crc kubenswrapper[4962]: I0220 09:58:27.145468 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a313fb19-8615-43b7-a19a-df83e50410ba" path="/var/lib/kubelet/pods/a313fb19-8615-43b7-a19a-df83e50410ba/volumes" Feb 20 09:58:27 crc kubenswrapper[4962]: I0220 09:58:27.543628 4962 scope.go:117] "RemoveContainer" containerID="0524942f9aa37d9ad7a2475893e19f232848fd053a188018fd7e95cce3d53a2e" Feb 20 09:58:27 crc kubenswrapper[4962]: I0220 09:58:27.631383 4962 scope.go:117] "RemoveContainer" containerID="67d79612188b2d4ac629f49494e406f3c95c85018ab3966df8e32115cfac1740" Feb 20 09:58:27 crc kubenswrapper[4962]: I0220 09:58:27.658315 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-grl4h" Feb 20 09:58:27 crc kubenswrapper[4962]: I0220 09:58:27.762075 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 20 09:58:27 crc kubenswrapper[4962]: I0220 09:58:27.813101 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4065ac08-9c62-48db-bbfe-9e53ab7d5463-catalog-content\") pod \"4065ac08-9c62-48db-bbfe-9e53ab7d5463\" (UID: \"4065ac08-9c62-48db-bbfe-9e53ab7d5463\") " Feb 20 09:58:27 crc kubenswrapper[4962]: I0220 09:58:27.813206 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w94dl\" (UniqueName: \"kubernetes.io/projected/4065ac08-9c62-48db-bbfe-9e53ab7d5463-kube-api-access-w94dl\") pod \"4065ac08-9c62-48db-bbfe-9e53ab7d5463\" (UID: \"4065ac08-9c62-48db-bbfe-9e53ab7d5463\") " Feb 20 09:58:27 crc kubenswrapper[4962]: I0220 09:58:27.813249 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4065ac08-9c62-48db-bbfe-9e53ab7d5463-utilities\") pod \"4065ac08-9c62-48db-bbfe-9e53ab7d5463\" (UID: \"4065ac08-9c62-48db-bbfe-9e53ab7d5463\") " Feb 20 09:58:27 crc kubenswrapper[4962]: I0220 09:58:27.815276 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4065ac08-9c62-48db-bbfe-9e53ab7d5463-utilities" (OuterVolumeSpecName: "utilities") pod "4065ac08-9c62-48db-bbfe-9e53ab7d5463" (UID: "4065ac08-9c62-48db-bbfe-9e53ab7d5463"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:58:27 crc kubenswrapper[4962]: I0220 09:58:27.818715 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4065ac08-9c62-48db-bbfe-9e53ab7d5463-kube-api-access-w94dl" (OuterVolumeSpecName: "kube-api-access-w94dl") pod "4065ac08-9c62-48db-bbfe-9e53ab7d5463" (UID: "4065ac08-9c62-48db-bbfe-9e53ab7d5463"). InnerVolumeSpecName "kube-api-access-w94dl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:58:27 crc kubenswrapper[4962]: I0220 09:58:27.914332 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w94dl\" (UniqueName: \"kubernetes.io/projected/4065ac08-9c62-48db-bbfe-9e53ab7d5463-kube-api-access-w94dl\") on node \"crc\" DevicePath \"\"" Feb 20 09:58:27 crc kubenswrapper[4962]: I0220 09:58:27.914357 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4065ac08-9c62-48db-bbfe-9e53ab7d5463-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 09:58:27 crc kubenswrapper[4962]: I0220 09:58:27.935788 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4065ac08-9c62-48db-bbfe-9e53ab7d5463-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4065ac08-9c62-48db-bbfe-9e53ab7d5463" (UID: "4065ac08-9c62-48db-bbfe-9e53ab7d5463"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:58:27 crc kubenswrapper[4962]: I0220 09:58:27.952613 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-grl4h" Feb 20 09:58:27 crc kubenswrapper[4962]: I0220 09:58:27.952580 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-grl4h" event={"ID":"4065ac08-9c62-48db-bbfe-9e53ab7d5463","Type":"ContainerDied","Data":"50d62aa547d31448aec488d9bebb813e5df408a7d3796523fe1332f39c82e601"} Feb 20 09:58:27 crc kubenswrapper[4962]: I0220 09:58:27.952662 4962 scope.go:117] "RemoveContainer" containerID="e4ecdf4a3c339870dec24d15d23037eaffd696da078f222cbf4a6c2e3f4d4b42" Feb 20 09:58:27 crc kubenswrapper[4962]: I0220 09:58:27.964705 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"55ba5b4b-9a58-40e7-a3a3-00764477f5a9","Type":"ContainerStarted","Data":"9b9dbdcac9ea0d9a44c9e69e43cace295320630fa87bf53a68f617de558a65af"} Feb 20 09:58:27 crc kubenswrapper[4962]: I0220 09:58:27.966676 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6q5bk" event={"ID":"e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b","Type":"ContainerStarted","Data":"61a0c395420db4427edd1d39e79932f951ebf822d41da5a31f0ddfcbb34a4c3d"} Feb 20 09:58:27 crc kubenswrapper[4962]: I0220 09:58:27.975710 4962 scope.go:117] "RemoveContainer" containerID="077fd78654c04cefa692b832e5d39f0ad8125a113696468af2f8fd05f987f05c" Feb 20 09:58:28 crc kubenswrapper[4962]: I0220 09:58:28.000435 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-grl4h"] Feb 20 09:58:28 crc kubenswrapper[4962]: I0220 09:58:28.003239 4962 scope.go:117] "RemoveContainer" containerID="8158f54e7f263dc473b3aa7c7a0204ebc0dcdf73eaeb229d1a5efae6eb63d973" Feb 20 09:58:28 crc kubenswrapper[4962]: I0220 09:58:28.006505 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-grl4h"] Feb 20 09:58:28 
crc kubenswrapper[4962]: I0220 09:58:28.015420 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4065ac08-9c62-48db-bbfe-9e53ab7d5463-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 09:58:28 crc kubenswrapper[4962]: I0220 09:58:28.621984 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-sxxjg" Feb 20 09:58:28 crc kubenswrapper[4962]: I0220 09:58:28.971957 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"55ba5b4b-9a58-40e7-a3a3-00764477f5a9","Type":"ContainerStarted","Data":"d915b56a671459092c2c1eb6c3a687d96ecc073838917251978e78628f894691"} Feb 20 09:58:28 crc kubenswrapper[4962]: I0220 09:58:28.974314 4962 generic.go:334] "Generic (PLEG): container finished" podID="805f4075-7fda-4a54-882f-c4fd160148a4" containerID="a7221ac6a9721db1f9eee03cdcfb51961018466a0f227a8d12844f0ff0251958" exitCode=0 Feb 20 09:58:28 crc kubenswrapper[4962]: I0220 09:58:28.974520 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5qdmt" event={"ID":"805f4075-7fda-4a54-882f-c4fd160148a4","Type":"ContainerDied","Data":"a7221ac6a9721db1f9eee03cdcfb51961018466a0f227a8d12844f0ff0251958"} Feb 20 09:58:28 crc kubenswrapper[4962]: I0220 09:58:28.977444 4962 generic.go:334] "Generic (PLEG): container finished" podID="e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b" containerID="61a0c395420db4427edd1d39e79932f951ebf822d41da5a31f0ddfcbb34a4c3d" exitCode=0 Feb 20 09:58:28 crc kubenswrapper[4962]: I0220 09:58:28.977549 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6q5bk" event={"ID":"e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b","Type":"ContainerDied","Data":"61a0c395420db4427edd1d39e79932f951ebf822d41da5a31f0ddfcbb34a4c3d"} Feb 20 09:58:28 crc kubenswrapper[4962]: I0220 09:58:28.982746 4962 generic.go:334] "Generic (PLEG): container finished" podID="1c487f78-6735-4114-a45a-6c60ccef5983" containerID="24dd6185fd7363afbd3de791c6a0af40a901495e20236ce0f59f202cf94eff79" exitCode=0 Feb 20 09:58:28 crc kubenswrapper[4962]: I0220 09:58:28.982807 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d4pzn" event={"ID":"1c487f78-6735-4114-a45a-6c60ccef5983","Type":"ContainerDied","Data":"24dd6185fd7363afbd3de791c6a0af40a901495e20236ce0f59f202cf94eff79"} Feb 20 09:58:28 crc kubenswrapper[4962]: I0220 09:58:28.988052 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=3.988040216 podStartE2EDuration="3.988040216s" podCreationTimestamp="2026-02-20 09:58:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:58:28.985926046 +0000 UTC m=+200.568397912" watchObservedRunningTime="2026-02-20 09:58:28.988040216 +0000 UTC m=+200.570512062" Feb 20 09:58:29 crc kubenswrapper[4962]: I0220 09:58:29.146912 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4065ac08-9c62-48db-bbfe-9e53ab7d5463" path="/var/lib/kubelet/pods/4065ac08-9c62-48db-bbfe-9e53ab7d5463/volumes" Feb 20 09:58:29 crc kubenswrapper[4962]: I0220 09:58:29.992872 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d4pzn" 
event={"ID":"1c487f78-6735-4114-a45a-6c60ccef5983","Type":"ContainerStarted","Data":"86f7e48902be77cf09c5783cd3add94d919d049b4c6b81377f10cd15720d4386"} Feb 20 09:58:29 crc kubenswrapper[4962]: I0220 09:58:29.995709 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5qdmt" event={"ID":"805f4075-7fda-4a54-882f-c4fd160148a4","Type":"ContainerStarted","Data":"213ffeb86a610ad404efe1175c019ccea2a36d18876d078c5b355b656daea0c0"} Feb 20 09:58:29 crc kubenswrapper[4962]: I0220 09:58:29.999233 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6q5bk" event={"ID":"e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b","Type":"ContainerStarted","Data":"ba06f09e9e64dd347907637b7f6269c868262f4e7992b5119a5d49ae51e79a9b"} Feb 20 09:58:30 crc kubenswrapper[4962]: I0220 09:58:30.011868 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-d4pzn" podStartSLOduration=2.9540328110000003 podStartE2EDuration="52.011846542s" podCreationTimestamp="2026-02-20 09:57:38 +0000 UTC" firstStartedPulling="2026-02-20 09:57:40.447975995 +0000 UTC m=+152.030447861" lastFinishedPulling="2026-02-20 09:58:29.505789746 +0000 UTC m=+201.088261592" observedRunningTime="2026-02-20 09:58:30.008187263 +0000 UTC m=+201.590659099" watchObservedRunningTime="2026-02-20 09:58:30.011846542 +0000 UTC m=+201.594318388" Feb 20 09:58:30 crc kubenswrapper[4962]: I0220 09:58:30.024683 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6q5bk" podStartSLOduration=2.885081613 podStartE2EDuration="52.024660634s" podCreationTimestamp="2026-02-20 09:57:38 +0000 UTC" firstStartedPulling="2026-02-20 09:57:40.448759699 +0000 UTC m=+152.031231545" lastFinishedPulling="2026-02-20 09:58:29.58833872 +0000 UTC m=+201.170810566" observedRunningTime="2026-02-20 09:58:30.021759989 +0000 UTC m=+201.604231855" watchObservedRunningTime="2026-02-20 09:58:30.024660634 +0000 UTC m=+201.607132480" Feb 20 09:58:30 crc kubenswrapper[4962]: I0220 09:58:30.041146 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5qdmt" podStartSLOduration=3.05185998 podStartE2EDuration="52.041127895s" podCreationTimestamp="2026-02-20 09:57:38 +0000 UTC" firstStartedPulling="2026-02-20 09:57:40.445141535 +0000 UTC m=+152.027613381" lastFinishedPulling="2026-02-20 09:58:29.43440945 +0000 UTC m=+201.016881296" observedRunningTime="2026-02-20 09:58:30.039575935 +0000 UTC m=+201.622047781" watchObservedRunningTime="2026-02-20 09:58:30.041127895 +0000 UTC m=+201.623599741" Feb 20 09:58:31 crc kubenswrapper[4962]: I0220 09:58:31.916534 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zwkjb" Feb 20 09:58:31 crc kubenswrapper[4962]: I0220 09:58:31.961004 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zwkjb" Feb 20 09:58:38 crc kubenswrapper[4962]: I0220 09:58:38.781149 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6q5bk" Feb 20 09:58:38 crc kubenswrapper[4962]: I0220 09:58:38.781820 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6q5bk" Feb 20 09:58:38 crc kubenswrapper[4962]: I0220 09:58:38.828304 4962 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6q5bk" Feb 20 09:58:39 crc kubenswrapper[4962]: I0220 09:58:39.024863 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5qdmt" Feb 20 09:58:39 crc kubenswrapper[4962]: I0220 09:58:39.024954 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5qdmt" Feb 20 09:58:39 crc kubenswrapper[4962]: I0220 09:58:39.082582 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5qdmt" Feb 20 09:58:39 crc kubenswrapper[4962]: I0220 09:58:39.116255 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6q5bk" Feb 20 09:58:39 crc kubenswrapper[4962]: I0220 09:58:39.152372 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5qdmt" Feb 20 09:58:39 crc kubenswrapper[4962]: I0220 09:58:39.314098 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-d4pzn" Feb 20 09:58:39 crc kubenswrapper[4962]: I0220 09:58:39.314161 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-d4pzn" Feb 20 09:58:39 crc kubenswrapper[4962]: I0220 09:58:39.400603 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-d4pzn" Feb 20 09:58:40 crc kubenswrapper[4962]: I0220 09:58:40.113898 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-d4pzn" Feb 20 09:58:40 crc kubenswrapper[4962]: I0220 09:58:40.713968 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-d4pzn"] Feb 20 09:58:41 crc kubenswrapper[4962]: I0220 09:58:41.312215 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5qdmt"] Feb 20 09:58:41 crc kubenswrapper[4962]: I0220 09:58:41.312403 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5qdmt" podUID="805f4075-7fda-4a54-882f-c4fd160148a4" containerName="registry-server" containerID="cri-o://213ffeb86a610ad404efe1175c019ccea2a36d18876d078c5b355b656daea0c0" gracePeriod=2 Feb 20 09:58:41 crc kubenswrapper[4962]: I0220 09:58:41.507935 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 09:58:41 crc kubenswrapper[4962]: I0220 09:58:41.508005 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 09:58:41 crc kubenswrapper[4962]: I0220 09:58:41.508060 4962 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" Feb 20 09:58:41 crc kubenswrapper[4962]: I0220 09:58:41.509799 4962 
kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"dffe20e45069ebd85e9ad49e365ac5180c1e43d02340190615a38900f527e432"} pod="openshift-machine-config-operator/machine-config-daemon-m9d46" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 09:58:41 crc kubenswrapper[4962]: I0220 09:58:41.510070 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" containerID="cri-o://dffe20e45069ebd85e9ad49e365ac5180c1e43d02340190615a38900f527e432" gracePeriod=600 Feb 20 09:58:41 crc kubenswrapper[4962]: I0220 09:58:41.713335 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5qdmt" Feb 20 09:58:41 crc kubenswrapper[4962]: I0220 09:58:41.820004 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/805f4075-7fda-4a54-882f-c4fd160148a4-catalog-content\") pod \"805f4075-7fda-4a54-882f-c4fd160148a4\" (UID: \"805f4075-7fda-4a54-882f-c4fd160148a4\") " Feb 20 09:58:41 crc kubenswrapper[4962]: I0220 09:58:41.820344 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dszmr\" (UniqueName: \"kubernetes.io/projected/805f4075-7fda-4a54-882f-c4fd160148a4-kube-api-access-dszmr\") pod \"805f4075-7fda-4a54-882f-c4fd160148a4\" (UID: \"805f4075-7fda-4a54-882f-c4fd160148a4\") " Feb 20 09:58:41 crc kubenswrapper[4962]: I0220 09:58:41.820437 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/805f4075-7fda-4a54-882f-c4fd160148a4-utilities\") pod \"805f4075-7fda-4a54-882f-c4fd160148a4\" (UID: \"805f4075-7fda-4a54-882f-c4fd160148a4\") " Feb 20 09:58:41 crc kubenswrapper[4962]: I0220 09:58:41.821715 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/805f4075-7fda-4a54-882f-c4fd160148a4-utilities" (OuterVolumeSpecName: "utilities") pod "805f4075-7fda-4a54-882f-c4fd160148a4" (UID: "805f4075-7fda-4a54-882f-c4fd160148a4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:58:41 crc kubenswrapper[4962]: I0220 09:58:41.827531 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/805f4075-7fda-4a54-882f-c4fd160148a4-kube-api-access-dszmr" (OuterVolumeSpecName: "kube-api-access-dszmr") pod "805f4075-7fda-4a54-882f-c4fd160148a4" (UID: "805f4075-7fda-4a54-882f-c4fd160148a4"). InnerVolumeSpecName "kube-api-access-dszmr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:58:41 crc kubenswrapper[4962]: I0220 09:58:41.875901 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/805f4075-7fda-4a54-882f-c4fd160148a4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "805f4075-7fda-4a54-882f-c4fd160148a4" (UID: "805f4075-7fda-4a54-882f-c4fd160148a4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:58:41 crc kubenswrapper[4962]: I0220 09:58:41.921959 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/805f4075-7fda-4a54-882f-c4fd160148a4-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 09:58:41 crc kubenswrapper[4962]: I0220 09:58:41.921992 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dszmr\" (UniqueName: \"kubernetes.io/projected/805f4075-7fda-4a54-882f-c4fd160148a4-kube-api-access-dszmr\") on node \"crc\" DevicePath \"\"" Feb 20 09:58:41 crc kubenswrapper[4962]: I0220 09:58:41.922005 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/805f4075-7fda-4a54-882f-c4fd160148a4-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 09:58:42 crc kubenswrapper[4962]: I0220 09:58:42.082895 4962 generic.go:334] "Generic (PLEG): container finished" podID="805f4075-7fda-4a54-882f-c4fd160148a4" containerID="213ffeb86a610ad404efe1175c019ccea2a36d18876d078c5b355b656daea0c0" exitCode=0 Feb 20 09:58:42 crc kubenswrapper[4962]: I0220 09:58:42.082944 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5qdmt" event={"ID":"805f4075-7fda-4a54-882f-c4fd160148a4","Type":"ContainerDied","Data":"213ffeb86a610ad404efe1175c019ccea2a36d18876d078c5b355b656daea0c0"} Feb 20 09:58:42 crc kubenswrapper[4962]: I0220 09:58:42.083366 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5qdmt" event={"ID":"805f4075-7fda-4a54-882f-c4fd160148a4","Type":"ContainerDied","Data":"75a64fa0799e34c46374b381b4c7c1a53295cecd1a95e7229c96eb57af23d670"} Feb 20 09:58:42 crc kubenswrapper[4962]: I0220 09:58:42.083390 4962 scope.go:117] "RemoveContainer" containerID="213ffeb86a610ad404efe1175c019ccea2a36d18876d078c5b355b656daea0c0" Feb 20 09:58:42 crc kubenswrapper[4962]: I0220 09:58:42.082991 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5qdmt" Feb 20 09:58:42 crc kubenswrapper[4962]: I0220 09:58:42.085615 4962 generic.go:334] "Generic (PLEG): container finished" podID="751d5e0b-919c-4777-8475-ed7214f7647f" containerID="dffe20e45069ebd85e9ad49e365ac5180c1e43d02340190615a38900f527e432" exitCode=0 Feb 20 09:58:42 crc kubenswrapper[4962]: I0220 09:58:42.085849 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-d4pzn" podUID="1c487f78-6735-4114-a45a-6c60ccef5983" containerName="registry-server" containerID="cri-o://86f7e48902be77cf09c5783cd3add94d919d049b4c6b81377f10cd15720d4386" gracePeriod=2 Feb 20 09:58:42 crc kubenswrapper[4962]: I0220 09:58:42.086115 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" event={"ID":"751d5e0b-919c-4777-8475-ed7214f7647f","Type":"ContainerDied","Data":"dffe20e45069ebd85e9ad49e365ac5180c1e43d02340190615a38900f527e432"} Feb 20 09:58:42 crc kubenswrapper[4962]: I0220 09:58:42.086150 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" event={"ID":"751d5e0b-919c-4777-8475-ed7214f7647f","Type":"ContainerStarted","Data":"28f57d6af11459adc6bb1afb41198ef7b8d5795fd383a2c166570a156f5d42fa"} Feb 20 09:58:42 crc kubenswrapper[4962]: I0220 09:58:42.100556 4962 scope.go:117] "RemoveContainer" containerID="a7221ac6a9721db1f9eee03cdcfb51961018466a0f227a8d12844f0ff0251958" Feb 20 09:58:42 crc kubenswrapper[4962]: I0220 09:58:42.124112 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5qdmt"] Feb 20 09:58:42 crc kubenswrapper[4962]: I0220 09:58:42.128347 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5qdmt"] Feb 20 09:58:42 crc kubenswrapper[4962]: I0220 09:58:42.144523 4962 scope.go:117] "RemoveContainer" containerID="d4d98f774860fdf5382198ce0438bc9131dc921e84f72e5b2cde6967daa0a4d4" Feb 20 09:58:42 crc kubenswrapper[4962]: I0220 09:58:42.201332 4962 scope.go:117] "RemoveContainer" containerID="213ffeb86a610ad404efe1175c019ccea2a36d18876d078c5b355b656daea0c0" Feb 20 09:58:42 crc kubenswrapper[4962]: E0220 09:58:42.201821 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"213ffeb86a610ad404efe1175c019ccea2a36d18876d078c5b355b656daea0c0\": container with ID starting with 213ffeb86a610ad404efe1175c019ccea2a36d18876d078c5b355b656daea0c0 not found: ID does not exist" containerID="213ffeb86a610ad404efe1175c019ccea2a36d18876d078c5b355b656daea0c0" Feb 20 09:58:42 crc kubenswrapper[4962]: I0220 09:58:42.201849 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"213ffeb86a610ad404efe1175c019ccea2a36d18876d078c5b355b656daea0c0"} err="failed to get container status \"213ffeb86a610ad404efe1175c019ccea2a36d18876d078c5b355b656daea0c0\": rpc error: code = NotFound desc = could not find container \"213ffeb86a610ad404efe1175c019ccea2a36d18876d078c5b355b656daea0c0\": container with ID starting with 213ffeb86a610ad404efe1175c019ccea2a36d18876d078c5b355b656daea0c0 not found: ID does not exist" Feb 20 09:58:42 crc kubenswrapper[4962]: I0220 09:58:42.201870 4962 scope.go:117] "RemoveContainer" containerID="a7221ac6a9721db1f9eee03cdcfb51961018466a0f227a8d12844f0ff0251958" Feb 20 09:58:42 crc kubenswrapper[4962]: E0220 
09:58:42.202216 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7221ac6a9721db1f9eee03cdcfb51961018466a0f227a8d12844f0ff0251958\": container with ID starting with a7221ac6a9721db1f9eee03cdcfb51961018466a0f227a8d12844f0ff0251958 not found: ID does not exist" containerID="a7221ac6a9721db1f9eee03cdcfb51961018466a0f227a8d12844f0ff0251958" Feb 20 09:58:42 crc kubenswrapper[4962]: I0220 09:58:42.202263 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7221ac6a9721db1f9eee03cdcfb51961018466a0f227a8d12844f0ff0251958"} err="failed to get container status \"a7221ac6a9721db1f9eee03cdcfb51961018466a0f227a8d12844f0ff0251958\": rpc error: code = NotFound desc = could not find container \"a7221ac6a9721db1f9eee03cdcfb51961018466a0f227a8d12844f0ff0251958\": container with ID starting with a7221ac6a9721db1f9eee03cdcfb51961018466a0f227a8d12844f0ff0251958 not found: ID does not exist" Feb 20 09:58:42 crc kubenswrapper[4962]: I0220 09:58:42.202289 4962 scope.go:117] "RemoveContainer" containerID="d4d98f774860fdf5382198ce0438bc9131dc921e84f72e5b2cde6967daa0a4d4" Feb 20 09:58:42 crc kubenswrapper[4962]: E0220 09:58:42.203025 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4d98f774860fdf5382198ce0438bc9131dc921e84f72e5b2cde6967daa0a4d4\": container with ID starting with d4d98f774860fdf5382198ce0438bc9131dc921e84f72e5b2cde6967daa0a4d4 not found: ID does not exist" containerID="d4d98f774860fdf5382198ce0438bc9131dc921e84f72e5b2cde6967daa0a4d4" Feb 20 09:58:42 crc kubenswrapper[4962]: I0220 09:58:42.203051 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4d98f774860fdf5382198ce0438bc9131dc921e84f72e5b2cde6967daa0a4d4"} err="failed to get container status \"d4d98f774860fdf5382198ce0438bc9131dc921e84f72e5b2cde6967daa0a4d4\": rpc error: code = NotFound desc = could not find container \"d4d98f774860fdf5382198ce0438bc9131dc921e84f72e5b2cde6967daa0a4d4\": container with ID starting with d4d98f774860fdf5382198ce0438bc9131dc921e84f72e5b2cde6967daa0a4d4 not found: ID does not exist" Feb 20 09:58:42 crc kubenswrapper[4962]: I0220 09:58:42.357922 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-d4pzn" Feb 20 09:58:42 crc kubenswrapper[4962]: I0220 09:58:42.528820 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c487f78-6735-4114-a45a-6c60ccef5983-catalog-content\") pod \"1c487f78-6735-4114-a45a-6c60ccef5983\" (UID: \"1c487f78-6735-4114-a45a-6c60ccef5983\") " Feb 20 09:58:42 crc kubenswrapper[4962]: I0220 09:58:42.528907 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c487f78-6735-4114-a45a-6c60ccef5983-utilities\") pod \"1c487f78-6735-4114-a45a-6c60ccef5983\" (UID: \"1c487f78-6735-4114-a45a-6c60ccef5983\") " Feb 20 09:58:42 crc kubenswrapper[4962]: I0220 09:58:42.528940 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-58dmf\" (UniqueName: \"kubernetes.io/projected/1c487f78-6735-4114-a45a-6c60ccef5983-kube-api-access-58dmf\") pod \"1c487f78-6735-4114-a45a-6c60ccef5983\" (UID: \"1c487f78-6735-4114-a45a-6c60ccef5983\") " Feb 20 09:58:42 crc kubenswrapper[4962]: I0220 09:58:42.529692 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c487f78-6735-4114-a45a-6c60ccef5983-utilities" (OuterVolumeSpecName: "utilities") pod "1c487f78-6735-4114-a45a-6c60ccef5983" (UID: "1c487f78-6735-4114-a45a-6c60ccef5983"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:58:42 crc kubenswrapper[4962]: I0220 09:58:42.532208 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c487f78-6735-4114-a45a-6c60ccef5983-kube-api-access-58dmf" (OuterVolumeSpecName: "kube-api-access-58dmf") pod "1c487f78-6735-4114-a45a-6c60ccef5983" (UID: "1c487f78-6735-4114-a45a-6c60ccef5983"). InnerVolumeSpecName "kube-api-access-58dmf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:58:42 crc kubenswrapper[4962]: I0220 09:58:42.577126 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c487f78-6735-4114-a45a-6c60ccef5983-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1c487f78-6735-4114-a45a-6c60ccef5983" (UID: "1c487f78-6735-4114-a45a-6c60ccef5983"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:58:42 crc kubenswrapper[4962]: I0220 09:58:42.629690 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c487f78-6735-4114-a45a-6c60ccef5983-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 09:58:42 crc kubenswrapper[4962]: I0220 09:58:42.629728 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-58dmf\" (UniqueName: \"kubernetes.io/projected/1c487f78-6735-4114-a45a-6c60ccef5983-kube-api-access-58dmf\") on node \"crc\" DevicePath \"\"" Feb 20 09:58:42 crc kubenswrapper[4962]: I0220 09:58:42.629739 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c487f78-6735-4114-a45a-6c60ccef5983-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 09:58:43 crc kubenswrapper[4962]: I0220 09:58:43.093578 4962 generic.go:334] "Generic (PLEG): container finished" podID="1c487f78-6735-4114-a45a-6c60ccef5983" containerID="86f7e48902be77cf09c5783cd3add94d919d049b4c6b81377f10cd15720d4386" exitCode=0 Feb 20 09:58:43 crc kubenswrapper[4962]: I0220 09:58:43.093631 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d4pzn" event={"ID":"1c487f78-6735-4114-a45a-6c60ccef5983","Type":"ContainerDied","Data":"86f7e48902be77cf09c5783cd3add94d919d049b4c6b81377f10cd15720d4386"} Feb 20 09:58:43 crc kubenswrapper[4962]: I0220 09:58:43.093684 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d4pzn" event={"ID":"1c487f78-6735-4114-a45a-6c60ccef5983","Type":"ContainerDied","Data":"7fcd5743db242f51c1ff9cb31c18720e064f45718743b615c36cf8bb2d39f79d"} Feb 20 09:58:43 crc kubenswrapper[4962]: I0220 09:58:43.093703 4962 scope.go:117] "RemoveContainer" containerID="86f7e48902be77cf09c5783cd3add94d919d049b4c6b81377f10cd15720d4386" Feb 20 09:58:43 crc kubenswrapper[4962]: I0220 09:58:43.093699 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-d4pzn" Feb 20 09:58:43 crc kubenswrapper[4962]: I0220 09:58:43.109089 4962 scope.go:117] "RemoveContainer" containerID="24dd6185fd7363afbd3de791c6a0af40a901495e20236ce0f59f202cf94eff79" Feb 20 09:58:43 crc kubenswrapper[4962]: I0220 09:58:43.119305 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-d4pzn"] Feb 20 09:58:43 crc kubenswrapper[4962]: I0220 09:58:43.125910 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-d4pzn"] Feb 20 09:58:43 crc kubenswrapper[4962]: I0220 09:58:43.146474 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c487f78-6735-4114-a45a-6c60ccef5983" path="/var/lib/kubelet/pods/1c487f78-6735-4114-a45a-6c60ccef5983/volumes" Feb 20 09:58:43 crc kubenswrapper[4962]: I0220 09:58:43.147141 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="805f4075-7fda-4a54-882f-c4fd160148a4" path="/var/lib/kubelet/pods/805f4075-7fda-4a54-882f-c4fd160148a4/volumes" Feb 20 09:58:43 crc kubenswrapper[4962]: I0220 09:58:43.149724 4962 scope.go:117] "RemoveContainer" containerID="43fe772c14da6a92b151d20212f73effab905ae6a03f2da84a27e6c3cf3f6b25" Feb 20 09:58:43 crc kubenswrapper[4962]: I0220 09:58:43.166894 4962 scope.go:117] "RemoveContainer" containerID="86f7e48902be77cf09c5783cd3add94d919d049b4c6b81377f10cd15720d4386" Feb 20 09:58:43 crc kubenswrapper[4962]: E0220 09:58:43.170148 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86f7e48902be77cf09c5783cd3add94d919d049b4c6b81377f10cd15720d4386\": container with ID starting with 86f7e48902be77cf09c5783cd3add94d919d049b4c6b81377f10cd15720d4386 not found: ID does not exist" containerID="86f7e48902be77cf09c5783cd3add94d919d049b4c6b81377f10cd15720d4386" Feb 20 09:58:43 crc kubenswrapper[4962]: I0220 09:58:43.170223 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86f7e48902be77cf09c5783cd3add94d919d049b4c6b81377f10cd15720d4386"} err="failed to get container status \"86f7e48902be77cf09c5783cd3add94d919d049b4c6b81377f10cd15720d4386\": rpc error: code = NotFound desc = could not find container \"86f7e48902be77cf09c5783cd3add94d919d049b4c6b81377f10cd15720d4386\": container with ID starting with 86f7e48902be77cf09c5783cd3add94d919d049b4c6b81377f10cd15720d4386 not found: ID does not exist" Feb 20 09:58:43 crc kubenswrapper[4962]: I0220 09:58:43.170260 4962 scope.go:117] "RemoveContainer" containerID="24dd6185fd7363afbd3de791c6a0af40a901495e20236ce0f59f202cf94eff79" Feb 20 09:58:43 crc kubenswrapper[4962]: E0220 09:58:43.170698 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24dd6185fd7363afbd3de791c6a0af40a901495e20236ce0f59f202cf94eff79\": container with ID starting with 24dd6185fd7363afbd3de791c6a0af40a901495e20236ce0f59f202cf94eff79 not found: ID does not exist" containerID="24dd6185fd7363afbd3de791c6a0af40a901495e20236ce0f59f202cf94eff79" Feb 20 09:58:43 crc kubenswrapper[4962]: I0220 09:58:43.170742 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24dd6185fd7363afbd3de791c6a0af40a901495e20236ce0f59f202cf94eff79"} err="failed to get container status \"24dd6185fd7363afbd3de791c6a0af40a901495e20236ce0f59f202cf94eff79\": rpc error: code = NotFound desc = could not find 
container \"24dd6185fd7363afbd3de791c6a0af40a901495e20236ce0f59f202cf94eff79\": container with ID starting with 24dd6185fd7363afbd3de791c6a0af40a901495e20236ce0f59f202cf94eff79 not found: ID does not exist" Feb 20 09:58:43 crc kubenswrapper[4962]: I0220 09:58:43.170772 4962 scope.go:117] "RemoveContainer" containerID="43fe772c14da6a92b151d20212f73effab905ae6a03f2da84a27e6c3cf3f6b25" Feb 20 09:58:43 crc kubenswrapper[4962]: E0220 09:58:43.171666 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43fe772c14da6a92b151d20212f73effab905ae6a03f2da84a27e6c3cf3f6b25\": container with ID starting with 43fe772c14da6a92b151d20212f73effab905ae6a03f2da84a27e6c3cf3f6b25 not found: ID does not exist" containerID="43fe772c14da6a92b151d20212f73effab905ae6a03f2da84a27e6c3cf3f6b25" Feb 20 09:58:43 crc kubenswrapper[4962]: I0220 09:58:43.171699 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43fe772c14da6a92b151d20212f73effab905ae6a03f2da84a27e6c3cf3f6b25"} err="failed to get container status \"43fe772c14da6a92b151d20212f73effab905ae6a03f2da84a27e6c3cf3f6b25\": rpc error: code = NotFound desc = could not find container \"43fe772c14da6a92b151d20212f73effab905ae6a03f2da84a27e6c3cf3f6b25\": container with ID starting with 43fe772c14da6a92b151d20212f73effab905ae6a03f2da84a27e6c3cf3f6b25 not found: ID does not exist" Feb 20 09:58:43 crc kubenswrapper[4962]: I0220 09:58:43.851267 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" podUID="f8161f87-3814-4d02-84ff-b94b8b05c59e" containerName="oauth-openshift" containerID="cri-o://7cda19df238dbbc1faa67326833d89d8ee6218b1c4e8291b2668f05c3b4f21bf" gracePeriod=15 Feb 20 09:58:44 crc kubenswrapper[4962]: I0220 09:58:44.100606 4962 generic.go:334] "Generic (PLEG): container finished" podID="f8161f87-3814-4d02-84ff-b94b8b05c59e" containerID="7cda19df238dbbc1faa67326833d89d8ee6218b1c4e8291b2668f05c3b4f21bf" exitCode=0 Feb 20 09:58:44 crc kubenswrapper[4962]: I0220 09:58:44.100676 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" event={"ID":"f8161f87-3814-4d02-84ff-b94b8b05c59e","Type":"ContainerDied","Data":"7cda19df238dbbc1faa67326833d89d8ee6218b1c4e8291b2668f05c3b4f21bf"} Feb 20 09:58:44 crc kubenswrapper[4962]: I0220 09:58:44.184524 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" Feb 20 09:58:44 crc kubenswrapper[4962]: I0220 09:58:44.350453 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f8161f87-3814-4d02-84ff-b94b8b05c59e-audit-dir\") pod \"f8161f87-3814-4d02-84ff-b94b8b05c59e\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " Feb 20 09:58:44 crc kubenswrapper[4962]: I0220 09:58:44.350523 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-system-serving-cert\") pod \"f8161f87-3814-4d02-84ff-b94b8b05c59e\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " Feb 20 09:58:44 crc kubenswrapper[4962]: I0220 09:58:44.350546 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-system-router-certs\") pod \"f8161f87-3814-4d02-84ff-b94b8b05c59e\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " Feb 20 09:58:44 crc kubenswrapper[4962]: I0220 09:58:44.350567 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-system-cliconfig\") pod \"f8161f87-3814-4d02-84ff-b94b8b05c59e\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " Feb 20 09:58:44 crc kubenswrapper[4962]: I0220 09:58:44.350613 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-user-idp-0-file-data\") pod \"f8161f87-3814-4d02-84ff-b94b8b05c59e\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " Feb 20 09:58:44 crc kubenswrapper[4962]: I0220 09:58:44.350639 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-system-service-ca\") pod \"f8161f87-3814-4d02-84ff-b94b8b05c59e\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " Feb 20 09:58:44 crc kubenswrapper[4962]: I0220 09:58:44.350664 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f8161f87-3814-4d02-84ff-b94b8b05c59e-audit-policies\") pod \"f8161f87-3814-4d02-84ff-b94b8b05c59e\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " Feb 20 09:58:44 crc kubenswrapper[4962]: I0220 09:58:44.350704 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-user-template-provider-selection\") pod \"f8161f87-3814-4d02-84ff-b94b8b05c59e\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " Feb 20 09:58:44 crc kubenswrapper[4962]: I0220 09:58:44.350729 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-system-ocp-branding-template\") pod \"f8161f87-3814-4d02-84ff-b94b8b05c59e\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " Feb 20 09:58:44 crc 
kubenswrapper[4962]: I0220 09:58:44.350759 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-user-template-error\") pod \"f8161f87-3814-4d02-84ff-b94b8b05c59e\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " Feb 20 09:58:44 crc kubenswrapper[4962]: I0220 09:58:44.350781 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-system-trusted-ca-bundle\") pod \"f8161f87-3814-4d02-84ff-b94b8b05c59e\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " Feb 20 09:58:44 crc kubenswrapper[4962]: I0220 09:58:44.350799 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-user-template-login\") pod \"f8161f87-3814-4d02-84ff-b94b8b05c59e\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " Feb 20 09:58:44 crc kubenswrapper[4962]: I0220 09:58:44.350818 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2c9lr\" (UniqueName: \"kubernetes.io/projected/f8161f87-3814-4d02-84ff-b94b8b05c59e-kube-api-access-2c9lr\") pod \"f8161f87-3814-4d02-84ff-b94b8b05c59e\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " Feb 20 09:58:44 crc kubenswrapper[4962]: I0220 09:58:44.350836 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-system-session\") pod \"f8161f87-3814-4d02-84ff-b94b8b05c59e\" (UID: \"f8161f87-3814-4d02-84ff-b94b8b05c59e\") " Feb 20 09:58:44 crc kubenswrapper[4962]: I0220 09:58:44.351551 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8161f87-3814-4d02-84ff-b94b8b05c59e-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "f8161f87-3814-4d02-84ff-b94b8b05c59e" (UID: "f8161f87-3814-4d02-84ff-b94b8b05c59e"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:58:44 crc kubenswrapper[4962]: I0220 09:58:44.351835 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f8161f87-3814-4d02-84ff-b94b8b05c59e-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f8161f87-3814-4d02-84ff-b94b8b05c59e" (UID: "f8161f87-3814-4d02-84ff-b94b8b05c59e"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 09:58:44 crc kubenswrapper[4962]: I0220 09:58:44.351999 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "f8161f87-3814-4d02-84ff-b94b8b05c59e" (UID: "f8161f87-3814-4d02-84ff-b94b8b05c59e"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:58:44 crc kubenswrapper[4962]: I0220 09:58:44.352759 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "f8161f87-3814-4d02-84ff-b94b8b05c59e" (UID: "f8161f87-3814-4d02-84ff-b94b8b05c59e"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:58:44 crc kubenswrapper[4962]: I0220 09:58:44.353448 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "f8161f87-3814-4d02-84ff-b94b8b05c59e" (UID: "f8161f87-3814-4d02-84ff-b94b8b05c59e"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:58:44 crc kubenswrapper[4962]: I0220 09:58:44.356546 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "f8161f87-3814-4d02-84ff-b94b8b05c59e" (UID: "f8161f87-3814-4d02-84ff-b94b8b05c59e"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:58:44 crc kubenswrapper[4962]: I0220 09:58:44.357009 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "f8161f87-3814-4d02-84ff-b94b8b05c59e" (UID: "f8161f87-3814-4d02-84ff-b94b8b05c59e"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:58:44 crc kubenswrapper[4962]: I0220 09:58:44.357512 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8161f87-3814-4d02-84ff-b94b8b05c59e-kube-api-access-2c9lr" (OuterVolumeSpecName: "kube-api-access-2c9lr") pod "f8161f87-3814-4d02-84ff-b94b8b05c59e" (UID: "f8161f87-3814-4d02-84ff-b94b8b05c59e"). InnerVolumeSpecName "kube-api-access-2c9lr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:58:44 crc kubenswrapper[4962]: I0220 09:58:44.357532 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "f8161f87-3814-4d02-84ff-b94b8b05c59e" (UID: "f8161f87-3814-4d02-84ff-b94b8b05c59e"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:58:44 crc kubenswrapper[4962]: I0220 09:58:44.357780 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "f8161f87-3814-4d02-84ff-b94b8b05c59e" (UID: "f8161f87-3814-4d02-84ff-b94b8b05c59e"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:58:44 crc kubenswrapper[4962]: I0220 09:58:44.358102 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "f8161f87-3814-4d02-84ff-b94b8b05c59e" (UID: "f8161f87-3814-4d02-84ff-b94b8b05c59e"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:58:44 crc kubenswrapper[4962]: I0220 09:58:44.359819 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "f8161f87-3814-4d02-84ff-b94b8b05c59e" (UID: "f8161f87-3814-4d02-84ff-b94b8b05c59e"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:58:44 crc kubenswrapper[4962]: I0220 09:58:44.360846 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "f8161f87-3814-4d02-84ff-b94b8b05c59e" (UID: "f8161f87-3814-4d02-84ff-b94b8b05c59e"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:58:44 crc kubenswrapper[4962]: I0220 09:58:44.364134 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "f8161f87-3814-4d02-84ff-b94b8b05c59e" (UID: "f8161f87-3814-4d02-84ff-b94b8b05c59e"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:58:44 crc kubenswrapper[4962]: I0220 09:58:44.452237 4962 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 20 09:58:44 crc kubenswrapper[4962]: I0220 09:58:44.452530 4962 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 20 09:58:44 crc kubenswrapper[4962]: I0220 09:58:44.452607 4962 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 09:58:44 crc kubenswrapper[4962]: I0220 09:58:44.452667 4962 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 20 09:58:44 crc kubenswrapper[4962]: I0220 09:58:44.452726 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2c9lr\" (UniqueName: \"kubernetes.io/projected/f8161f87-3814-4d02-84ff-b94b8b05c59e-kube-api-access-2c9lr\") on node \"crc\" DevicePath \"\"" Feb 20 09:58:44 crc kubenswrapper[4962]: I0220 09:58:44.452784 4962 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 20 09:58:44 crc kubenswrapper[4962]: I0220 09:58:44.452849 4962 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f8161f87-3814-4d02-84ff-b94b8b05c59e-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 20 09:58:44 crc kubenswrapper[4962]: I0220 09:58:44.452906 4962 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 09:58:44 crc kubenswrapper[4962]: I0220 09:58:44.452969 4962 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 20 09:58:44 crc kubenswrapper[4962]: I0220 09:58:44.453024 4962 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 20 09:58:44 crc kubenswrapper[4962]: I0220 09:58:44.453087 4962 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 20 09:58:44 crc kubenswrapper[4962]: I0220 09:58:44.453143 4962 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 20 09:58:44 crc kubenswrapper[4962]: I0220 09:58:44.453205 4962 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f8161f87-3814-4d02-84ff-b94b8b05c59e-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 20 09:58:44 crc kubenswrapper[4962]: I0220 09:58:44.453270 4962 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f8161f87-3814-4d02-84ff-b94b8b05c59e-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 20 09:58:45 crc kubenswrapper[4962]: I0220 09:58:45.110909 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" event={"ID":"f8161f87-3814-4d02-84ff-b94b8b05c59e","Type":"ContainerDied","Data":"5c096e9566721f19e8e59886a3dcebbecb0051a2d044d1f9485cf0be8b3c8877"} Feb 20 09:58:45 crc kubenswrapper[4962]: I0220 09:58:45.110996 4962 scope.go:117] "RemoveContainer" containerID="7cda19df238dbbc1faa67326833d89d8ee6218b1c4e8291b2668f05c3b4f21bf" Feb 20 09:58:45 crc kubenswrapper[4962]: I0220 09:58:45.111145 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-mrzbm" Feb 20 09:58:45 crc kubenswrapper[4962]: I0220 09:58:45.160492 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-mrzbm"] Feb 20 09:58:45 crc kubenswrapper[4962]: I0220 09:58:45.167133 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-mrzbm"] Feb 20 09:58:47 crc kubenswrapper[4962]: I0220 09:58:47.147368 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8161f87-3814-4d02-84ff-b94b8b05c59e" path="/var/lib/kubelet/pods/f8161f87-3814-4d02-84ff-b94b8b05c59e/volumes" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.370221 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-7857967b8b-hdxkw"] Feb 20 09:58:52 crc kubenswrapper[4962]: E0220 09:58:52.370964 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c487f78-6735-4114-a45a-6c60ccef5983" containerName="extract-utilities" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.370979 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c487f78-6735-4114-a45a-6c60ccef5983" containerName="extract-utilities" Feb 20 09:58:52 crc kubenswrapper[4962]: E0220 09:58:52.370994 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8161f87-3814-4d02-84ff-b94b8b05c59e" containerName="oauth-openshift" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.371002 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8161f87-3814-4d02-84ff-b94b8b05c59e" containerName="oauth-openshift" Feb 20 09:58:52 crc kubenswrapper[4962]: E0220 09:58:52.371015 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4065ac08-9c62-48db-bbfe-9e53ab7d5463" containerName="extract-content" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.371024 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="4065ac08-9c62-48db-bbfe-9e53ab7d5463" containerName="extract-content" Feb 20 09:58:52 crc kubenswrapper[4962]: E0220 09:58:52.371039 4962 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="805f4075-7fda-4a54-882f-c4fd160148a4" containerName="extract-utilities" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.371050 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="805f4075-7fda-4a54-882f-c4fd160148a4" containerName="extract-utilities" Feb 20 09:58:52 crc kubenswrapper[4962]: E0220 09:58:52.371061 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4065ac08-9c62-48db-bbfe-9e53ab7d5463" containerName="registry-server" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.371069 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="4065ac08-9c62-48db-bbfe-9e53ab7d5463" containerName="registry-server" Feb 20 09:58:52 crc kubenswrapper[4962]: E0220 09:58:52.371083 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c487f78-6735-4114-a45a-6c60ccef5983" containerName="extract-content" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.371091 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c487f78-6735-4114-a45a-6c60ccef5983" containerName="extract-content" Feb 20 09:58:52 crc kubenswrapper[4962]: E0220 09:58:52.371105 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="805f4075-7fda-4a54-882f-c4fd160148a4" containerName="extract-content" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.371114 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="805f4075-7fda-4a54-882f-c4fd160148a4" containerName="extract-content" Feb 20 09:58:52 crc kubenswrapper[4962]: E0220 09:58:52.371125 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4065ac08-9c62-48db-bbfe-9e53ab7d5463" containerName="extract-utilities" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.371133 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="4065ac08-9c62-48db-bbfe-9e53ab7d5463" containerName="extract-utilities" Feb 20 09:58:52 crc kubenswrapper[4962]: E0220 09:58:52.371161 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="805f4075-7fda-4a54-882f-c4fd160148a4" containerName="registry-server" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.371169 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="805f4075-7fda-4a54-882f-c4fd160148a4" containerName="registry-server" Feb 20 09:58:52 crc kubenswrapper[4962]: E0220 09:58:52.371181 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c487f78-6735-4114-a45a-6c60ccef5983" containerName="registry-server" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.371190 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c487f78-6735-4114-a45a-6c60ccef5983" containerName="registry-server" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.371300 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8161f87-3814-4d02-84ff-b94b8b05c59e" containerName="oauth-openshift" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.371316 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="4065ac08-9c62-48db-bbfe-9e53ab7d5463" containerName="registry-server" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.371330 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="805f4075-7fda-4a54-882f-c4fd160148a4" containerName="registry-server" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.371340 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c487f78-6735-4114-a45a-6c60ccef5983" containerName="registry-server" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.371787 4962 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.374252 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.374360 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.374795 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.376261 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.377470 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.377999 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.378135 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.378344 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.379067 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.379121 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.379220 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.380164 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.399420 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.401310 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7857967b8b-hdxkw"] Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.404830 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.406254 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.553636 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1164f259-4f1f-498e-81a1-817747913204-v4-0-config-user-template-error\") pod \"oauth-openshift-7857967b8b-hdxkw\" (UID: \"1164f259-4f1f-498e-81a1-817747913204\") 
" pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.553703 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1164f259-4f1f-498e-81a1-817747913204-audit-policies\") pod \"oauth-openshift-7857967b8b-hdxkw\" (UID: \"1164f259-4f1f-498e-81a1-817747913204\") " pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.553743 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1164f259-4f1f-498e-81a1-817747913204-v4-0-config-system-service-ca\") pod \"oauth-openshift-7857967b8b-hdxkw\" (UID: \"1164f259-4f1f-498e-81a1-817747913204\") " pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.553767 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1164f259-4f1f-498e-81a1-817747913204-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7857967b8b-hdxkw\" (UID: \"1164f259-4f1f-498e-81a1-817747913204\") " pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.553872 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1164f259-4f1f-498e-81a1-817747913204-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7857967b8b-hdxkw\" (UID: \"1164f259-4f1f-498e-81a1-817747913204\") " pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.553896 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1164f259-4f1f-498e-81a1-817747913204-v4-0-config-system-router-certs\") pod \"oauth-openshift-7857967b8b-hdxkw\" (UID: \"1164f259-4f1f-498e-81a1-817747913204\") " pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.553926 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1164f259-4f1f-498e-81a1-817747913204-v4-0-config-user-template-login\") pod \"oauth-openshift-7857967b8b-hdxkw\" (UID: \"1164f259-4f1f-498e-81a1-817747913204\") " pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.554024 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1164f259-4f1f-498e-81a1-817747913204-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7857967b8b-hdxkw\" (UID: \"1164f259-4f1f-498e-81a1-817747913204\") " pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.554052 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/1164f259-4f1f-498e-81a1-817747913204-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7857967b8b-hdxkw\" (UID: \"1164f259-4f1f-498e-81a1-817747913204\") " pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.554090 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1164f259-4f1f-498e-81a1-817747913204-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7857967b8b-hdxkw\" (UID: \"1164f259-4f1f-498e-81a1-817747913204\") " pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.554124 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1164f259-4f1f-498e-81a1-817747913204-audit-dir\") pod \"oauth-openshift-7857967b8b-hdxkw\" (UID: \"1164f259-4f1f-498e-81a1-817747913204\") " pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.554151 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wk2s5\" (UniqueName: \"kubernetes.io/projected/1164f259-4f1f-498e-81a1-817747913204-kube-api-access-wk2s5\") pod \"oauth-openshift-7857967b8b-hdxkw\" (UID: \"1164f259-4f1f-498e-81a1-817747913204\") " pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.554175 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1164f259-4f1f-498e-81a1-817747913204-v4-0-config-system-session\") pod \"oauth-openshift-7857967b8b-hdxkw\" (UID: \"1164f259-4f1f-498e-81a1-817747913204\") " pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.554206 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1164f259-4f1f-498e-81a1-817747913204-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7857967b8b-hdxkw\" (UID: \"1164f259-4f1f-498e-81a1-817747913204\") " pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.654551 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1164f259-4f1f-498e-81a1-817747913204-v4-0-config-user-template-login\") pod \"oauth-openshift-7857967b8b-hdxkw\" (UID: \"1164f259-4f1f-498e-81a1-817747913204\") " pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.654625 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1164f259-4f1f-498e-81a1-817747913204-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7857967b8b-hdxkw\" (UID: \"1164f259-4f1f-498e-81a1-817747913204\") " pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.654655 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1164f259-4f1f-498e-81a1-817747913204-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7857967b8b-hdxkw\" (UID: \"1164f259-4f1f-498e-81a1-817747913204\") " pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.654674 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1164f259-4f1f-498e-81a1-817747913204-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7857967b8b-hdxkw\" (UID: \"1164f259-4f1f-498e-81a1-817747913204\") " pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.654698 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1164f259-4f1f-498e-81a1-817747913204-audit-dir\") pod \"oauth-openshift-7857967b8b-hdxkw\" (UID: \"1164f259-4f1f-498e-81a1-817747913204\") " pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.654713 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wk2s5\" (UniqueName: \"kubernetes.io/projected/1164f259-4f1f-498e-81a1-817747913204-kube-api-access-wk2s5\") pod \"oauth-openshift-7857967b8b-hdxkw\" (UID: \"1164f259-4f1f-498e-81a1-817747913204\") " pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.654728 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1164f259-4f1f-498e-81a1-817747913204-v4-0-config-system-session\") pod \"oauth-openshift-7857967b8b-hdxkw\" (UID: \"1164f259-4f1f-498e-81a1-817747913204\") " pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.654747 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1164f259-4f1f-498e-81a1-817747913204-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7857967b8b-hdxkw\" (UID: \"1164f259-4f1f-498e-81a1-817747913204\") " pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.654778 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1164f259-4f1f-498e-81a1-817747913204-v4-0-config-user-template-error\") pod \"oauth-openshift-7857967b8b-hdxkw\" (UID: \"1164f259-4f1f-498e-81a1-817747913204\") " pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.654797 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1164f259-4f1f-498e-81a1-817747913204-audit-policies\") pod \"oauth-openshift-7857967b8b-hdxkw\" (UID: \"1164f259-4f1f-498e-81a1-817747913204\") " pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.654817 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/1164f259-4f1f-498e-81a1-817747913204-v4-0-config-system-service-ca\") pod \"oauth-openshift-7857967b8b-hdxkw\" (UID: \"1164f259-4f1f-498e-81a1-817747913204\") " pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.654835 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1164f259-4f1f-498e-81a1-817747913204-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7857967b8b-hdxkw\" (UID: \"1164f259-4f1f-498e-81a1-817747913204\") " pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.654843 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1164f259-4f1f-498e-81a1-817747913204-audit-dir\") pod \"oauth-openshift-7857967b8b-hdxkw\" (UID: \"1164f259-4f1f-498e-81a1-817747913204\") " pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.654868 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1164f259-4f1f-498e-81a1-817747913204-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7857967b8b-hdxkw\" (UID: \"1164f259-4f1f-498e-81a1-817747913204\") " pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.655354 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1164f259-4f1f-498e-81a1-817747913204-v4-0-config-system-router-certs\") pod \"oauth-openshift-7857967b8b-hdxkw\" (UID: \"1164f259-4f1f-498e-81a1-817747913204\") " pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.656323 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1164f259-4f1f-498e-81a1-817747913204-audit-policies\") pod \"oauth-openshift-7857967b8b-hdxkw\" (UID: \"1164f259-4f1f-498e-81a1-817747913204\") " pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.656368 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1164f259-4f1f-498e-81a1-817747913204-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7857967b8b-hdxkw\" (UID: \"1164f259-4f1f-498e-81a1-817747913204\") " pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.656421 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1164f259-4f1f-498e-81a1-817747913204-v4-0-config-system-service-ca\") pod \"oauth-openshift-7857967b8b-hdxkw\" (UID: \"1164f259-4f1f-498e-81a1-817747913204\") " pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.656444 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/1164f259-4f1f-498e-81a1-817747913204-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7857967b8b-hdxkw\" (UID: \"1164f259-4f1f-498e-81a1-817747913204\") " pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.660324 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1164f259-4f1f-498e-81a1-817747913204-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7857967b8b-hdxkw\" (UID: \"1164f259-4f1f-498e-81a1-817747913204\") " pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.660818 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1164f259-4f1f-498e-81a1-817747913204-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7857967b8b-hdxkw\" (UID: \"1164f259-4f1f-498e-81a1-817747913204\") " pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.661023 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1164f259-4f1f-498e-81a1-817747913204-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7857967b8b-hdxkw\" (UID: \"1164f259-4f1f-498e-81a1-817747913204\") " pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.661267 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1164f259-4f1f-498e-81a1-817747913204-v4-0-config-system-router-certs\") pod \"oauth-openshift-7857967b8b-hdxkw\" (UID: \"1164f259-4f1f-498e-81a1-817747913204\") " pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.662280 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1164f259-4f1f-498e-81a1-817747913204-v4-0-config-system-session\") pod \"oauth-openshift-7857967b8b-hdxkw\" (UID: \"1164f259-4f1f-498e-81a1-817747913204\") " pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.671055 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1164f259-4f1f-498e-81a1-817747913204-v4-0-config-user-template-login\") pod \"oauth-openshift-7857967b8b-hdxkw\" (UID: \"1164f259-4f1f-498e-81a1-817747913204\") " pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.671280 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wk2s5\" (UniqueName: \"kubernetes.io/projected/1164f259-4f1f-498e-81a1-817747913204-kube-api-access-wk2s5\") pod \"oauth-openshift-7857967b8b-hdxkw\" (UID: \"1164f259-4f1f-498e-81a1-817747913204\") " pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.671820 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/1164f259-4f1f-498e-81a1-817747913204-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7857967b8b-hdxkw\" (UID: \"1164f259-4f1f-498e-81a1-817747913204\") " pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.676118 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1164f259-4f1f-498e-81a1-817747913204-v4-0-config-user-template-error\") pod \"oauth-openshift-7857967b8b-hdxkw\" (UID: \"1164f259-4f1f-498e-81a1-817747913204\") " pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.705958 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" Feb 20 09:58:52 crc kubenswrapper[4962]: I0220 09:58:52.891257 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7857967b8b-hdxkw"] Feb 20 09:58:52 crc kubenswrapper[4962]: W0220 09:58:52.900785 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1164f259_4f1f_498e_81a1_817747913204.slice/crio-390d1111bb162049705f085556ff83fb04174d392ca6938fc115e2d4e1649d75 WatchSource:0}: Error finding container 390d1111bb162049705f085556ff83fb04174d392ca6938fc115e2d4e1649d75: Status 404 returned error can't find the container with id 390d1111bb162049705f085556ff83fb04174d392ca6938fc115e2d4e1649d75 Feb 20 09:58:53 crc kubenswrapper[4962]: I0220 09:58:53.160910 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" event={"ID":"1164f259-4f1f-498e-81a1-817747913204","Type":"ContainerStarted","Data":"e29fc665c030d5623ab5549a121dd1b3a715ea255ab201968b029743202928c1"} Feb 20 09:58:53 crc kubenswrapper[4962]: I0220 09:58:53.161833 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" Feb 20 09:58:53 crc kubenswrapper[4962]: I0220 09:58:53.161973 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" event={"ID":"1164f259-4f1f-498e-81a1-817747913204","Type":"ContainerStarted","Data":"390d1111bb162049705f085556ff83fb04174d392ca6938fc115e2d4e1649d75"} Feb 20 09:58:53 crc kubenswrapper[4962]: I0220 09:58:53.163747 4962 patch_prober.go:28] interesting pod/oauth-openshift-7857967b8b-hdxkw container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.56:6443/healthz\": dial tcp 10.217.0.56:6443: connect: connection refused" start-of-body= Feb 20 09:58:53 crc kubenswrapper[4962]: I0220 09:58:53.163920 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" podUID="1164f259-4f1f-498e-81a1-817747913204" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.56:6443/healthz\": dial tcp 10.217.0.56:6443: connect: connection refused" Feb 20 09:58:53 crc kubenswrapper[4962]: I0220 09:58:53.185236 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" podStartSLOduration=35.185214901 podStartE2EDuration="35.185214901s" podCreationTimestamp="2026-02-20 09:58:18 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:58:53.181348434 +0000 UTC m=+224.763820280" watchObservedRunningTime="2026-02-20 09:58:53.185214901 +0000 UTC m=+224.767686757" Feb 20 09:58:54 crc kubenswrapper[4962]: I0220 09:58:54.173384 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-7857967b8b-hdxkw" Feb 20 09:59:05 crc kubenswrapper[4962]: I0220 09:59:05.568011 4962 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 20 09:59:05 crc kubenswrapper[4962]: I0220 09:59:05.569296 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 09:59:05 crc kubenswrapper[4962]: I0220 09:59:05.570728 4962 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 20 09:59:05 crc kubenswrapper[4962]: I0220 09:59:05.571122 4962 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 20 09:59:05 crc kubenswrapper[4962]: I0220 09:59:05.571302 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01" gracePeriod=15 Feb 20 09:59:05 crc kubenswrapper[4962]: I0220 09:59:05.571317 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5" gracePeriod=15 Feb 20 09:59:05 crc kubenswrapper[4962]: I0220 09:59:05.571294 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c" gracePeriod=15 Feb 20 09:59:05 crc kubenswrapper[4962]: E0220 09:59:05.571385 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 20 09:59:05 crc kubenswrapper[4962]: I0220 09:59:05.571422 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 20 09:59:05 crc kubenswrapper[4962]: E0220 09:59:05.571433 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 20 09:59:05 crc kubenswrapper[4962]: I0220 09:59:05.571438 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 20 09:59:05 crc kubenswrapper[4962]: E0220 09:59:05.571445 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 20 09:59:05 crc kubenswrapper[4962]: I0220 09:59:05.571452 4962 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 20 09:59:05 crc kubenswrapper[4962]: E0220 09:59:05.571464 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 20 09:59:05 crc kubenswrapper[4962]: I0220 09:59:05.571471 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 20 09:59:05 crc kubenswrapper[4962]: E0220 09:59:05.571482 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 20 09:59:05 crc kubenswrapper[4962]: I0220 09:59:05.571487 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 20 09:59:05 crc kubenswrapper[4962]: E0220 09:59:05.571496 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 20 09:59:05 crc kubenswrapper[4962]: I0220 09:59:05.571501 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 20 09:59:05 crc kubenswrapper[4962]: I0220 09:59:05.571354 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0" gracePeriod=15 Feb 20 09:59:05 crc kubenswrapper[4962]: I0220 09:59:05.571536 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc" gracePeriod=15 Feb 20 09:59:05 crc kubenswrapper[4962]: E0220 09:59:05.571512 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 20 09:59:05 crc kubenswrapper[4962]: I0220 09:59:05.571772 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 20 09:59:05 crc kubenswrapper[4962]: I0220 09:59:05.571926 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 20 09:59:05 crc kubenswrapper[4962]: I0220 09:59:05.571942 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 20 09:59:05 crc kubenswrapper[4962]: I0220 09:59:05.571955 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 20 09:59:05 crc kubenswrapper[4962]: I0220 09:59:05.571976 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 20 09:59:05 crc kubenswrapper[4962]: I0220 09:59:05.571989 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 20 09:59:05 crc kubenswrapper[4962]: I0220 09:59:05.572001 4962 
memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 20 09:59:05 crc kubenswrapper[4962]: I0220 09:59:05.615158 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 20 09:59:05 crc kubenswrapper[4962]: I0220 09:59:05.619386 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 09:59:05 crc kubenswrapper[4962]: I0220 09:59:05.619479 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 09:59:05 crc kubenswrapper[4962]: I0220 09:59:05.619514 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 09:59:05 crc kubenswrapper[4962]: I0220 09:59:05.619552 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 09:59:05 crc kubenswrapper[4962]: I0220 09:59:05.619574 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 09:59:05 crc kubenswrapper[4962]: I0220 09:59:05.619626 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 09:59:05 crc kubenswrapper[4962]: I0220 09:59:05.619659 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 09:59:05 crc kubenswrapper[4962]: I0220 09:59:05.619681 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 09:59:05 crc kubenswrapper[4962]: I0220 09:59:05.721525 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 09:59:05 crc kubenswrapper[4962]: I0220 09:59:05.721623 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 09:59:05 crc kubenswrapper[4962]: I0220 09:59:05.721683 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 09:59:05 crc kubenswrapper[4962]: I0220 09:59:05.721716 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 09:59:05 crc kubenswrapper[4962]: I0220 09:59:05.721786 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 09:59:05 crc kubenswrapper[4962]: I0220 09:59:05.721834 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 09:59:05 crc kubenswrapper[4962]: I0220 09:59:05.721865 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 09:59:05 crc kubenswrapper[4962]: I0220 09:59:05.721925 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 09:59:05 crc kubenswrapper[4962]: I0220 09:59:05.722023 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 09:59:05 crc kubenswrapper[4962]: I0220 09:59:05.722185 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 09:59:05 crc kubenswrapper[4962]: I0220 09:59:05.722241 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 09:59:05 crc kubenswrapper[4962]: I0220 09:59:05.722286 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 09:59:05 crc kubenswrapper[4962]: I0220 09:59:05.722330 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 09:59:05 crc kubenswrapper[4962]: I0220 09:59:05.722371 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 09:59:05 crc kubenswrapper[4962]: I0220 09:59:05.722415 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 09:59:05 crc kubenswrapper[4962]: I0220 09:59:05.722456 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 09:59:05 crc kubenswrapper[4962]: I0220 09:59:05.906033 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 09:59:05 crc kubenswrapper[4962]: E0220 09:59:05.936175 4962 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.103:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.1895ec07a01ca766 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-20 09:59:05.935316838 +0000 UTC m=+237.517788684,LastTimestamp:2026-02-20 09:59:05.935316838 +0000 UTC m=+237.517788684,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 20 09:59:06 crc kubenswrapper[4962]: I0220 09:59:06.236903 4962 generic.go:334] "Generic (PLEG): container finished" podID="55ba5b4b-9a58-40e7-a3a3-00764477f5a9" containerID="d915b56a671459092c2c1eb6c3a687d96ecc073838917251978e78628f894691" exitCode=0 Feb 20 09:59:06 crc kubenswrapper[4962]: I0220 09:59:06.237014 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"55ba5b4b-9a58-40e7-a3a3-00764477f5a9","Type":"ContainerDied","Data":"d915b56a671459092c2c1eb6c3a687d96ecc073838917251978e78628f894691"} Feb 20 09:59:06 crc kubenswrapper[4962]: I0220 09:59:06.238253 4962 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Feb 20 09:59:06 crc kubenswrapper[4962]: I0220 09:59:06.238794 4962 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Feb 20 09:59:06 crc kubenswrapper[4962]: I0220 09:59:06.239183 4962 status_manager.go:851] "Failed to get status for pod" podUID="55ba5b4b-9a58-40e7-a3a3-00764477f5a9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Feb 20 09:59:06 crc kubenswrapper[4962]: I0220 09:59:06.239362 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"77a18449aef0d858619c12a32e2d02b931cc3374be518db01b0791e8510c9ca7"} Feb 20 09:59:06 crc kubenswrapper[4962]: I0220 09:59:06.239398 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"7089e214c78fca8c3b4cb7548e34b64d3996ba2d47307a89e8dc936ed301704b"} Feb 20 09:59:06 crc kubenswrapper[4962]: I0220 09:59:06.239853 4962 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Feb 20 09:59:06 crc kubenswrapper[4962]: I0220 09:59:06.240463 4962 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Feb 20 09:59:06 crc kubenswrapper[4962]: I0220 09:59:06.241307 4962 status_manager.go:851] "Failed to get status for pod" podUID="55ba5b4b-9a58-40e7-a3a3-00764477f5a9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Feb 20 09:59:06 crc kubenswrapper[4962]: I0220 09:59:06.242981 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 20 09:59:06 crc kubenswrapper[4962]: I0220 09:59:06.244173 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 20 09:59:06 crc kubenswrapper[4962]: I0220 09:59:06.244682 4962 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0" exitCode=0 Feb 20 09:59:06 crc kubenswrapper[4962]: I0220 09:59:06.244702 4962 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c" exitCode=0 Feb 20 09:59:06 crc kubenswrapper[4962]: I0220 09:59:06.244708 4962 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01" exitCode=0 Feb 20 09:59:06 crc kubenswrapper[4962]: I0220 09:59:06.244715 4962 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5" exitCode=2 Feb 20 09:59:06 crc kubenswrapper[4962]: I0220 09:59:06.244735 4962 scope.go:117] "RemoveContainer" containerID="e3256d5e9aadc1c300c0388ad5b9682b874dd957f731bde6642c649b044b16ed" Feb 20 09:59:07 crc kubenswrapper[4962]: I0220 09:59:07.274399 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 20 09:59:07 crc kubenswrapper[4962]: I0220 09:59:07.562636 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 20 09:59:07 crc kubenswrapper[4962]: I0220 09:59:07.563757 4962 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Feb 20 09:59:07 crc kubenswrapper[4962]: I0220 09:59:07.564449 4962 status_manager.go:851] "Failed to get status for pod" podUID="55ba5b4b-9a58-40e7-a3a3-00764477f5a9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Feb 20 09:59:07 crc kubenswrapper[4962]: E0220 09:59:07.665630 4962 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.103:6443: connect: connection refused" Feb 20 09:59:07 crc kubenswrapper[4962]: E0220 09:59:07.666045 4962 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.103:6443: connect: connection refused" Feb 20 09:59:07 crc kubenswrapper[4962]: E0220 09:59:07.676534 4962 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.103:6443: connect: connection refused" Feb 20 09:59:07 crc kubenswrapper[4962]: E0220 09:59:07.679308 4962 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.103:6443: connect: connection refused" Feb 20 09:59:07 crc kubenswrapper[4962]: E0220 09:59:07.679632 4962 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.103:6443: connect: connection refused" Feb 20 09:59:07 crc kubenswrapper[4962]: I0220 09:59:07.679660 4962 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Feb 20 09:59:07 crc kubenswrapper[4962]: E0220 09:59:07.680041 4962 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.103:6443: connect: connection refused" interval="200ms" Feb 20 09:59:07 crc kubenswrapper[4962]: I0220 09:59:07.680346 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/55ba5b4b-9a58-40e7-a3a3-00764477f5a9-var-lock\") pod \"55ba5b4b-9a58-40e7-a3a3-00764477f5a9\" (UID: \"55ba5b4b-9a58-40e7-a3a3-00764477f5a9\") " Feb 20 09:59:07 crc kubenswrapper[4962]: I0220 09:59:07.680524 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/55ba5b4b-9a58-40e7-a3a3-00764477f5a9-kube-api-access\") pod \"55ba5b4b-9a58-40e7-a3a3-00764477f5a9\" (UID: 
\"55ba5b4b-9a58-40e7-a3a3-00764477f5a9\") " Feb 20 09:59:07 crc kubenswrapper[4962]: I0220 09:59:07.680728 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/55ba5b4b-9a58-40e7-a3a3-00764477f5a9-var-lock" (OuterVolumeSpecName: "var-lock") pod "55ba5b4b-9a58-40e7-a3a3-00764477f5a9" (UID: "55ba5b4b-9a58-40e7-a3a3-00764477f5a9"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 09:59:07 crc kubenswrapper[4962]: I0220 09:59:07.680748 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/55ba5b4b-9a58-40e7-a3a3-00764477f5a9-kubelet-dir\") pod \"55ba5b4b-9a58-40e7-a3a3-00764477f5a9\" (UID: \"55ba5b4b-9a58-40e7-a3a3-00764477f5a9\") " Feb 20 09:59:07 crc kubenswrapper[4962]: I0220 09:59:07.681230 4962 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/55ba5b4b-9a58-40e7-a3a3-00764477f5a9-var-lock\") on node \"crc\" DevicePath \"\"" Feb 20 09:59:07 crc kubenswrapper[4962]: I0220 09:59:07.681631 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/55ba5b4b-9a58-40e7-a3a3-00764477f5a9-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "55ba5b4b-9a58-40e7-a3a3-00764477f5a9" (UID: "55ba5b4b-9a58-40e7-a3a3-00764477f5a9"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 09:59:07 crc kubenswrapper[4962]: I0220 09:59:07.696245 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55ba5b4b-9a58-40e7-a3a3-00764477f5a9-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "55ba5b4b-9a58-40e7-a3a3-00764477f5a9" (UID: "55ba5b4b-9a58-40e7-a3a3-00764477f5a9"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:59:07 crc kubenswrapper[4962]: I0220 09:59:07.782676 4962 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/55ba5b4b-9a58-40e7-a3a3-00764477f5a9-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 20 09:59:07 crc kubenswrapper[4962]: I0220 09:59:07.782718 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/55ba5b4b-9a58-40e7-a3a3-00764477f5a9-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 20 09:59:07 crc kubenswrapper[4962]: E0220 09:59:07.882564 4962 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.103:6443: connect: connection refused" interval="400ms" Feb 20 09:59:07 crc kubenswrapper[4962]: I0220 09:59:07.980401 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 20 09:59:07 crc kubenswrapper[4962]: I0220 09:59:07.981505 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 09:59:07 crc kubenswrapper[4962]: I0220 09:59:07.982229 4962 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Feb 20 09:59:07 crc kubenswrapper[4962]: I0220 09:59:07.982559 4962 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Feb 20 09:59:07 crc kubenswrapper[4962]: I0220 09:59:07.983065 4962 status_manager.go:851] "Failed to get status for pod" podUID="55ba5b4b-9a58-40e7-a3a3-00764477f5a9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Feb 20 09:59:07 crc kubenswrapper[4962]: I0220 09:59:07.984396 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 20 09:59:07 crc kubenswrapper[4962]: I0220 09:59:07.984460 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 20 09:59:07 crc kubenswrapper[4962]: I0220 09:59:07.984490 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 20 09:59:07 crc kubenswrapper[4962]: I0220 09:59:07.984703 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 09:59:07 crc kubenswrapper[4962]: I0220 09:59:07.984757 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 09:59:07 crc kubenswrapper[4962]: I0220 09:59:07.984819 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 09:59:08 crc kubenswrapper[4962]: I0220 09:59:08.085453 4962 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Feb 20 09:59:08 crc kubenswrapper[4962]: I0220 09:59:08.085713 4962 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 20 09:59:08 crc kubenswrapper[4962]: I0220 09:59:08.085723 4962 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 20 09:59:08 crc kubenswrapper[4962]: E0220 09:59:08.283460 4962 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.103:6443: connect: connection refused" interval="800ms" Feb 20 09:59:08 crc kubenswrapper[4962]: I0220 09:59:08.285367 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 20 09:59:08 crc kubenswrapper[4962]: I0220 09:59:08.286785 4962 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc" exitCode=0 Feb 20 09:59:08 crc kubenswrapper[4962]: I0220 09:59:08.286931 4962 scope.go:117] "RemoveContainer" containerID="f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0" Feb 20 09:59:08 crc kubenswrapper[4962]: I0220 09:59:08.287008 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 09:59:08 crc kubenswrapper[4962]: I0220 09:59:08.289381 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"55ba5b4b-9a58-40e7-a3a3-00764477f5a9","Type":"ContainerDied","Data":"9b9dbdcac9ea0d9a44c9e69e43cace295320630fa87bf53a68f617de558a65af"} Feb 20 09:59:08 crc kubenswrapper[4962]: I0220 09:59:08.289417 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b9dbdcac9ea0d9a44c9e69e43cace295320630fa87bf53a68f617de558a65af" Feb 20 09:59:08 crc kubenswrapper[4962]: I0220 09:59:08.289484 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 20 09:59:08 crc kubenswrapper[4962]: I0220 09:59:08.310132 4962 scope.go:117] "RemoveContainer" containerID="1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c" Feb 20 09:59:08 crc kubenswrapper[4962]: I0220 09:59:08.316207 4962 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Feb 20 09:59:08 crc kubenswrapper[4962]: I0220 09:59:08.316689 4962 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Feb 20 09:59:08 crc kubenswrapper[4962]: I0220 09:59:08.317445 4962 status_manager.go:851] "Failed to get status for pod" podUID="55ba5b4b-9a58-40e7-a3a3-00764477f5a9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Feb 20 09:59:08 crc kubenswrapper[4962]: I0220 09:59:08.318177 4962 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Feb 20 09:59:08 crc kubenswrapper[4962]: I0220 09:59:08.318895 4962 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Feb 20 09:59:08 crc kubenswrapper[4962]: I0220 09:59:08.319312 4962 status_manager.go:851] "Failed to get status for pod" podUID="55ba5b4b-9a58-40e7-a3a3-00764477f5a9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Feb 20 09:59:08 crc kubenswrapper[4962]: I0220 09:59:08.330161 4962 scope.go:117] "RemoveContainer" containerID="c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01" Feb 20 09:59:08 crc kubenswrapper[4962]: I0220 09:59:08.352730 4962 scope.go:117] "RemoveContainer" containerID="fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5" Feb 20 09:59:08 crc kubenswrapper[4962]: I0220 09:59:08.370250 4962 scope.go:117] "RemoveContainer" containerID="8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc" Feb 20 09:59:08 crc kubenswrapper[4962]: I0220 09:59:08.392472 4962 scope.go:117] "RemoveContainer" containerID="6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04" Feb 20 09:59:08 crc kubenswrapper[4962]: I0220 09:59:08.424932 4962 scope.go:117] "RemoveContainer" containerID="f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0" Feb 20 09:59:08 crc 
kubenswrapper[4962]: E0220 09:59:08.425790 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0\": container with ID starting with f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0 not found: ID does not exist" containerID="f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0" Feb 20 09:59:08 crc kubenswrapper[4962]: I0220 09:59:08.425827 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0"} err="failed to get container status \"f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0\": rpc error: code = NotFound desc = could not find container \"f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0\": container with ID starting with f4c34879b544fe60d5e7a9e0efc88c33292d4f6dd52b3a6863ffd429bf0b91b0 not found: ID does not exist" Feb 20 09:59:08 crc kubenswrapper[4962]: I0220 09:59:08.425850 4962 scope.go:117] "RemoveContainer" containerID="1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c" Feb 20 09:59:08 crc kubenswrapper[4962]: E0220 09:59:08.426141 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c\": container with ID starting with 1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c not found: ID does not exist" containerID="1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c" Feb 20 09:59:08 crc kubenswrapper[4962]: I0220 09:59:08.426192 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c"} err="failed to get container status \"1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c\": rpc error: code = NotFound desc = could not find container \"1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c\": container with ID starting with 1d889717a218119d59e3c81c54f2223bce58d255f67d6e2f72535dcd71e7274c not found: ID does not exist" Feb 20 09:59:08 crc kubenswrapper[4962]: I0220 09:59:08.426211 4962 scope.go:117] "RemoveContainer" containerID="c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01" Feb 20 09:59:08 crc kubenswrapper[4962]: E0220 09:59:08.426557 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01\": container with ID starting with c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01 not found: ID does not exist" containerID="c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01" Feb 20 09:59:08 crc kubenswrapper[4962]: I0220 09:59:08.426698 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01"} err="failed to get container status \"c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01\": rpc error: code = NotFound desc = could not find container \"c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01\": container with ID starting with c4c5afbd1e10c9db692f39fa51d7cc54922933431495d1bb3224b830ee773a01 not found: ID does not exist" Feb 20 09:59:08 crc kubenswrapper[4962]: 
I0220 09:59:08.426829 4962 scope.go:117] "RemoveContainer" containerID="fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5" Feb 20 09:59:08 crc kubenswrapper[4962]: E0220 09:59:08.427511 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5\": container with ID starting with fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5 not found: ID does not exist" containerID="fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5" Feb 20 09:59:08 crc kubenswrapper[4962]: I0220 09:59:08.427645 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5"} err="failed to get container status \"fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5\": rpc error: code = NotFound desc = could not find container \"fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5\": container with ID starting with fe7253273ecf6239e1d596818ff222d1502f7e5ce3ac9cd695ee58e1596640e5 not found: ID does not exist" Feb 20 09:59:08 crc kubenswrapper[4962]: I0220 09:59:08.427740 4962 scope.go:117] "RemoveContainer" containerID="8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc" Feb 20 09:59:08 crc kubenswrapper[4962]: E0220 09:59:08.428685 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc\": container with ID starting with 8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc not found: ID does not exist" containerID="8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc" Feb 20 09:59:08 crc kubenswrapper[4962]: I0220 09:59:08.428738 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc"} err="failed to get container status \"8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc\": rpc error: code = NotFound desc = could not find container \"8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc\": container with ID starting with 8c6b31a1ad7fb51f86c160db56cbca34ca2510c90e75a137b92d07f9f81b06bc not found: ID does not exist" Feb 20 09:59:08 crc kubenswrapper[4962]: I0220 09:59:08.428779 4962 scope.go:117] "RemoveContainer" containerID="6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04" Feb 20 09:59:08 crc kubenswrapper[4962]: E0220 09:59:08.429378 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\": container with ID starting with 6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04 not found: ID does not exist" containerID="6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04" Feb 20 09:59:08 crc kubenswrapper[4962]: I0220 09:59:08.429464 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04"} err="failed to get container status \"6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\": rpc error: code = NotFound desc = could not find container \"6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04\": container 
with ID starting with 6f246d5c9a0125c6bd3cb3299352cf9ffc1361d11f14e72cd2b4989f386abc04 not found: ID does not exist" Feb 20 09:59:09 crc kubenswrapper[4962]: E0220 09:59:09.085549 4962 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.103:6443: connect: connection refused" interval="1.6s" Feb 20 09:59:09 crc kubenswrapper[4962]: I0220 09:59:09.142562 4962 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Feb 20 09:59:09 crc kubenswrapper[4962]: I0220 09:59:09.142939 4962 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Feb 20 09:59:09 crc kubenswrapper[4962]: I0220 09:59:09.143519 4962 status_manager.go:851] "Failed to get status for pod" podUID="55ba5b4b-9a58-40e7-a3a3-00764477f5a9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Feb 20 09:59:09 crc kubenswrapper[4962]: I0220 09:59:09.150464 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Feb 20 09:59:10 crc kubenswrapper[4962]: E0220 09:59:10.686741 4962 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.103:6443: connect: connection refused" interval="3.2s" Feb 20 09:59:13 crc kubenswrapper[4962]: E0220 09:59:13.888133 4962 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.103:6443: connect: connection refused" interval="6.4s" Feb 20 09:59:13 crc kubenswrapper[4962]: E0220 09:59:13.944882 4962 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.103:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.1895ec07a01ca766 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-20 09:59:05.935316838 +0000 UTC 
m=+237.517788684,LastTimestamp:2026-02-20 09:59:05.935316838 +0000 UTC m=+237.517788684,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 20 09:59:18 crc kubenswrapper[4962]: I0220 09:59:18.138717 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 09:59:18 crc kubenswrapper[4962]: I0220 09:59:18.140013 4962 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Feb 20 09:59:18 crc kubenswrapper[4962]: I0220 09:59:18.140715 4962 status_manager.go:851] "Failed to get status for pod" podUID="55ba5b4b-9a58-40e7-a3a3-00764477f5a9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Feb 20 09:59:18 crc kubenswrapper[4962]: I0220 09:59:18.157288 4962 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="deab583c-05c7-4b7e-a3f6-c01081b17127" Feb 20 09:59:18 crc kubenswrapper[4962]: I0220 09:59:18.157329 4962 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="deab583c-05c7-4b7e-a3f6-c01081b17127" Feb 20 09:59:18 crc kubenswrapper[4962]: E0220 09:59:18.158045 4962 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 09:59:18 crc kubenswrapper[4962]: I0220 09:59:18.158846 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 09:59:18 crc kubenswrapper[4962]: I0220 09:59:18.354272 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"c238ddfc7fab35ecb5482258da3b658be3500491461c24047354a98eb6a27f64"} Feb 20 09:59:19 crc kubenswrapper[4962]: I0220 09:59:19.148677 4962 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Feb 20 09:59:19 crc kubenswrapper[4962]: I0220 09:59:19.149650 4962 status_manager.go:851] "Failed to get status for pod" podUID="55ba5b4b-9a58-40e7-a3a3-00764477f5a9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Feb 20 09:59:19 crc kubenswrapper[4962]: I0220 09:59:19.149851 4962 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Feb 20 09:59:19 crc kubenswrapper[4962]: I0220 09:59:19.363017 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 20 09:59:19 crc kubenswrapper[4962]: I0220 09:59:19.363157 4962 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="1f4690722fa343ac9bbe83f1591ce49040d2b6cbe966fc31757944ba3befbbce" exitCode=1 Feb 20 09:59:19 crc kubenswrapper[4962]: I0220 09:59:19.363203 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"1f4690722fa343ac9bbe83f1591ce49040d2b6cbe966fc31757944ba3befbbce"} Feb 20 09:59:19 crc kubenswrapper[4962]: I0220 09:59:19.363840 4962 scope.go:117] "RemoveContainer" containerID="1f4690722fa343ac9bbe83f1591ce49040d2b6cbe966fc31757944ba3befbbce" Feb 20 09:59:19 crc kubenswrapper[4962]: I0220 09:59:19.364013 4962 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Feb 20 09:59:19 crc kubenswrapper[4962]: I0220 09:59:19.364503 4962 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Feb 20 09:59:19 crc kubenswrapper[4962]: I0220 09:59:19.365372 4962 status_manager.go:851] "Failed to get status for 
pod" podUID="55ba5b4b-9a58-40e7-a3a3-00764477f5a9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Feb 20 09:59:19 crc kubenswrapper[4962]: I0220 09:59:19.365548 4962 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="bb2b9882d8095d2704e3f8cba7c2cdd33ad274119271e735b71a0c38b3733d31" exitCode=0 Feb 20 09:59:19 crc kubenswrapper[4962]: I0220 09:59:19.365575 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"bb2b9882d8095d2704e3f8cba7c2cdd33ad274119271e735b71a0c38b3733d31"} Feb 20 09:59:19 crc kubenswrapper[4962]: I0220 09:59:19.365780 4962 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Feb 20 09:59:19 crc kubenswrapper[4962]: I0220 09:59:19.365902 4962 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="deab583c-05c7-4b7e-a3f6-c01081b17127" Feb 20 09:59:19 crc kubenswrapper[4962]: I0220 09:59:19.365931 4962 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="deab583c-05c7-4b7e-a3f6-c01081b17127" Feb 20 09:59:19 crc kubenswrapper[4962]: E0220 09:59:19.366256 4962 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 09:59:19 crc kubenswrapper[4962]: I0220 09:59:19.366576 4962 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Feb 20 09:59:19 crc kubenswrapper[4962]: I0220 09:59:19.366879 4962 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Feb 20 09:59:19 crc kubenswrapper[4962]: I0220 09:59:19.367163 4962 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Feb 20 09:59:19 crc kubenswrapper[4962]: I0220 09:59:19.367384 4962 status_manager.go:851] "Failed to get status for pod" podUID="55ba5b4b-9a58-40e7-a3a3-00764477f5a9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 
38.102.83.103:6443: connect: connection refused" Feb 20 09:59:20 crc kubenswrapper[4962]: I0220 09:59:20.380889 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"f9aec2bc3a9f2b3c8a19b0af95d99106f14a4ab629a1908b28f64cc2ef6d06d0"} Feb 20 09:59:20 crc kubenswrapper[4962]: I0220 09:59:20.381250 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"8f204e6a4667241d9c4421e17e7d6907b9abc4384380a51d11622983c8002b1b"} Feb 20 09:59:20 crc kubenswrapper[4962]: I0220 09:59:20.381267 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"260a8f62c826968d9b2df75e761a19b013bee69404c14d6dbde556f2879025e0"} Feb 20 09:59:20 crc kubenswrapper[4962]: I0220 09:59:20.381280 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"2cf6a1fa04cc10bc933a2532bafa890f666b3d24cd238d729a7466aad6819739"} Feb 20 09:59:20 crc kubenswrapper[4962]: I0220 09:59:20.389401 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 20 09:59:20 crc kubenswrapper[4962]: I0220 09:59:20.389563 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"9338aa40f3b1f6e7f18273465539b91c996c7687cd237637ca783e0d5f9e51a5"} Feb 20 09:59:21 crc kubenswrapper[4962]: I0220 09:59:21.399540 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"ebb3720158348b5d0d1f1ae9565aa45deed26ebb9a228e63f9bfe291fde16b7d"} Feb 20 09:59:21 crc kubenswrapper[4962]: I0220 09:59:21.400149 4962 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="deab583c-05c7-4b7e-a3f6-c01081b17127" Feb 20 09:59:21 crc kubenswrapper[4962]: I0220 09:59:21.400167 4962 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="deab583c-05c7-4b7e-a3f6-c01081b17127" Feb 20 09:59:21 crc kubenswrapper[4962]: I0220 09:59:21.400448 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 09:59:22 crc kubenswrapper[4962]: I0220 09:59:22.458070 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 20 09:59:22 crc kubenswrapper[4962]: I0220 09:59:22.458291 4962 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Feb 20 09:59:22 crc kubenswrapper[4962]: I0220 09:59:22.458328 4962 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Feb 20 09:59:23 crc kubenswrapper[4962]: I0220 09:59:23.159197 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 09:59:23 crc kubenswrapper[4962]: I0220 09:59:23.159260 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 09:59:23 crc kubenswrapper[4962]: I0220 09:59:23.168649 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 09:59:26 crc kubenswrapper[4962]: I0220 09:59:26.429431 4962 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 09:59:26 crc kubenswrapper[4962]: I0220 09:59:26.570407 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 20 09:59:27 crc kubenswrapper[4962]: I0220 09:59:27.438440 4962 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="deab583c-05c7-4b7e-a3f6-c01081b17127" Feb 20 09:59:27 crc kubenswrapper[4962]: I0220 09:59:27.438491 4962 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="deab583c-05c7-4b7e-a3f6-c01081b17127" Feb 20 09:59:27 crc kubenswrapper[4962]: I0220 09:59:27.445394 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 09:59:27 crc kubenswrapper[4962]: I0220 09:59:27.450281 4962 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="032c0e04-2eae-47f8-94e5-4e93feb99a65" Feb 20 09:59:28 crc kubenswrapper[4962]: I0220 09:59:28.443165 4962 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="deab583c-05c7-4b7e-a3f6-c01081b17127" Feb 20 09:59:28 crc kubenswrapper[4962]: I0220 09:59:28.443866 4962 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="deab583c-05c7-4b7e-a3f6-c01081b17127" Feb 20 09:59:29 crc kubenswrapper[4962]: I0220 09:59:29.173809 4962 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="032c0e04-2eae-47f8-94e5-4e93feb99a65" Feb 20 09:59:32 crc kubenswrapper[4962]: I0220 09:59:32.458210 4962 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Feb 20 09:59:32 crc kubenswrapper[4962]: I0220 09:59:32.458815 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 
192.168.126.11:10257: connect: connection refused" Feb 20 09:59:35 crc kubenswrapper[4962]: I0220 09:59:35.952512 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Feb 20 09:59:35 crc kubenswrapper[4962]: I0220 09:59:35.983408 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 20 09:59:36 crc kubenswrapper[4962]: I0220 09:59:36.708518 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 20 09:59:36 crc kubenswrapper[4962]: I0220 09:59:36.737246 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 20 09:59:36 crc kubenswrapper[4962]: I0220 09:59:36.763356 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 20 09:59:37 crc kubenswrapper[4962]: I0220 09:59:37.324210 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 20 09:59:37 crc kubenswrapper[4962]: I0220 09:59:37.554934 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 20 09:59:37 crc kubenswrapper[4962]: I0220 09:59:37.587780 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 20 09:59:38 crc kubenswrapper[4962]: I0220 09:59:38.167401 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 20 09:59:38 crc kubenswrapper[4962]: I0220 09:59:38.216424 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Feb 20 09:59:38 crc kubenswrapper[4962]: I0220 09:59:38.334644 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 20 09:59:38 crc kubenswrapper[4962]: I0220 09:59:38.464096 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 20 09:59:38 crc kubenswrapper[4962]: I0220 09:59:38.464550 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Feb 20 09:59:38 crc kubenswrapper[4962]: I0220 09:59:38.467188 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 20 09:59:38 crc kubenswrapper[4962]: I0220 09:59:38.609297 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 20 09:59:38 crc kubenswrapper[4962]: I0220 09:59:38.786329 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 20 09:59:38 crc kubenswrapper[4962]: I0220 09:59:38.793605 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 20 09:59:38 crc kubenswrapper[4962]: I0220 09:59:38.951755 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 20 09:59:38 crc kubenswrapper[4962]: I0220 09:59:38.975573 4962 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 20 09:59:39 crc kubenswrapper[4962]: I0220 09:59:39.003179 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 20 09:59:39 crc kubenswrapper[4962]: I0220 09:59:39.034436 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 20 09:59:39 crc kubenswrapper[4962]: I0220 09:59:39.273495 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 20 09:59:39 crc kubenswrapper[4962]: I0220 09:59:39.316623 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 20 09:59:39 crc kubenswrapper[4962]: I0220 09:59:39.474346 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 20 09:59:39 crc kubenswrapper[4962]: I0220 09:59:39.673316 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 20 09:59:39 crc kubenswrapper[4962]: I0220 09:59:39.716787 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 20 09:59:40 crc kubenswrapper[4962]: I0220 09:59:40.026473 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 20 09:59:40 crc kubenswrapper[4962]: I0220 09:59:40.351183 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 20 09:59:40 crc kubenswrapper[4962]: I0220 09:59:40.385931 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 20 09:59:40 crc kubenswrapper[4962]: I0220 09:59:40.421587 4962 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 20 09:59:40 crc kubenswrapper[4962]: I0220 09:59:40.611006 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 20 09:59:40 crc kubenswrapper[4962]: I0220 09:59:40.694734 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 20 09:59:40 crc kubenswrapper[4962]: I0220 09:59:40.696633 4962 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 20 09:59:40 crc kubenswrapper[4962]: I0220 09:59:40.703188 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=35.703150216 podStartE2EDuration="35.703150216s" podCreationTimestamp="2026-02-20 09:59:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:59:25.991504813 +0000 UTC m=+257.573976689" watchObservedRunningTime="2026-02-20 09:59:40.703150216 +0000 UTC m=+272.285622102" Feb 20 09:59:40 crc kubenswrapper[4962]: I0220 09:59:40.706166 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 20 09:59:40 crc kubenswrapper[4962]: I0220 09:59:40.706237 4962 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 20 09:59:40 crc kubenswrapper[4962]: I0220 09:59:40.711369 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 09:59:40 crc kubenswrapper[4962]: I0220 09:59:40.726951 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=14.726926615 podStartE2EDuration="14.726926615s" podCreationTimestamp="2026-02-20 09:59:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:59:40.723685433 +0000 UTC m=+272.306157299" watchObservedRunningTime="2026-02-20 09:59:40.726926615 +0000 UTC m=+272.309398471" Feb 20 09:59:40 crc kubenswrapper[4962]: I0220 09:59:40.790262 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 20 09:59:40 crc kubenswrapper[4962]: I0220 09:59:40.912479 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 20 09:59:40 crc kubenswrapper[4962]: I0220 09:59:40.913171 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 20 09:59:40 crc kubenswrapper[4962]: I0220 09:59:40.927852 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 20 09:59:40 crc kubenswrapper[4962]: I0220 09:59:40.970327 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Feb 20 09:59:40 crc kubenswrapper[4962]: I0220 09:59:40.976043 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Feb 20 09:59:40 crc kubenswrapper[4962]: I0220 09:59:40.993887 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 20 09:59:41 crc kubenswrapper[4962]: I0220 09:59:41.002338 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 20 09:59:41 crc kubenswrapper[4962]: I0220 09:59:41.008120 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 20 09:59:41 crc kubenswrapper[4962]: I0220 09:59:41.054752 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Feb 20 09:59:41 crc kubenswrapper[4962]: I0220 09:59:41.135774 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 20 09:59:41 crc kubenswrapper[4962]: I0220 09:59:41.218914 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Feb 20 09:59:41 crc kubenswrapper[4962]: I0220 09:59:41.473147 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 20 09:59:41 crc kubenswrapper[4962]: I0220 09:59:41.555740 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 20 09:59:41 crc kubenswrapper[4962]: I0220 09:59:41.749814 
4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 20 09:59:41 crc kubenswrapper[4962]: I0220 09:59:41.888885 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 20 09:59:41 crc kubenswrapper[4962]: I0220 09:59:41.906814 4962 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 20 09:59:42 crc kubenswrapper[4962]: I0220 09:59:42.043701 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 20 09:59:42 crc kubenswrapper[4962]: I0220 09:59:42.197857 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 20 09:59:42 crc kubenswrapper[4962]: I0220 09:59:42.318790 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 20 09:59:42 crc kubenswrapper[4962]: I0220 09:59:42.333933 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 20 09:59:42 crc kubenswrapper[4962]: I0220 09:59:42.376550 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Feb 20 09:59:42 crc kubenswrapper[4962]: I0220 09:59:42.458228 4962 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Feb 20 09:59:42 crc kubenswrapper[4962]: I0220 09:59:42.458359 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Feb 20 09:59:42 crc kubenswrapper[4962]: I0220 09:59:42.458452 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 20 09:59:42 crc kubenswrapper[4962]: I0220 09:59:42.459446 4962 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"9338aa40f3b1f6e7f18273465539b91c996c7687cd237637ca783e0d5f9e51a5"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container kube-controller-manager failed startup probe, will be restarted" Feb 20 09:59:42 crc kubenswrapper[4962]: I0220 09:59:42.459639 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" containerID="cri-o://9338aa40f3b1f6e7f18273465539b91c996c7687cd237637ca783e0d5f9e51a5" gracePeriod=30 Feb 20 09:59:42 crc kubenswrapper[4962]: I0220 09:59:42.471891 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 20 09:59:42 crc kubenswrapper[4962]: I0220 09:59:42.561239 4962 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 20 09:59:42 crc kubenswrapper[4962]: I0220 09:59:42.561415 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 20 09:59:42 crc kubenswrapper[4962]: I0220 09:59:42.577282 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Feb 20 09:59:42 crc kubenswrapper[4962]: I0220 09:59:42.578316 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 20 09:59:42 crc kubenswrapper[4962]: I0220 09:59:42.596755 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Feb 20 09:59:42 crc kubenswrapper[4962]: I0220 09:59:42.600512 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 20 09:59:42 crc kubenswrapper[4962]: I0220 09:59:42.605405 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 20 09:59:42 crc kubenswrapper[4962]: I0220 09:59:42.628113 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 20 09:59:42 crc kubenswrapper[4962]: I0220 09:59:42.678893 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 20 09:59:42 crc kubenswrapper[4962]: I0220 09:59:42.692704 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 20 09:59:42 crc kubenswrapper[4962]: I0220 09:59:42.758173 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 20 09:59:42 crc kubenswrapper[4962]: I0220 09:59:42.779817 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 20 09:59:42 crc kubenswrapper[4962]: I0220 09:59:42.812781 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 20 09:59:42 crc kubenswrapper[4962]: I0220 09:59:42.824470 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Feb 20 09:59:42 crc kubenswrapper[4962]: I0220 09:59:42.830881 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 20 09:59:42 crc kubenswrapper[4962]: I0220 09:59:42.875825 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 20 09:59:42 crc kubenswrapper[4962]: I0220 09:59:42.900901 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 20 09:59:42 crc kubenswrapper[4962]: I0220 09:59:42.914213 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Feb 20 09:59:42 crc kubenswrapper[4962]: I0220 09:59:42.917076 4962 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 20 09:59:42 crc kubenswrapper[4962]: I0220 09:59:42.921399 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 20 09:59:42 crc kubenswrapper[4962]: I0220 09:59:42.932808 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 20 09:59:42 crc kubenswrapper[4962]: I0220 09:59:42.959448 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 20 09:59:43 crc kubenswrapper[4962]: I0220 09:59:43.054217 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 20 09:59:43 crc kubenswrapper[4962]: I0220 09:59:43.098641 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 20 09:59:43 crc kubenswrapper[4962]: I0220 09:59:43.188392 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 20 09:59:43 crc kubenswrapper[4962]: I0220 09:59:43.291510 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 20 09:59:43 crc kubenswrapper[4962]: I0220 09:59:43.404080 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 20 09:59:43 crc kubenswrapper[4962]: I0220 09:59:43.449334 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 20 09:59:43 crc kubenswrapper[4962]: I0220 09:59:43.450729 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 20 09:59:43 crc kubenswrapper[4962]: I0220 09:59:43.475715 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 20 09:59:43 crc kubenswrapper[4962]: I0220 09:59:43.483701 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 20 09:59:43 crc kubenswrapper[4962]: I0220 09:59:43.679640 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 20 09:59:43 crc kubenswrapper[4962]: I0220 09:59:43.715500 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Feb 20 09:59:43 crc kubenswrapper[4962]: I0220 09:59:43.799360 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 20 09:59:43 crc kubenswrapper[4962]: I0220 09:59:43.823182 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 20 09:59:43 crc kubenswrapper[4962]: I0220 09:59:43.836573 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 20 09:59:43 crc kubenswrapper[4962]: I0220 09:59:43.842942 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 20 09:59:43 crc kubenswrapper[4962]: I0220 09:59:43.846003 4962 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-console"/"console-serving-cert" Feb 20 09:59:43 crc kubenswrapper[4962]: I0220 09:59:43.869776 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 20 09:59:43 crc kubenswrapper[4962]: I0220 09:59:43.887185 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 20 09:59:43 crc kubenswrapper[4962]: I0220 09:59:43.921477 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 20 09:59:43 crc kubenswrapper[4962]: I0220 09:59:43.934069 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Feb 20 09:59:43 crc kubenswrapper[4962]: I0220 09:59:43.943806 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Feb 20 09:59:44 crc kubenswrapper[4962]: I0220 09:59:44.151757 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Feb 20 09:59:44 crc kubenswrapper[4962]: I0220 09:59:44.160938 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 20 09:59:44 crc kubenswrapper[4962]: I0220 09:59:44.360677 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 20 09:59:44 crc kubenswrapper[4962]: I0220 09:59:44.395019 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 20 09:59:44 crc kubenswrapper[4962]: I0220 09:59:44.395103 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 20 09:59:44 crc kubenswrapper[4962]: I0220 09:59:44.424691 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Feb 20 09:59:44 crc kubenswrapper[4962]: I0220 09:59:44.480228 4962 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 20 09:59:44 crc kubenswrapper[4962]: I0220 09:59:44.509790 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 20 09:59:44 crc kubenswrapper[4962]: I0220 09:59:44.604109 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Feb 20 09:59:44 crc kubenswrapper[4962]: I0220 09:59:44.620499 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 20 09:59:44 crc kubenswrapper[4962]: I0220 09:59:44.658883 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 20 09:59:44 crc kubenswrapper[4962]: I0220 09:59:44.683769 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 20 09:59:44 crc kubenswrapper[4962]: I0220 09:59:44.687992 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 20 09:59:44 crc kubenswrapper[4962]: I0220 09:59:44.736744 4962 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 20 09:59:44 crc kubenswrapper[4962]: I0220 09:59:44.757633 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 20 09:59:44 crc kubenswrapper[4962]: I0220 09:59:44.801048 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 20 09:59:44 crc kubenswrapper[4962]: I0220 09:59:44.855110 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 20 09:59:44 crc kubenswrapper[4962]: I0220 09:59:44.914583 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 20 09:59:44 crc kubenswrapper[4962]: I0220 09:59:44.968197 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 20 09:59:44 crc kubenswrapper[4962]: I0220 09:59:44.981507 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 20 09:59:45 crc kubenswrapper[4962]: I0220 09:59:45.074125 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 20 09:59:45 crc kubenswrapper[4962]: I0220 09:59:45.077497 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 20 09:59:45 crc kubenswrapper[4962]: I0220 09:59:45.117821 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 20 09:59:45 crc kubenswrapper[4962]: I0220 09:59:45.154537 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 20 09:59:45 crc kubenswrapper[4962]: I0220 09:59:45.177339 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 20 09:59:45 crc kubenswrapper[4962]: I0220 09:59:45.184934 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Feb 20 09:59:45 crc kubenswrapper[4962]: I0220 09:59:45.240707 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 20 09:59:45 crc kubenswrapper[4962]: I0220 09:59:45.295078 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 20 09:59:45 crc kubenswrapper[4962]: I0220 09:59:45.497661 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Feb 20 09:59:45 crc kubenswrapper[4962]: I0220 09:59:45.503937 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Feb 20 09:59:45 crc kubenswrapper[4962]: I0220 09:59:45.523857 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Feb 20 09:59:45 crc kubenswrapper[4962]: I0220 09:59:45.531926 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 20 09:59:45 crc kubenswrapper[4962]: I0220 09:59:45.630560 4962 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 20 09:59:45 crc kubenswrapper[4962]: I0220 09:59:45.698205 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 20 09:59:45 crc kubenswrapper[4962]: I0220 09:59:45.720949 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 20 09:59:45 crc kubenswrapper[4962]: I0220 09:59:45.748940 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 20 09:59:45 crc kubenswrapper[4962]: I0220 09:59:45.789262 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 20 09:59:45 crc kubenswrapper[4962]: I0220 09:59:45.800890 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 20 09:59:45 crc kubenswrapper[4962]: I0220 09:59:45.815312 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 20 09:59:45 crc kubenswrapper[4962]: I0220 09:59:45.886490 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Feb 20 09:59:45 crc kubenswrapper[4962]: I0220 09:59:45.911326 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Feb 20 09:59:45 crc kubenswrapper[4962]: I0220 09:59:45.938437 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 20 09:59:45 crc kubenswrapper[4962]: I0220 09:59:45.992037 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 20 09:59:46 crc kubenswrapper[4962]: I0220 09:59:46.069234 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 20 09:59:46 crc kubenswrapper[4962]: I0220 09:59:46.079095 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 20 09:59:46 crc kubenswrapper[4962]: I0220 09:59:46.194019 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Feb 20 09:59:46 crc kubenswrapper[4962]: I0220 09:59:46.254619 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 20 09:59:46 crc kubenswrapper[4962]: I0220 09:59:46.257354 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 20 09:59:46 crc kubenswrapper[4962]: I0220 09:59:46.280285 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 20 09:59:46 crc kubenswrapper[4962]: I0220 09:59:46.304964 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 20 09:59:46 crc kubenswrapper[4962]: I0220 09:59:46.330507 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 20 09:59:46 crc kubenswrapper[4962]: I0220 09:59:46.341568 4962 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 20 09:59:46 crc kubenswrapper[4962]: I0220 09:59:46.343995 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 20 09:59:46 crc kubenswrapper[4962]: I0220 09:59:46.396499 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 20 09:59:46 crc kubenswrapper[4962]: I0220 09:59:46.423941 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 20 09:59:46 crc kubenswrapper[4962]: I0220 09:59:46.776103 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 20 09:59:46 crc kubenswrapper[4962]: I0220 09:59:46.799083 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 20 09:59:46 crc kubenswrapper[4962]: I0220 09:59:46.852696 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 20 09:59:46 crc kubenswrapper[4962]: I0220 09:59:46.852732 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 20 09:59:46 crc kubenswrapper[4962]: I0220 09:59:46.925518 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 20 09:59:46 crc kubenswrapper[4962]: I0220 09:59:46.935869 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 20 09:59:47 crc kubenswrapper[4962]: I0220 09:59:47.140770 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Feb 20 09:59:47 crc kubenswrapper[4962]: I0220 09:59:47.182838 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Feb 20 09:59:47 crc kubenswrapper[4962]: I0220 09:59:47.192148 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 20 09:59:47 crc kubenswrapper[4962]: I0220 09:59:47.214855 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 20 09:59:47 crc kubenswrapper[4962]: I0220 09:59:47.312098 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 20 09:59:47 crc kubenswrapper[4962]: I0220 09:59:47.336532 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 20 09:59:47 crc kubenswrapper[4962]: I0220 09:59:47.405309 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 20 09:59:47 crc kubenswrapper[4962]: I0220 09:59:47.489883 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 20 09:59:47 crc kubenswrapper[4962]: I0220 09:59:47.512100 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 20 09:59:47 crc kubenswrapper[4962]: I0220 
09:59:47.540540 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Feb 20 09:59:47 crc kubenswrapper[4962]: I0220 09:59:47.577909 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 20 09:59:47 crc kubenswrapper[4962]: I0220 09:59:47.578039 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Feb 20 09:59:47 crc kubenswrapper[4962]: I0220 09:59:47.638449 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 20 09:59:47 crc kubenswrapper[4962]: I0220 09:59:47.643328 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 20 09:59:47 crc kubenswrapper[4962]: I0220 09:59:47.658855 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 20 09:59:47 crc kubenswrapper[4962]: I0220 09:59:47.662045 4962 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 20 09:59:47 crc kubenswrapper[4962]: I0220 09:59:47.689994 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 20 09:59:47 crc kubenswrapper[4962]: I0220 09:59:47.719269 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Feb 20 09:59:47 crc kubenswrapper[4962]: I0220 09:59:47.751201 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Feb 20 09:59:47 crc kubenswrapper[4962]: I0220 09:59:47.926798 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 20 09:59:47 crc kubenswrapper[4962]: I0220 09:59:47.956465 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Feb 20 09:59:47 crc kubenswrapper[4962]: I0220 09:59:47.989290 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 20 09:59:48 crc kubenswrapper[4962]: I0220 09:59:48.012043 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 20 09:59:48 crc kubenswrapper[4962]: I0220 09:59:48.126410 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Feb 20 09:59:48 crc kubenswrapper[4962]: I0220 09:59:48.149156 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 20 09:59:48 crc kubenswrapper[4962]: I0220 09:59:48.253691 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 20 09:59:48 crc kubenswrapper[4962]: I0220 09:59:48.308802 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 20 09:59:48 crc kubenswrapper[4962]: I0220 09:59:48.314766 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 20 09:59:48 crc kubenswrapper[4962]: I0220 09:59:48.327278 4962 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 20 09:59:48 crc kubenswrapper[4962]: I0220 09:59:48.330885 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 20 09:59:48 crc kubenswrapper[4962]: I0220 09:59:48.395697 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 20 09:59:48 crc kubenswrapper[4962]: I0220 09:59:48.627416 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 20 09:59:48 crc kubenswrapper[4962]: I0220 09:59:48.647109 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 20 09:59:48 crc kubenswrapper[4962]: I0220 09:59:48.667768 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 20 09:59:48 crc kubenswrapper[4962]: I0220 09:59:48.675977 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Feb 20 09:59:48 crc kubenswrapper[4962]: I0220 09:59:48.690567 4962 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 20 09:59:48 crc kubenswrapper[4962]: I0220 09:59:48.690825 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://77a18449aef0d858619c12a32e2d02b931cc3374be518db01b0791e8510c9ca7" gracePeriod=5 Feb 20 09:59:48 crc kubenswrapper[4962]: I0220 09:59:48.844676 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 20 09:59:48 crc kubenswrapper[4962]: I0220 09:59:48.885717 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 20 09:59:49 crc kubenswrapper[4962]: I0220 09:59:49.000902 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 20 09:59:49 crc kubenswrapper[4962]: I0220 09:59:49.018892 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 20 09:59:49 crc kubenswrapper[4962]: I0220 09:59:49.075835 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 20 09:59:49 crc kubenswrapper[4962]: I0220 09:59:49.107442 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 20 09:59:49 crc kubenswrapper[4962]: I0220 09:59:49.113638 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 20 09:59:49 crc kubenswrapper[4962]: I0220 09:59:49.164488 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 20 09:59:49 crc kubenswrapper[4962]: I0220 09:59:49.165259 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Feb 20 09:59:49 crc kubenswrapper[4962]: I0220 09:59:49.232203 4962 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-multus"/"openshift-service-ca.crt" Feb 20 09:59:49 crc kubenswrapper[4962]: I0220 09:59:49.270038 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 20 09:59:49 crc kubenswrapper[4962]: I0220 09:59:49.307826 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 20 09:59:49 crc kubenswrapper[4962]: I0220 09:59:49.412024 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 20 09:59:49 crc kubenswrapper[4962]: I0220 09:59:49.427829 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 20 09:59:49 crc kubenswrapper[4962]: I0220 09:59:49.478722 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 20 09:59:49 crc kubenswrapper[4962]: I0220 09:59:49.480743 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 20 09:59:49 crc kubenswrapper[4962]: I0220 09:59:49.602791 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Feb 20 09:59:49 crc kubenswrapper[4962]: I0220 09:59:49.732868 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 20 09:59:49 crc kubenswrapper[4962]: I0220 09:59:49.768503 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 20 09:59:49 crc kubenswrapper[4962]: I0220 09:59:49.768773 4962 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 20 09:59:49 crc kubenswrapper[4962]: I0220 09:59:49.811555 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 20 09:59:49 crc kubenswrapper[4962]: I0220 09:59:49.822198 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 20 09:59:49 crc kubenswrapper[4962]: I0220 09:59:49.988530 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 20 09:59:50 crc kubenswrapper[4962]: I0220 09:59:50.063629 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 20 09:59:50 crc kubenswrapper[4962]: I0220 09:59:50.082702 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 20 09:59:50 crc kubenswrapper[4962]: I0220 09:59:50.173724 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Feb 20 09:59:50 crc kubenswrapper[4962]: I0220 09:59:50.194033 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 20 09:59:50 crc kubenswrapper[4962]: I0220 09:59:50.357405 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 20 09:59:50 crc kubenswrapper[4962]: I0220 09:59:50.474743 4962 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 20 09:59:50 crc kubenswrapper[4962]: I0220 09:59:50.483666 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Feb 20 09:59:50 crc kubenswrapper[4962]: I0220 09:59:50.593927 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 20 09:59:50 crc kubenswrapper[4962]: I0220 09:59:50.598398 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 20 09:59:50 crc kubenswrapper[4962]: I0220 09:59:50.612229 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 20 09:59:50 crc kubenswrapper[4962]: I0220 09:59:50.733989 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 20 09:59:51 crc kubenswrapper[4962]: I0220 09:59:51.003548 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 20 09:59:51 crc kubenswrapper[4962]: I0220 09:59:51.127623 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 20 09:59:51 crc kubenswrapper[4962]: I0220 09:59:51.251362 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 20 09:59:51 crc kubenswrapper[4962]: I0220 09:59:51.264536 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 20 09:59:51 crc kubenswrapper[4962]: I0220 09:59:51.272523 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Feb 20 09:59:51 crc kubenswrapper[4962]: I0220 09:59:51.310325 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 20 09:59:51 crc kubenswrapper[4962]: I0220 09:59:51.327457 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 20 09:59:51 crc kubenswrapper[4962]: I0220 09:59:51.639638 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 20 09:59:51 crc kubenswrapper[4962]: I0220 09:59:51.667115 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 20 09:59:51 crc kubenswrapper[4962]: I0220 09:59:51.909663 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 20 09:59:51 crc kubenswrapper[4962]: I0220 09:59:51.909677 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 20 09:59:51 crc kubenswrapper[4962]: I0220 09:59:51.991468 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 20 09:59:52 crc kubenswrapper[4962]: I0220 09:59:52.259830 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 20 09:59:52 crc kubenswrapper[4962]: I0220 09:59:52.319957 4962 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 20 09:59:52 crc kubenswrapper[4962]: I0220 09:59:52.402104 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 20 09:59:52 crc kubenswrapper[4962]: I0220 09:59:52.640582 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Feb 20 09:59:52 crc kubenswrapper[4962]: I0220 09:59:52.670468 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Feb 20 09:59:53 crc kubenswrapper[4962]: I0220 09:59:53.023621 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Feb 20 09:59:53 crc kubenswrapper[4962]: I0220 09:59:53.092391 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 20 09:59:53 crc kubenswrapper[4962]: I0220 09:59:53.313427 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 20 09:59:54 crc kubenswrapper[4962]: I0220 09:59:54.380584 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 20 09:59:54 crc kubenswrapper[4962]: I0220 09:59:54.380671 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 09:59:54 crc kubenswrapper[4962]: I0220 09:59:54.520016 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 20 09:59:54 crc kubenswrapper[4962]: I0220 09:59:54.520154 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 20 09:59:54 crc kubenswrapper[4962]: I0220 09:59:54.520192 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 20 09:59:54 crc kubenswrapper[4962]: I0220 09:59:54.520181 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 09:59:54 crc kubenswrapper[4962]: I0220 09:59:54.520216 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 20 09:59:54 crc kubenswrapper[4962]: I0220 09:59:54.520298 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 09:59:54 crc kubenswrapper[4962]: I0220 09:59:54.520322 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 09:59:54 crc kubenswrapper[4962]: I0220 09:59:54.520465 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 20 09:59:54 crc kubenswrapper[4962]: I0220 09:59:54.520544 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 09:59:54 crc kubenswrapper[4962]: I0220 09:59:54.521121 4962 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 20 09:59:54 crc kubenswrapper[4962]: I0220 09:59:54.521166 4962 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Feb 20 09:59:54 crc kubenswrapper[4962]: I0220 09:59:54.521185 4962 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Feb 20 09:59:54 crc kubenswrapper[4962]: I0220 09:59:54.521204 4962 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Feb 20 09:59:54 crc kubenswrapper[4962]: I0220 09:59:54.534016 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 09:59:54 crc kubenswrapper[4962]: I0220 09:59:54.613387 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 20 09:59:54 crc kubenswrapper[4962]: I0220 09:59:54.622066 4962 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 20 09:59:54 crc kubenswrapper[4962]: I0220 09:59:54.627113 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 20 09:59:54 crc kubenswrapper[4962]: I0220 09:59:54.627260 4962 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="77a18449aef0d858619c12a32e2d02b931cc3374be518db01b0791e8510c9ca7" exitCode=137 Feb 20 09:59:54 crc kubenswrapper[4962]: I0220 09:59:54.627350 4962 scope.go:117] "RemoveContainer" containerID="77a18449aef0d858619c12a32e2d02b931cc3374be518db01b0791e8510c9ca7" Feb 20 09:59:54 crc kubenswrapper[4962]: I0220 09:59:54.627809 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 09:59:54 crc kubenswrapper[4962]: I0220 09:59:54.649251 4962 scope.go:117] "RemoveContainer" containerID="77a18449aef0d858619c12a32e2d02b931cc3374be518db01b0791e8510c9ca7" Feb 20 09:59:54 crc kubenswrapper[4962]: E0220 09:59:54.649806 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77a18449aef0d858619c12a32e2d02b931cc3374be518db01b0791e8510c9ca7\": container with ID starting with 77a18449aef0d858619c12a32e2d02b931cc3374be518db01b0791e8510c9ca7 not found: ID does not exist" containerID="77a18449aef0d858619c12a32e2d02b931cc3374be518db01b0791e8510c9ca7" Feb 20 09:59:54 crc kubenswrapper[4962]: I0220 09:59:54.649838 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77a18449aef0d858619c12a32e2d02b931cc3374be518db01b0791e8510c9ca7"} err="failed to get container status \"77a18449aef0d858619c12a32e2d02b931cc3374be518db01b0791e8510c9ca7\": rpc error: code = NotFound desc = could not find container \"77a18449aef0d858619c12a32e2d02b931cc3374be518db01b0791e8510c9ca7\": container with ID starting with 77a18449aef0d858619c12a32e2d02b931cc3374be518db01b0791e8510c9ca7 not found: ID does not exist" Feb 20 09:59:54 crc kubenswrapper[4962]: I0220 09:59:54.786721 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 20 09:59:55 crc kubenswrapper[4962]: I0220 09:59:55.146839 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Feb 20 09:59:55 crc kubenswrapper[4962]: I0220 09:59:55.147130 4962 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Feb 20 09:59:55 crc kubenswrapper[4962]: I0220 09:59:55.157966 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 20 09:59:55 crc kubenswrapper[4962]: I0220 09:59:55.158002 4962 kubelet.go:2649] "Unable to find pod for mirror 
pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="35a53ee2-5747-4fe8-89c7-97453524e674" Feb 20 09:59:55 crc kubenswrapper[4962]: I0220 09:59:55.161911 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 20 09:59:55 crc kubenswrapper[4962]: I0220 09:59:55.161954 4962 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="35a53ee2-5747-4fe8-89c7-97453524e674" Feb 20 09:59:55 crc kubenswrapper[4962]: I0220 09:59:55.418788 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 20 10:00:05 crc kubenswrapper[4962]: I0220 10:00:05.875554 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sxxjg"] Feb 20 10:00:05 crc kubenswrapper[4962]: I0220 10:00:05.877888 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-sxxjg" podUID="ee660135-f5e2-420e-a242-440471e57da2" containerName="registry-server" containerID="cri-o://c35279cd4b22f47dcf9a72e50cad12450bbbfefd1eb056f72fee9a7b914b6849" gracePeriod=30 Feb 20 10:00:05 crc kubenswrapper[4962]: I0220 10:00:05.881467 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6q5bk"] Feb 20 10:00:05 crc kubenswrapper[4962]: I0220 10:00:05.881955 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-6q5bk" podUID="e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b" containerName="registry-server" containerID="cri-o://ba06f09e9e64dd347907637b7f6269c868262f4e7992b5119a5d49ae51e79a9b" gracePeriod=30 Feb 20 10:00:05 crc kubenswrapper[4962]: I0220 10:00:05.896353 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-m7z5r"] Feb 20 10:00:05 crc kubenswrapper[4962]: I0220 10:00:05.896626 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-m7z5r" podUID="bfd57a5c-0892-46a0-8005-0a8f70c146fd" containerName="marketplace-operator" containerID="cri-o://3f40bf15a59c7c1070b5bd6c7194b1d261fd486ea26ee5fe5a2136eea6b42104" gracePeriod=30 Feb 20 10:00:05 crc kubenswrapper[4962]: I0220 10:00:05.901737 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-x4hxs"] Feb 20 10:00:05 crc kubenswrapper[4962]: I0220 10:00:05.902007 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-x4hxs" podUID="2f414667-865d-4c89-b470-50f61a11b60e" containerName="registry-server" containerID="cri-o://4ce202551bb2eddc69f068b448a322418266ac5a3082344b5fcff013fbcae786" gracePeriod=30 Feb 20 10:00:05 crc kubenswrapper[4962]: I0220 10:00:05.905553 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zwkjb"] Feb 20 10:00:05 crc kubenswrapper[4962]: I0220 10:00:05.907730 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zwkjb" podUID="77564a1c-aefc-4caf-86d9-55c2ef795bb7" containerName="registry-server" containerID="cri-o://24817e71e406b99f371ea185e4598cf17fd11efcec8292441f76250e784ed315" gracePeriod=30 Feb 20 10:00:06 crc kubenswrapper[4962]: 
I0220 10:00:06.327817 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6q5bk" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.334021 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sxxjg" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.338719 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zwkjb" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.341914 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x4hxs" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.344460 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-m7z5r" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.474403 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wpnrc\" (UniqueName: \"kubernetes.io/projected/77564a1c-aefc-4caf-86d9-55c2ef795bb7-kube-api-access-wpnrc\") pod \"77564a1c-aefc-4caf-86d9-55c2ef795bb7\" (UID: \"77564a1c-aefc-4caf-86d9-55c2ef795bb7\") " Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.474459 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b-utilities\") pod \"e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b\" (UID: \"e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b\") " Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.474510 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee660135-f5e2-420e-a242-440471e57da2-catalog-content\") pod \"ee660135-f5e2-420e-a242-440471e57da2\" (UID: \"ee660135-f5e2-420e-a242-440471e57da2\") " Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.474558 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/bfd57a5c-0892-46a0-8005-0a8f70c146fd-marketplace-operator-metrics\") pod \"bfd57a5c-0892-46a0-8005-0a8f70c146fd\" (UID: \"bfd57a5c-0892-46a0-8005-0a8f70c146fd\") " Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.474578 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rt6vz\" (UniqueName: \"kubernetes.io/projected/bfd57a5c-0892-46a0-8005-0a8f70c146fd-kube-api-access-rt6vz\") pod \"bfd57a5c-0892-46a0-8005-0a8f70c146fd\" (UID: \"bfd57a5c-0892-46a0-8005-0a8f70c146fd\") " Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.474623 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4dxf6\" (UniqueName: \"kubernetes.io/projected/2f414667-865d-4c89-b470-50f61a11b60e-kube-api-access-4dxf6\") pod \"2f414667-865d-4c89-b470-50f61a11b60e\" (UID: \"2f414667-865d-4c89-b470-50f61a11b60e\") " Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.474670 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vspvg\" (UniqueName: \"kubernetes.io/projected/e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b-kube-api-access-vspvg\") pod \"e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b\" (UID: \"e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b\") " Feb 20 10:00:06 crc kubenswrapper[4962]: 
I0220 10:00:06.474698 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77564a1c-aefc-4caf-86d9-55c2ef795bb7-utilities\") pod \"77564a1c-aefc-4caf-86d9-55c2ef795bb7\" (UID: \"77564a1c-aefc-4caf-86d9-55c2ef795bb7\") " Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.474728 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hqtxr\" (UniqueName: \"kubernetes.io/projected/ee660135-f5e2-420e-a242-440471e57da2-kube-api-access-hqtxr\") pod \"ee660135-f5e2-420e-a242-440471e57da2\" (UID: \"ee660135-f5e2-420e-a242-440471e57da2\") " Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.474768 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b-catalog-content\") pod \"e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b\" (UID: \"e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b\") " Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.474801 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f414667-865d-4c89-b470-50f61a11b60e-catalog-content\") pod \"2f414667-865d-4c89-b470-50f61a11b60e\" (UID: \"2f414667-865d-4c89-b470-50f61a11b60e\") " Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.474820 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee660135-f5e2-420e-a242-440471e57da2-utilities\") pod \"ee660135-f5e2-420e-a242-440471e57da2\" (UID: \"ee660135-f5e2-420e-a242-440471e57da2\") " Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.474846 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bfd57a5c-0892-46a0-8005-0a8f70c146fd-marketplace-trusted-ca\") pod \"bfd57a5c-0892-46a0-8005-0a8f70c146fd\" (UID: \"bfd57a5c-0892-46a0-8005-0a8f70c146fd\") " Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.474866 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77564a1c-aefc-4caf-86d9-55c2ef795bb7-catalog-content\") pod \"77564a1c-aefc-4caf-86d9-55c2ef795bb7\" (UID: \"77564a1c-aefc-4caf-86d9-55c2ef795bb7\") " Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.474887 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f414667-865d-4c89-b470-50f61a11b60e-utilities\") pod \"2f414667-865d-4c89-b470-50f61a11b60e\" (UID: \"2f414667-865d-4c89-b470-50f61a11b60e\") " Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.478069 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b-utilities" (OuterVolumeSpecName: "utilities") pod "e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b" (UID: "e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.478267 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bfd57a5c-0892-46a0-8005-0a8f70c146fd-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "bfd57a5c-0892-46a0-8005-0a8f70c146fd" (UID: "bfd57a5c-0892-46a0-8005-0a8f70c146fd"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.478667 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee660135-f5e2-420e-a242-440471e57da2-utilities" (OuterVolumeSpecName: "utilities") pod "ee660135-f5e2-420e-a242-440471e57da2" (UID: "ee660135-f5e2-420e-a242-440471e57da2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.479783 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77564a1c-aefc-4caf-86d9-55c2ef795bb7-utilities" (OuterVolumeSpecName: "utilities") pod "77564a1c-aefc-4caf-86d9-55c2ef795bb7" (UID: "77564a1c-aefc-4caf-86d9-55c2ef795bb7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.481017 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f414667-865d-4c89-b470-50f61a11b60e-utilities" (OuterVolumeSpecName: "utilities") pod "2f414667-865d-4c89-b470-50f61a11b60e" (UID: "2f414667-865d-4c89-b470-50f61a11b60e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.483999 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f414667-865d-4c89-b470-50f61a11b60e-kube-api-access-4dxf6" (OuterVolumeSpecName: "kube-api-access-4dxf6") pod "2f414667-865d-4c89-b470-50f61a11b60e" (UID: "2f414667-865d-4c89-b470-50f61a11b60e"). InnerVolumeSpecName "kube-api-access-4dxf6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.484477 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee660135-f5e2-420e-a242-440471e57da2-kube-api-access-hqtxr" (OuterVolumeSpecName: "kube-api-access-hqtxr") pod "ee660135-f5e2-420e-a242-440471e57da2" (UID: "ee660135-f5e2-420e-a242-440471e57da2"). InnerVolumeSpecName "kube-api-access-hqtxr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.486006 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfd57a5c-0892-46a0-8005-0a8f70c146fd-kube-api-access-rt6vz" (OuterVolumeSpecName: "kube-api-access-rt6vz") pod "bfd57a5c-0892-46a0-8005-0a8f70c146fd" (UID: "bfd57a5c-0892-46a0-8005-0a8f70c146fd"). InnerVolumeSpecName "kube-api-access-rt6vz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.486328 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b-kube-api-access-vspvg" (OuterVolumeSpecName: "kube-api-access-vspvg") pod "e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b" (UID: "e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b"). 
InnerVolumeSpecName "kube-api-access-vspvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.488298 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfd57a5c-0892-46a0-8005-0a8f70c146fd-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "bfd57a5c-0892-46a0-8005-0a8f70c146fd" (UID: "bfd57a5c-0892-46a0-8005-0a8f70c146fd"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.488342 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77564a1c-aefc-4caf-86d9-55c2ef795bb7-kube-api-access-wpnrc" (OuterVolumeSpecName: "kube-api-access-wpnrc") pod "77564a1c-aefc-4caf-86d9-55c2ef795bb7" (UID: "77564a1c-aefc-4caf-86d9-55c2ef795bb7"). InnerVolumeSpecName "kube-api-access-wpnrc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.510388 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f414667-865d-4c89-b470-50f61a11b60e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2f414667-865d-4c89-b470-50f61a11b60e" (UID: "2f414667-865d-4c89-b470-50f61a11b60e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.550027 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee660135-f5e2-420e-a242-440471e57da2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ee660135-f5e2-420e-a242-440471e57da2" (UID: "ee660135-f5e2-420e-a242-440471e57da2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.576096 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee660135-f5e2-420e-a242-440471e57da2-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.576127 4962 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/bfd57a5c-0892-46a0-8005-0a8f70c146fd-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.576145 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rt6vz\" (UniqueName: \"kubernetes.io/projected/bfd57a5c-0892-46a0-8005-0a8f70c146fd-kube-api-access-rt6vz\") on node \"crc\" DevicePath \"\"" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.576157 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4dxf6\" (UniqueName: \"kubernetes.io/projected/2f414667-865d-4c89-b470-50f61a11b60e-kube-api-access-4dxf6\") on node \"crc\" DevicePath \"\"" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.576168 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vspvg\" (UniqueName: \"kubernetes.io/projected/e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b-kube-api-access-vspvg\") on node \"crc\" DevicePath \"\"" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.576181 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77564a1c-aefc-4caf-86d9-55c2ef795bb7-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.576191 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hqtxr\" (UniqueName: \"kubernetes.io/projected/ee660135-f5e2-420e-a242-440471e57da2-kube-api-access-hqtxr\") on node \"crc\" DevicePath \"\"" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.576203 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f414667-865d-4c89-b470-50f61a11b60e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.576218 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee660135-f5e2-420e-a242-440471e57da2-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.576229 4962 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bfd57a5c-0892-46a0-8005-0a8f70c146fd-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.576270 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f414667-865d-4c89-b470-50f61a11b60e-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.576283 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wpnrc\" (UniqueName: \"kubernetes.io/projected/77564a1c-aefc-4caf-86d9-55c2ef795bb7-kube-api-access-wpnrc\") on node \"crc\" DevicePath \"\"" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.576294 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.585910 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b" (UID: "e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.629855 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77564a1c-aefc-4caf-86d9-55c2ef795bb7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "77564a1c-aefc-4caf-86d9-55c2ef795bb7" (UID: "77564a1c-aefc-4caf-86d9-55c2ef795bb7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.677915 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.677963 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77564a1c-aefc-4caf-86d9-55c2ef795bb7-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.704713 4962 generic.go:334] "Generic (PLEG): container finished" podID="bfd57a5c-0892-46a0-8005-0a8f70c146fd" containerID="3f40bf15a59c7c1070b5bd6c7194b1d261fd486ea26ee5fe5a2136eea6b42104" exitCode=0 Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.704830 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-m7z5r" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.704866 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-m7z5r" event={"ID":"bfd57a5c-0892-46a0-8005-0a8f70c146fd","Type":"ContainerDied","Data":"3f40bf15a59c7c1070b5bd6c7194b1d261fd486ea26ee5fe5a2136eea6b42104"} Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.705035 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-m7z5r" event={"ID":"bfd57a5c-0892-46a0-8005-0a8f70c146fd","Type":"ContainerDied","Data":"c81440f2bd45daadf6efa1fe9a3de8fa8cfa794ff12c8106c2aad73b69faa130"} Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.705077 4962 scope.go:117] "RemoveContainer" containerID="3f40bf15a59c7c1070b5bd6c7194b1d261fd486ea26ee5fe5a2136eea6b42104" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.709236 4962 generic.go:334] "Generic (PLEG): container finished" podID="2f414667-865d-4c89-b470-50f61a11b60e" containerID="4ce202551bb2eddc69f068b448a322418266ac5a3082344b5fcff013fbcae786" exitCode=0 Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.709331 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x4hxs" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.709331 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x4hxs" event={"ID":"2f414667-865d-4c89-b470-50f61a11b60e","Type":"ContainerDied","Data":"4ce202551bb2eddc69f068b448a322418266ac5a3082344b5fcff013fbcae786"} Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.709427 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x4hxs" event={"ID":"2f414667-865d-4c89-b470-50f61a11b60e","Type":"ContainerDied","Data":"4ba52fa324168e2ee08b42cdefbfc041b14744aa5c09a51cbc5628b6f08e9f57"} Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.715778 4962 generic.go:334] "Generic (PLEG): container finished" podID="77564a1c-aefc-4caf-86d9-55c2ef795bb7" containerID="24817e71e406b99f371ea185e4598cf17fd11efcec8292441f76250e784ed315" exitCode=0 Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.715896 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zwkjb" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.716018 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zwkjb" event={"ID":"77564a1c-aefc-4caf-86d9-55c2ef795bb7","Type":"ContainerDied","Data":"24817e71e406b99f371ea185e4598cf17fd11efcec8292441f76250e784ed315"} Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.716118 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zwkjb" event={"ID":"77564a1c-aefc-4caf-86d9-55c2ef795bb7","Type":"ContainerDied","Data":"962bcd42638f9814f3627f1f0129094057257d45837d02365bb3acaa7e0e1287"} Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.721099 4962 generic.go:334] "Generic (PLEG): container finished" podID="e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b" containerID="ba06f09e9e64dd347907637b7f6269c868262f4e7992b5119a5d49ae51e79a9b" exitCode=0 Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.721185 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6q5bk" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.721213 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6q5bk" event={"ID":"e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b","Type":"ContainerDied","Data":"ba06f09e9e64dd347907637b7f6269c868262f4e7992b5119a5d49ae51e79a9b"} Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.721295 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6q5bk" event={"ID":"e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b","Type":"ContainerDied","Data":"dd70ef6c640a62edc318879e7e0b88b18026337e7b55ef136a0601bdad9e609c"} Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.727924 4962 generic.go:334] "Generic (PLEG): container finished" podID="ee660135-f5e2-420e-a242-440471e57da2" containerID="c35279cd4b22f47dcf9a72e50cad12450bbbfefd1eb056f72fee9a7b914b6849" exitCode=0 Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.727979 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sxxjg" event={"ID":"ee660135-f5e2-420e-a242-440471e57da2","Type":"ContainerDied","Data":"c35279cd4b22f47dcf9a72e50cad12450bbbfefd1eb056f72fee9a7b914b6849"} Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.728034 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sxxjg" event={"ID":"ee660135-f5e2-420e-a242-440471e57da2","Type":"ContainerDied","Data":"ca9165de15eb6be88321d9382abfd900310af2761fe4b1a318a48f4e2a654377"} Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.728059 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sxxjg" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.742288 4962 scope.go:117] "RemoveContainer" containerID="3f40bf15a59c7c1070b5bd6c7194b1d261fd486ea26ee5fe5a2136eea6b42104" Feb 20 10:00:06 crc kubenswrapper[4962]: E0220 10:00:06.743182 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f40bf15a59c7c1070b5bd6c7194b1d261fd486ea26ee5fe5a2136eea6b42104\": container with ID starting with 3f40bf15a59c7c1070b5bd6c7194b1d261fd486ea26ee5fe5a2136eea6b42104 not found: ID does not exist" containerID="3f40bf15a59c7c1070b5bd6c7194b1d261fd486ea26ee5fe5a2136eea6b42104" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.743226 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f40bf15a59c7c1070b5bd6c7194b1d261fd486ea26ee5fe5a2136eea6b42104"} err="failed to get container status \"3f40bf15a59c7c1070b5bd6c7194b1d261fd486ea26ee5fe5a2136eea6b42104\": rpc error: code = NotFound desc = could not find container \"3f40bf15a59c7c1070b5bd6c7194b1d261fd486ea26ee5fe5a2136eea6b42104\": container with ID starting with 3f40bf15a59c7c1070b5bd6c7194b1d261fd486ea26ee5fe5a2136eea6b42104 not found: ID does not exist" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.743250 4962 scope.go:117] "RemoveContainer" containerID="4ce202551bb2eddc69f068b448a322418266ac5a3082344b5fcff013fbcae786" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.756347 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-m7z5r"] Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.756395 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/marketplace-operator-79b997595-m7z5r"] Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.767998 4962 scope.go:117] "RemoveContainer" containerID="cb6843df783564d40f39b05096d7a2c2fbad16d3934305a8cb72a8d5cbd3114a" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.768630 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-x4hxs"] Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.784757 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-x4hxs"] Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.790139 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6q5bk"] Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.803069 4962 scope.go:117] "RemoveContainer" containerID="4e161f889069c1127c3dd292fd14a5054462adfcd79d93a357d1211fdfa99ddf" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.809062 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-6q5bk"] Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.821259 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zwkjb"] Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.824065 4962 scope.go:117] "RemoveContainer" containerID="4ce202551bb2eddc69f068b448a322418266ac5a3082344b5fcff013fbcae786" Feb 20 10:00:06 crc kubenswrapper[4962]: E0220 10:00:06.824662 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ce202551bb2eddc69f068b448a322418266ac5a3082344b5fcff013fbcae786\": container with ID starting with 4ce202551bb2eddc69f068b448a322418266ac5a3082344b5fcff013fbcae786 not found: ID does not exist" containerID="4ce202551bb2eddc69f068b448a322418266ac5a3082344b5fcff013fbcae786" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.824746 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ce202551bb2eddc69f068b448a322418266ac5a3082344b5fcff013fbcae786"} err="failed to get container status \"4ce202551bb2eddc69f068b448a322418266ac5a3082344b5fcff013fbcae786\": rpc error: code = NotFound desc = could not find container \"4ce202551bb2eddc69f068b448a322418266ac5a3082344b5fcff013fbcae786\": container with ID starting with 4ce202551bb2eddc69f068b448a322418266ac5a3082344b5fcff013fbcae786 not found: ID does not exist" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.824783 4962 scope.go:117] "RemoveContainer" containerID="cb6843df783564d40f39b05096d7a2c2fbad16d3934305a8cb72a8d5cbd3114a" Feb 20 10:00:06 crc kubenswrapper[4962]: E0220 10:00:06.825079 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb6843df783564d40f39b05096d7a2c2fbad16d3934305a8cb72a8d5cbd3114a\": container with ID starting with cb6843df783564d40f39b05096d7a2c2fbad16d3934305a8cb72a8d5cbd3114a not found: ID does not exist" containerID="cb6843df783564d40f39b05096d7a2c2fbad16d3934305a8cb72a8d5cbd3114a" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.825099 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb6843df783564d40f39b05096d7a2c2fbad16d3934305a8cb72a8d5cbd3114a"} err="failed to get container status \"cb6843df783564d40f39b05096d7a2c2fbad16d3934305a8cb72a8d5cbd3114a\": rpc error: code = NotFound desc = could not find 
container \"cb6843df783564d40f39b05096d7a2c2fbad16d3934305a8cb72a8d5cbd3114a\": container with ID starting with cb6843df783564d40f39b05096d7a2c2fbad16d3934305a8cb72a8d5cbd3114a not found: ID does not exist" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.825113 4962 scope.go:117] "RemoveContainer" containerID="4e161f889069c1127c3dd292fd14a5054462adfcd79d93a357d1211fdfa99ddf" Feb 20 10:00:06 crc kubenswrapper[4962]: E0220 10:00:06.825389 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e161f889069c1127c3dd292fd14a5054462adfcd79d93a357d1211fdfa99ddf\": container with ID starting with 4e161f889069c1127c3dd292fd14a5054462adfcd79d93a357d1211fdfa99ddf not found: ID does not exist" containerID="4e161f889069c1127c3dd292fd14a5054462adfcd79d93a357d1211fdfa99ddf" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.825551 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e161f889069c1127c3dd292fd14a5054462adfcd79d93a357d1211fdfa99ddf"} err="failed to get container status \"4e161f889069c1127c3dd292fd14a5054462adfcd79d93a357d1211fdfa99ddf\": rpc error: code = NotFound desc = could not find container \"4e161f889069c1127c3dd292fd14a5054462adfcd79d93a357d1211fdfa99ddf\": container with ID starting with 4e161f889069c1127c3dd292fd14a5054462adfcd79d93a357d1211fdfa99ddf not found: ID does not exist" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.825635 4962 scope.go:117] "RemoveContainer" containerID="24817e71e406b99f371ea185e4598cf17fd11efcec8292441f76250e784ed315" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.828648 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zwkjb"] Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.831170 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sxxjg"] Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.836380 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-sxxjg"] Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.842778 4962 scope.go:117] "RemoveContainer" containerID="1a60d207578330039d031b53f48d4fc073b3f6459c7aa8e38e584d0a445150eb" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.869249 4962 scope.go:117] "RemoveContainer" containerID="7bd10cd083ce022229fb704a11e95a4c9966b71b647ed8df13a05a150919c6d1" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.886411 4962 scope.go:117] "RemoveContainer" containerID="24817e71e406b99f371ea185e4598cf17fd11efcec8292441f76250e784ed315" Feb 20 10:00:06 crc kubenswrapper[4962]: E0220 10:00:06.887170 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24817e71e406b99f371ea185e4598cf17fd11efcec8292441f76250e784ed315\": container with ID starting with 24817e71e406b99f371ea185e4598cf17fd11efcec8292441f76250e784ed315 not found: ID does not exist" containerID="24817e71e406b99f371ea185e4598cf17fd11efcec8292441f76250e784ed315" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.887220 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24817e71e406b99f371ea185e4598cf17fd11efcec8292441f76250e784ed315"} err="failed to get container status \"24817e71e406b99f371ea185e4598cf17fd11efcec8292441f76250e784ed315\": rpc error: code = NotFound desc = could not find container 
\"24817e71e406b99f371ea185e4598cf17fd11efcec8292441f76250e784ed315\": container with ID starting with 24817e71e406b99f371ea185e4598cf17fd11efcec8292441f76250e784ed315 not found: ID does not exist" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.887253 4962 scope.go:117] "RemoveContainer" containerID="1a60d207578330039d031b53f48d4fc073b3f6459c7aa8e38e584d0a445150eb" Feb 20 10:00:06 crc kubenswrapper[4962]: E0220 10:00:06.887814 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a60d207578330039d031b53f48d4fc073b3f6459c7aa8e38e584d0a445150eb\": container with ID starting with 1a60d207578330039d031b53f48d4fc073b3f6459c7aa8e38e584d0a445150eb not found: ID does not exist" containerID="1a60d207578330039d031b53f48d4fc073b3f6459c7aa8e38e584d0a445150eb" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.887860 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a60d207578330039d031b53f48d4fc073b3f6459c7aa8e38e584d0a445150eb"} err="failed to get container status \"1a60d207578330039d031b53f48d4fc073b3f6459c7aa8e38e584d0a445150eb\": rpc error: code = NotFound desc = could not find container \"1a60d207578330039d031b53f48d4fc073b3f6459c7aa8e38e584d0a445150eb\": container with ID starting with 1a60d207578330039d031b53f48d4fc073b3f6459c7aa8e38e584d0a445150eb not found: ID does not exist" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.887893 4962 scope.go:117] "RemoveContainer" containerID="7bd10cd083ce022229fb704a11e95a4c9966b71b647ed8df13a05a150919c6d1" Feb 20 10:00:06 crc kubenswrapper[4962]: E0220 10:00:06.888193 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7bd10cd083ce022229fb704a11e95a4c9966b71b647ed8df13a05a150919c6d1\": container with ID starting with 7bd10cd083ce022229fb704a11e95a4c9966b71b647ed8df13a05a150919c6d1 not found: ID does not exist" containerID="7bd10cd083ce022229fb704a11e95a4c9966b71b647ed8df13a05a150919c6d1" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.888226 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bd10cd083ce022229fb704a11e95a4c9966b71b647ed8df13a05a150919c6d1"} err="failed to get container status \"7bd10cd083ce022229fb704a11e95a4c9966b71b647ed8df13a05a150919c6d1\": rpc error: code = NotFound desc = could not find container \"7bd10cd083ce022229fb704a11e95a4c9966b71b647ed8df13a05a150919c6d1\": container with ID starting with 7bd10cd083ce022229fb704a11e95a4c9966b71b647ed8df13a05a150919c6d1 not found: ID does not exist" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.888245 4962 scope.go:117] "RemoveContainer" containerID="ba06f09e9e64dd347907637b7f6269c868262f4e7992b5119a5d49ae51e79a9b" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.901986 4962 scope.go:117] "RemoveContainer" containerID="61a0c395420db4427edd1d39e79932f951ebf822d41da5a31f0ddfcbb34a4c3d" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.922053 4962 scope.go:117] "RemoveContainer" containerID="9dde5da077306b26cf751b46204885c0add8e973f2983ddbba0abd21dba3f82c" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.944668 4962 scope.go:117] "RemoveContainer" containerID="ba06f09e9e64dd347907637b7f6269c868262f4e7992b5119a5d49ae51e79a9b" Feb 20 10:00:06 crc kubenswrapper[4962]: E0220 10:00:06.945129 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = 
could not find container \"ba06f09e9e64dd347907637b7f6269c868262f4e7992b5119a5d49ae51e79a9b\": container with ID starting with ba06f09e9e64dd347907637b7f6269c868262f4e7992b5119a5d49ae51e79a9b not found: ID does not exist" containerID="ba06f09e9e64dd347907637b7f6269c868262f4e7992b5119a5d49ae51e79a9b" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.945161 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba06f09e9e64dd347907637b7f6269c868262f4e7992b5119a5d49ae51e79a9b"} err="failed to get container status \"ba06f09e9e64dd347907637b7f6269c868262f4e7992b5119a5d49ae51e79a9b\": rpc error: code = NotFound desc = could not find container \"ba06f09e9e64dd347907637b7f6269c868262f4e7992b5119a5d49ae51e79a9b\": container with ID starting with ba06f09e9e64dd347907637b7f6269c868262f4e7992b5119a5d49ae51e79a9b not found: ID does not exist" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.945184 4962 scope.go:117] "RemoveContainer" containerID="61a0c395420db4427edd1d39e79932f951ebf822d41da5a31f0ddfcbb34a4c3d" Feb 20 10:00:06 crc kubenswrapper[4962]: E0220 10:00:06.946102 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61a0c395420db4427edd1d39e79932f951ebf822d41da5a31f0ddfcbb34a4c3d\": container with ID starting with 61a0c395420db4427edd1d39e79932f951ebf822d41da5a31f0ddfcbb34a4c3d not found: ID does not exist" containerID="61a0c395420db4427edd1d39e79932f951ebf822d41da5a31f0ddfcbb34a4c3d" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.946135 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61a0c395420db4427edd1d39e79932f951ebf822d41da5a31f0ddfcbb34a4c3d"} err="failed to get container status \"61a0c395420db4427edd1d39e79932f951ebf822d41da5a31f0ddfcbb34a4c3d\": rpc error: code = NotFound desc = could not find container \"61a0c395420db4427edd1d39e79932f951ebf822d41da5a31f0ddfcbb34a4c3d\": container with ID starting with 61a0c395420db4427edd1d39e79932f951ebf822d41da5a31f0ddfcbb34a4c3d not found: ID does not exist" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.946158 4962 scope.go:117] "RemoveContainer" containerID="9dde5da077306b26cf751b46204885c0add8e973f2983ddbba0abd21dba3f82c" Feb 20 10:00:06 crc kubenswrapper[4962]: E0220 10:00:06.946583 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9dde5da077306b26cf751b46204885c0add8e973f2983ddbba0abd21dba3f82c\": container with ID starting with 9dde5da077306b26cf751b46204885c0add8e973f2983ddbba0abd21dba3f82c not found: ID does not exist" containerID="9dde5da077306b26cf751b46204885c0add8e973f2983ddbba0abd21dba3f82c" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.946713 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9dde5da077306b26cf751b46204885c0add8e973f2983ddbba0abd21dba3f82c"} err="failed to get container status \"9dde5da077306b26cf751b46204885c0add8e973f2983ddbba0abd21dba3f82c\": rpc error: code = NotFound desc = could not find container \"9dde5da077306b26cf751b46204885c0add8e973f2983ddbba0abd21dba3f82c\": container with ID starting with 9dde5da077306b26cf751b46204885c0add8e973f2983ddbba0abd21dba3f82c not found: ID does not exist" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.946768 4962 scope.go:117] "RemoveContainer" containerID="c35279cd4b22f47dcf9a72e50cad12450bbbfefd1eb056f72fee9a7b914b6849" Feb 20 
10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.960782 4962 scope.go:117] "RemoveContainer" containerID="8dd9df9f917ccc178d3800d351c7071cb535c428d75ebffc1bc016597dd217d8" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.980506 4962 scope.go:117] "RemoveContainer" containerID="eb59bf233daf0a387876d536e9cf576dcc0d473830269c4e813b1ac561a0017c" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.995687 4962 scope.go:117] "RemoveContainer" containerID="c35279cd4b22f47dcf9a72e50cad12450bbbfefd1eb056f72fee9a7b914b6849" Feb 20 10:00:06 crc kubenswrapper[4962]: E0220 10:00:06.996213 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c35279cd4b22f47dcf9a72e50cad12450bbbfefd1eb056f72fee9a7b914b6849\": container with ID starting with c35279cd4b22f47dcf9a72e50cad12450bbbfefd1eb056f72fee9a7b914b6849 not found: ID does not exist" containerID="c35279cd4b22f47dcf9a72e50cad12450bbbfefd1eb056f72fee9a7b914b6849" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.996260 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c35279cd4b22f47dcf9a72e50cad12450bbbfefd1eb056f72fee9a7b914b6849"} err="failed to get container status \"c35279cd4b22f47dcf9a72e50cad12450bbbfefd1eb056f72fee9a7b914b6849\": rpc error: code = NotFound desc = could not find container \"c35279cd4b22f47dcf9a72e50cad12450bbbfefd1eb056f72fee9a7b914b6849\": container with ID starting with c35279cd4b22f47dcf9a72e50cad12450bbbfefd1eb056f72fee9a7b914b6849 not found: ID does not exist" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.996293 4962 scope.go:117] "RemoveContainer" containerID="8dd9df9f917ccc178d3800d351c7071cb535c428d75ebffc1bc016597dd217d8" Feb 20 10:00:06 crc kubenswrapper[4962]: E0220 10:00:06.996755 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8dd9df9f917ccc178d3800d351c7071cb535c428d75ebffc1bc016597dd217d8\": container with ID starting with 8dd9df9f917ccc178d3800d351c7071cb535c428d75ebffc1bc016597dd217d8 not found: ID does not exist" containerID="8dd9df9f917ccc178d3800d351c7071cb535c428d75ebffc1bc016597dd217d8" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.996777 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8dd9df9f917ccc178d3800d351c7071cb535c428d75ebffc1bc016597dd217d8"} err="failed to get container status \"8dd9df9f917ccc178d3800d351c7071cb535c428d75ebffc1bc016597dd217d8\": rpc error: code = NotFound desc = could not find container \"8dd9df9f917ccc178d3800d351c7071cb535c428d75ebffc1bc016597dd217d8\": container with ID starting with 8dd9df9f917ccc178d3800d351c7071cb535c428d75ebffc1bc016597dd217d8 not found: ID does not exist" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.996796 4962 scope.go:117] "RemoveContainer" containerID="eb59bf233daf0a387876d536e9cf576dcc0d473830269c4e813b1ac561a0017c" Feb 20 10:00:06 crc kubenswrapper[4962]: E0220 10:00:06.997922 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb59bf233daf0a387876d536e9cf576dcc0d473830269c4e813b1ac561a0017c\": container with ID starting with eb59bf233daf0a387876d536e9cf576dcc0d473830269c4e813b1ac561a0017c not found: ID does not exist" containerID="eb59bf233daf0a387876d536e9cf576dcc0d473830269c4e813b1ac561a0017c" Feb 20 10:00:06 crc kubenswrapper[4962]: I0220 10:00:06.997962 4962 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb59bf233daf0a387876d536e9cf576dcc0d473830269c4e813b1ac561a0017c"} err="failed to get container status \"eb59bf233daf0a387876d536e9cf576dcc0d473830269c4e813b1ac561a0017c\": rpc error: code = NotFound desc = could not find container \"eb59bf233daf0a387876d536e9cf576dcc0d473830269c4e813b1ac561a0017c\": container with ID starting with eb59bf233daf0a387876d536e9cf576dcc0d473830269c4e813b1ac561a0017c not found: ID does not exist" Feb 20 10:00:07 crc kubenswrapper[4962]: I0220 10:00:07.146732 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f414667-865d-4c89-b470-50f61a11b60e" path="/var/lib/kubelet/pods/2f414667-865d-4c89-b470-50f61a11b60e/volumes" Feb 20 10:00:07 crc kubenswrapper[4962]: I0220 10:00:07.147734 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77564a1c-aefc-4caf-86d9-55c2ef795bb7" path="/var/lib/kubelet/pods/77564a1c-aefc-4caf-86d9-55c2ef795bb7/volumes" Feb 20 10:00:07 crc kubenswrapper[4962]: I0220 10:00:07.148381 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfd57a5c-0892-46a0-8005-0a8f70c146fd" path="/var/lib/kubelet/pods/bfd57a5c-0892-46a0-8005-0a8f70c146fd/volumes" Feb 20 10:00:07 crc kubenswrapper[4962]: I0220 10:00:07.149762 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b" path="/var/lib/kubelet/pods/e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b/volumes" Feb 20 10:00:07 crc kubenswrapper[4962]: I0220 10:00:07.150389 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee660135-f5e2-420e-a242-440471e57da2" path="/var/lib/kubelet/pods/ee660135-f5e2-420e-a242-440471e57da2/volumes" Feb 20 10:00:08 crc kubenswrapper[4962]: I0220 10:00:08.898854 4962 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Feb 20 10:00:12 crc kubenswrapper[4962]: I0220 10:00:12.770995 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Feb 20 10:00:12 crc kubenswrapper[4962]: I0220 10:00:12.774501 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 20 10:00:12 crc kubenswrapper[4962]: I0220 10:00:12.774629 4962 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="9338aa40f3b1f6e7f18273465539b91c996c7687cd237637ca783e0d5f9e51a5" exitCode=137 Feb 20 10:00:12 crc kubenswrapper[4962]: I0220 10:00:12.774664 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"9338aa40f3b1f6e7f18273465539b91c996c7687cd237637ca783e0d5f9e51a5"} Feb 20 10:00:12 crc kubenswrapper[4962]: I0220 10:00:12.774809 4962 scope.go:117] "RemoveContainer" containerID="1f4690722fa343ac9bbe83f1591ce49040d2b6cbe966fc31757944ba3befbbce" Feb 20 10:00:13 crc kubenswrapper[4962]: I0220 10:00:13.782835 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Feb 20 10:00:13 crc kubenswrapper[4962]: I0220 10:00:13.784095 4962 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"41526e761d4a88d1fb89f92c4de23bad31a37ec70f99f184c60c52623f2183f3"} Feb 20 10:00:16 crc kubenswrapper[4962]: I0220 10:00:16.570425 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 20 10:00:22 crc kubenswrapper[4962]: I0220 10:00:22.458070 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 20 10:00:22 crc kubenswrapper[4962]: I0220 10:00:22.462426 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 20 10:00:26 crc kubenswrapper[4962]: I0220 10:00:26.576964 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 20 10:00:30 crc kubenswrapper[4962]: I0220 10:00:30.850065 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-szbwm"] Feb 20 10:00:30 crc kubenswrapper[4962]: I0220 10:00:30.850841 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-szbwm" podUID="6adbe475-48f9-4ba3-82bd-b36bcd939168" containerName="controller-manager" containerID="cri-o://d232e4bc30df660af52975b6ea9fe11ef1630883cc63947c2efb71dc69dd8c5f" gracePeriod=30 Feb 20 10:00:30 crc kubenswrapper[4962]: I0220 10:00:30.853150 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-758rq"] Feb 20 10:00:30 crc kubenswrapper[4962]: I0220 10:00:30.853396 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-758rq" podUID="8da2028c-f296-4f44-b010-b3abec9f6b98" containerName="route-controller-manager" containerID="cri-o://1ad20ff5602887957aea361e6c7e57f3e5d8544a1efbf9eeddb8e6ace236468c" gracePeriod=30 Feb 20 10:00:30 crc kubenswrapper[4962]: I0220 10:00:30.863330 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-mzhb4"] Feb 20 10:00:30 crc kubenswrapper[4962]: E0220 10:00:30.863659 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b" containerName="registry-server" Feb 20 10:00:30 crc kubenswrapper[4962]: I0220 10:00:30.863678 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b" containerName="registry-server" Feb 20 10:00:30 crc kubenswrapper[4962]: E0220 10:00:30.863696 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee660135-f5e2-420e-a242-440471e57da2" containerName="extract-utilities" Feb 20 10:00:30 crc kubenswrapper[4962]: I0220 10:00:30.863704 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee660135-f5e2-420e-a242-440471e57da2" containerName="extract-utilities" Feb 20 10:00:30 crc kubenswrapper[4962]: E0220 10:00:30.863717 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b" containerName="extract-content" Feb 20 10:00:30 crc kubenswrapper[4962]: I0220 10:00:30.863725 4962 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b" containerName="extract-content" Feb 20 10:00:30 crc kubenswrapper[4962]: E0220 10:00:30.863734 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f414667-865d-4c89-b470-50f61a11b60e" containerName="extract-content" Feb 20 10:00:30 crc kubenswrapper[4962]: I0220 10:00:30.863741 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f414667-865d-4c89-b470-50f61a11b60e" containerName="extract-content" Feb 20 10:00:30 crc kubenswrapper[4962]: E0220 10:00:30.863750 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfd57a5c-0892-46a0-8005-0a8f70c146fd" containerName="marketplace-operator" Feb 20 10:00:30 crc kubenswrapper[4962]: I0220 10:00:30.863760 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfd57a5c-0892-46a0-8005-0a8f70c146fd" containerName="marketplace-operator" Feb 20 10:00:30 crc kubenswrapper[4962]: E0220 10:00:30.863770 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b" containerName="extract-utilities" Feb 20 10:00:30 crc kubenswrapper[4962]: I0220 10:00:30.863779 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b" containerName="extract-utilities" Feb 20 10:00:30 crc kubenswrapper[4962]: E0220 10:00:30.863788 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77564a1c-aefc-4caf-86d9-55c2ef795bb7" containerName="registry-server" Feb 20 10:00:30 crc kubenswrapper[4962]: I0220 10:00:30.863796 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="77564a1c-aefc-4caf-86d9-55c2ef795bb7" containerName="registry-server" Feb 20 10:00:30 crc kubenswrapper[4962]: E0220 10:00:30.863806 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f414667-865d-4c89-b470-50f61a11b60e" containerName="registry-server" Feb 20 10:00:30 crc kubenswrapper[4962]: I0220 10:00:30.863813 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f414667-865d-4c89-b470-50f61a11b60e" containerName="registry-server" Feb 20 10:00:30 crc kubenswrapper[4962]: E0220 10:00:30.863821 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 20 10:00:30 crc kubenswrapper[4962]: I0220 10:00:30.863831 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 20 10:00:30 crc kubenswrapper[4962]: E0220 10:00:30.863840 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee660135-f5e2-420e-a242-440471e57da2" containerName="registry-server" Feb 20 10:00:30 crc kubenswrapper[4962]: I0220 10:00:30.863847 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee660135-f5e2-420e-a242-440471e57da2" containerName="registry-server" Feb 20 10:00:30 crc kubenswrapper[4962]: E0220 10:00:30.863857 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f414667-865d-4c89-b470-50f61a11b60e" containerName="extract-utilities" Feb 20 10:00:30 crc kubenswrapper[4962]: I0220 10:00:30.863864 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f414667-865d-4c89-b470-50f61a11b60e" containerName="extract-utilities" Feb 20 10:00:30 crc kubenswrapper[4962]: E0220 10:00:30.863873 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77564a1c-aefc-4caf-86d9-55c2ef795bb7" containerName="extract-utilities" Feb 20 10:00:30 crc kubenswrapper[4962]: I0220 10:00:30.863879 
4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="77564a1c-aefc-4caf-86d9-55c2ef795bb7" containerName="extract-utilities" Feb 20 10:00:30 crc kubenswrapper[4962]: E0220 10:00:30.863892 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77564a1c-aefc-4caf-86d9-55c2ef795bb7" containerName="extract-content" Feb 20 10:00:30 crc kubenswrapper[4962]: I0220 10:00:30.863898 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="77564a1c-aefc-4caf-86d9-55c2ef795bb7" containerName="extract-content" Feb 20 10:00:30 crc kubenswrapper[4962]: E0220 10:00:30.863908 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55ba5b4b-9a58-40e7-a3a3-00764477f5a9" containerName="installer" Feb 20 10:00:30 crc kubenswrapper[4962]: I0220 10:00:30.863915 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="55ba5b4b-9a58-40e7-a3a3-00764477f5a9" containerName="installer" Feb 20 10:00:30 crc kubenswrapper[4962]: E0220 10:00:30.863924 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee660135-f5e2-420e-a242-440471e57da2" containerName="extract-content" Feb 20 10:00:30 crc kubenswrapper[4962]: I0220 10:00:30.863932 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee660135-f5e2-420e-a242-440471e57da2" containerName="extract-content" Feb 20 10:00:30 crc kubenswrapper[4962]: I0220 10:00:30.864042 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="77564a1c-aefc-4caf-86d9-55c2ef795bb7" containerName="registry-server" Feb 20 10:00:30 crc kubenswrapper[4962]: I0220 10:00:30.864058 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f414667-865d-4c89-b470-50f61a11b60e" containerName="registry-server" Feb 20 10:00:30 crc kubenswrapper[4962]: I0220 10:00:30.864072 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="55ba5b4b-9a58-40e7-a3a3-00764477f5a9" containerName="installer" Feb 20 10:00:30 crc kubenswrapper[4962]: I0220 10:00:30.864082 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfd57a5c-0892-46a0-8005-0a8f70c146fd" containerName="marketplace-operator" Feb 20 10:00:30 crc kubenswrapper[4962]: I0220 10:00:30.864095 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3fc7462-bfd9-4a88-ba4c-11d992d0ab3b" containerName="registry-server" Feb 20 10:00:30 crc kubenswrapper[4962]: I0220 10:00:30.864103 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee660135-f5e2-420e-a242-440471e57da2" containerName="registry-server" Feb 20 10:00:30 crc kubenswrapper[4962]: I0220 10:00:30.864113 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 20 10:00:30 crc kubenswrapper[4962]: I0220 10:00:30.864649 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-mzhb4" Feb 20 10:00:30 crc kubenswrapper[4962]: I0220 10:00:30.867298 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 20 10:00:30 crc kubenswrapper[4962]: I0220 10:00:30.868330 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 20 10:00:30 crc kubenswrapper[4962]: I0220 10:00:30.868520 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 20 10:00:30 crc kubenswrapper[4962]: I0220 10:00:30.869248 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 20 10:00:30 crc kubenswrapper[4962]: I0220 10:00:30.880728 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-mzhb4"] Feb 20 10:00:30 crc kubenswrapper[4962]: I0220 10:00:30.883056 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 20 10:00:30 crc kubenswrapper[4962]: I0220 10:00:30.953888 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/34e2f7a3-366d-4817-a502-720b5f9a782e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-mzhb4\" (UID: \"34e2f7a3-366d-4817-a502-720b5f9a782e\") " pod="openshift-marketplace/marketplace-operator-79b997595-mzhb4" Feb 20 10:00:30 crc kubenswrapper[4962]: I0220 10:00:30.953967 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcj67\" (UniqueName: \"kubernetes.io/projected/34e2f7a3-366d-4817-a502-720b5f9a782e-kube-api-access-jcj67\") pod \"marketplace-operator-79b997595-mzhb4\" (UID: \"34e2f7a3-366d-4817-a502-720b5f9a782e\") " pod="openshift-marketplace/marketplace-operator-79b997595-mzhb4" Feb 20 10:00:30 crc kubenswrapper[4962]: I0220 10:00:30.954169 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/34e2f7a3-366d-4817-a502-720b5f9a782e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-mzhb4\" (UID: \"34e2f7a3-366d-4817-a502-720b5f9a782e\") " pod="openshift-marketplace/marketplace-operator-79b997595-mzhb4" Feb 20 10:00:30 crc kubenswrapper[4962]: I0220 10:00:30.989433 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526360-m2h52"] Feb 20 10:00:30 crc kubenswrapper[4962]: I0220 10:00:30.990204 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526360-m2h52" Feb 20 10:00:30 crc kubenswrapper[4962]: I0220 10:00:30.993486 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 20 10:00:30 crc kubenswrapper[4962]: I0220 10:00:30.993690 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 20 10:00:30 crc kubenswrapper[4962]: I0220 10:00:30.998103 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526360-m2h52"] Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.055027 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/34e2f7a3-366d-4817-a502-720b5f9a782e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-mzhb4\" (UID: \"34e2f7a3-366d-4817-a502-720b5f9a782e\") " pod="openshift-marketplace/marketplace-operator-79b997595-mzhb4" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.055088 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/34e2f7a3-366d-4817-a502-720b5f9a782e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-mzhb4\" (UID: \"34e2f7a3-366d-4817-a502-720b5f9a782e\") " pod="openshift-marketplace/marketplace-operator-79b997595-mzhb4" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.055133 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcj67\" (UniqueName: \"kubernetes.io/projected/34e2f7a3-366d-4817-a502-720b5f9a782e-kube-api-access-jcj67\") pod \"marketplace-operator-79b997595-mzhb4\" (UID: \"34e2f7a3-366d-4817-a502-720b5f9a782e\") " pod="openshift-marketplace/marketplace-operator-79b997595-mzhb4" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.056696 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/34e2f7a3-366d-4817-a502-720b5f9a782e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-mzhb4\" (UID: \"34e2f7a3-366d-4817-a502-720b5f9a782e\") " pod="openshift-marketplace/marketplace-operator-79b997595-mzhb4" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.066732 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/34e2f7a3-366d-4817-a502-720b5f9a782e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-mzhb4\" (UID: \"34e2f7a3-366d-4817-a502-720b5f9a782e\") " pod="openshift-marketplace/marketplace-operator-79b997595-mzhb4" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.080886 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcj67\" (UniqueName: \"kubernetes.io/projected/34e2f7a3-366d-4817-a502-720b5f9a782e-kube-api-access-jcj67\") pod \"marketplace-operator-79b997595-mzhb4\" (UID: \"34e2f7a3-366d-4817-a502-720b5f9a782e\") " pod="openshift-marketplace/marketplace-operator-79b997595-mzhb4" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.156102 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-279kj\" (UniqueName: 
\"kubernetes.io/projected/d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e-kube-api-access-279kj\") pod \"collect-profiles-29526360-m2h52\" (UID: \"d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526360-m2h52" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.156151 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e-config-volume\") pod \"collect-profiles-29526360-m2h52\" (UID: \"d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526360-m2h52" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.156176 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e-secret-volume\") pod \"collect-profiles-29526360-m2h52\" (UID: \"d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526360-m2h52" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.192454 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-mzhb4" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.259491 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-279kj\" (UniqueName: \"kubernetes.io/projected/d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e-kube-api-access-279kj\") pod \"collect-profiles-29526360-m2h52\" (UID: \"d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526360-m2h52" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.259551 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e-config-volume\") pod \"collect-profiles-29526360-m2h52\" (UID: \"d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526360-m2h52" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.259620 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e-secret-volume\") pod \"collect-profiles-29526360-m2h52\" (UID: \"d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526360-m2h52" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.266689 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e-secret-volume\") pod \"collect-profiles-29526360-m2h52\" (UID: \"d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526360-m2h52" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.267802 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e-config-volume\") pod \"collect-profiles-29526360-m2h52\" (UID: \"d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526360-m2h52" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.290053 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-279kj\" (UniqueName: \"kubernetes.io/projected/d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e-kube-api-access-279kj\") pod \"collect-profiles-29526360-m2h52\" (UID: \"d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526360-m2h52" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.329374 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-szbwm" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.330999 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526360-m2h52" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.383445 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-758rq" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.461476 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6adbe475-48f9-4ba3-82bd-b36bcd939168-config\") pod \"6adbe475-48f9-4ba3-82bd-b36bcd939168\" (UID: \"6adbe475-48f9-4ba3-82bd-b36bcd939168\") " Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.461602 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6adbe475-48f9-4ba3-82bd-b36bcd939168-serving-cert\") pod \"6adbe475-48f9-4ba3-82bd-b36bcd939168\" (UID: \"6adbe475-48f9-4ba3-82bd-b36bcd939168\") " Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.461644 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6adbe475-48f9-4ba3-82bd-b36bcd939168-client-ca\") pod \"6adbe475-48f9-4ba3-82bd-b36bcd939168\" (UID: \"6adbe475-48f9-4ba3-82bd-b36bcd939168\") " Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.461671 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8da2028c-f296-4f44-b010-b3abec9f6b98-config\") pod \"8da2028c-f296-4f44-b010-b3abec9f6b98\" (UID: \"8da2028c-f296-4f44-b010-b3abec9f6b98\") " Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.461708 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-945rx\" (UniqueName: \"kubernetes.io/projected/8da2028c-f296-4f44-b010-b3abec9f6b98-kube-api-access-945rx\") pod \"8da2028c-f296-4f44-b010-b3abec9f6b98\" (UID: \"8da2028c-f296-4f44-b010-b3abec9f6b98\") " Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.461728 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8da2028c-f296-4f44-b010-b3abec9f6b98-client-ca\") pod \"8da2028c-f296-4f44-b010-b3abec9f6b98\" (UID: \"8da2028c-f296-4f44-b010-b3abec9f6b98\") " Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.461753 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6adbe475-48f9-4ba3-82bd-b36bcd939168-proxy-ca-bundles\") pod \"6adbe475-48f9-4ba3-82bd-b36bcd939168\" (UID: \"6adbe475-48f9-4ba3-82bd-b36bcd939168\") " Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.461841 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-zqhd4\" (UniqueName: \"kubernetes.io/projected/6adbe475-48f9-4ba3-82bd-b36bcd939168-kube-api-access-zqhd4\") pod \"6adbe475-48f9-4ba3-82bd-b36bcd939168\" (UID: \"6adbe475-48f9-4ba3-82bd-b36bcd939168\") " Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.461871 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8da2028c-f296-4f44-b010-b3abec9f6b98-serving-cert\") pod \"8da2028c-f296-4f44-b010-b3abec9f6b98\" (UID: \"8da2028c-f296-4f44-b010-b3abec9f6b98\") " Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.462955 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6adbe475-48f9-4ba3-82bd-b36bcd939168-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "6adbe475-48f9-4ba3-82bd-b36bcd939168" (UID: "6adbe475-48f9-4ba3-82bd-b36bcd939168"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.463875 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8da2028c-f296-4f44-b010-b3abec9f6b98-client-ca" (OuterVolumeSpecName: "client-ca") pod "8da2028c-f296-4f44-b010-b3abec9f6b98" (UID: "8da2028c-f296-4f44-b010-b3abec9f6b98"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.464268 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6adbe475-48f9-4ba3-82bd-b36bcd939168-client-ca" (OuterVolumeSpecName: "client-ca") pod "6adbe475-48f9-4ba3-82bd-b36bcd939168" (UID: "6adbe475-48f9-4ba3-82bd-b36bcd939168"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.464443 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6adbe475-48f9-4ba3-82bd-b36bcd939168-config" (OuterVolumeSpecName: "config") pod "6adbe475-48f9-4ba3-82bd-b36bcd939168" (UID: "6adbe475-48f9-4ba3-82bd-b36bcd939168"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.465713 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8da2028c-f296-4f44-b010-b3abec9f6b98-kube-api-access-945rx" (OuterVolumeSpecName: "kube-api-access-945rx") pod "8da2028c-f296-4f44-b010-b3abec9f6b98" (UID: "8da2028c-f296-4f44-b010-b3abec9f6b98"). InnerVolumeSpecName "kube-api-access-945rx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.466881 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6adbe475-48f9-4ba3-82bd-b36bcd939168-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6adbe475-48f9-4ba3-82bd-b36bcd939168" (UID: "6adbe475-48f9-4ba3-82bd-b36bcd939168"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.467835 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8da2028c-f296-4f44-b010-b3abec9f6b98-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8da2028c-f296-4f44-b010-b3abec9f6b98" (UID: "8da2028c-f296-4f44-b010-b3abec9f6b98"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.468061 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6adbe475-48f9-4ba3-82bd-b36bcd939168-kube-api-access-zqhd4" (OuterVolumeSpecName: "kube-api-access-zqhd4") pod "6adbe475-48f9-4ba3-82bd-b36bcd939168" (UID: "6adbe475-48f9-4ba3-82bd-b36bcd939168"). InnerVolumeSpecName "kube-api-access-zqhd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.470819 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8da2028c-f296-4f44-b010-b3abec9f6b98-config" (OuterVolumeSpecName: "config") pod "8da2028c-f296-4f44-b010-b3abec9f6b98" (UID: "8da2028c-f296-4f44-b010-b3abec9f6b98"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.564720 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-945rx\" (UniqueName: \"kubernetes.io/projected/8da2028c-f296-4f44-b010-b3abec9f6b98-kube-api-access-945rx\") on node \"crc\" DevicePath \"\"" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.565151 4962 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8da2028c-f296-4f44-b010-b3abec9f6b98-client-ca\") on node \"crc\" DevicePath \"\"" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.565166 4962 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6adbe475-48f9-4ba3-82bd-b36bcd939168-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.565176 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zqhd4\" (UniqueName: \"kubernetes.io/projected/6adbe475-48f9-4ba3-82bd-b36bcd939168-kube-api-access-zqhd4\") on node \"crc\" DevicePath \"\"" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.565186 4962 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8da2028c-f296-4f44-b010-b3abec9f6b98-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.565196 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6adbe475-48f9-4ba3-82bd-b36bcd939168-config\") on node \"crc\" DevicePath \"\"" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.565207 4962 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6adbe475-48f9-4ba3-82bd-b36bcd939168-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.565217 4962 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6adbe475-48f9-4ba3-82bd-b36bcd939168-client-ca\") on node \"crc\" DevicePath \"\"" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.565226 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8da2028c-f296-4f44-b010-b3abec9f6b98-config\") on node \"crc\" DevicePath \"\"" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.584987 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-mzhb4"] Feb 20 10:00:31 crc 
kubenswrapper[4962]: I0220 10:00:31.630753 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526360-m2h52"] Feb 20 10:00:31 crc kubenswrapper[4962]: W0220 10:00:31.651715 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4d2cbc3_9bc4_4270_9d26_66c3e9189f8e.slice/crio-68941433de598e4b48b623f275a057620d1331d86c5d69b02bb668abde79b1fd WatchSource:0}: Error finding container 68941433de598e4b48b623f275a057620d1331d86c5d69b02bb668abde79b1fd: Status 404 returned error can't find the container with id 68941433de598e4b48b623f275a057620d1331d86c5d69b02bb668abde79b1fd Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.928791 4962 generic.go:334] "Generic (PLEG): container finished" podID="8da2028c-f296-4f44-b010-b3abec9f6b98" containerID="1ad20ff5602887957aea361e6c7e57f3e5d8544a1efbf9eeddb8e6ace236468c" exitCode=0 Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.928849 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-758rq" event={"ID":"8da2028c-f296-4f44-b010-b3abec9f6b98","Type":"ContainerDied","Data":"1ad20ff5602887957aea361e6c7e57f3e5d8544a1efbf9eeddb8e6ace236468c"} Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.930117 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-758rq" event={"ID":"8da2028c-f296-4f44-b010-b3abec9f6b98","Type":"ContainerDied","Data":"c9ca7261143890db86b7247b8197f46263302fc4c677314a7e1a1eadf9f9acf2"} Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.928870 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-758rq" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.930201 4962 scope.go:117] "RemoveContainer" containerID="1ad20ff5602887957aea361e6c7e57f3e5d8544a1efbf9eeddb8e6ace236468c" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.931452 4962 generic.go:334] "Generic (PLEG): container finished" podID="d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e" containerID="9e4ff8bca8b9c2f6e4f08722be3898de4e9890a93fcdc65b9b078dd1d1fbdae2" exitCode=0 Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.931560 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526360-m2h52" event={"ID":"d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e","Type":"ContainerDied","Data":"9e4ff8bca8b9c2f6e4f08722be3898de4e9890a93fcdc65b9b078dd1d1fbdae2"} Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.931650 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526360-m2h52" event={"ID":"d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e","Type":"ContainerStarted","Data":"68941433de598e4b48b623f275a057620d1331d86c5d69b02bb668abde79b1fd"} Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.935569 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-mzhb4" event={"ID":"34e2f7a3-366d-4817-a502-720b5f9a782e","Type":"ContainerStarted","Data":"b90b44b5bb302c758985e233e37a6262525fb60ab2dc8f60a3090a1ae4aed5b5"} Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.935611 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-mzhb4" event={"ID":"34e2f7a3-366d-4817-a502-720b5f9a782e","Type":"ContainerStarted","Data":"e29275785e3fcebab481819ca8f24356f98fcfcafcdf2365bd92827c75e2546a"} Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.936157 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-mzhb4" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.937891 4962 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-mzhb4 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.57:8080/healthz\": dial tcp 10.217.0.57:8080: connect: connection refused" start-of-body= Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.937947 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-mzhb4" podUID="34e2f7a3-366d-4817-a502-720b5f9a782e" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.57:8080/healthz\": dial tcp 10.217.0.57:8080: connect: connection refused" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.941442 4962 generic.go:334] "Generic (PLEG): container finished" podID="6adbe475-48f9-4ba3-82bd-b36bcd939168" containerID="d232e4bc30df660af52975b6ea9fe11ef1630883cc63947c2efb71dc69dd8c5f" exitCode=0 Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.941509 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-szbwm" event={"ID":"6adbe475-48f9-4ba3-82bd-b36bcd939168","Type":"ContainerDied","Data":"d232e4bc30df660af52975b6ea9fe11ef1630883cc63947c2efb71dc69dd8c5f"} Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.941571 4962 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-controller-manager/controller-manager-879f6c89f-szbwm" event={"ID":"6adbe475-48f9-4ba3-82bd-b36bcd939168","Type":"ContainerDied","Data":"3bea97da1320becf13fecaed38868cc74c4f54c7308979ccb795e3bbe8eacf06"} Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.941655 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-szbwm" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.968228 4962 scope.go:117] "RemoveContainer" containerID="1ad20ff5602887957aea361e6c7e57f3e5d8544a1efbf9eeddb8e6ace236468c" Feb 20 10:00:31 crc kubenswrapper[4962]: E0220 10:00:31.968870 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ad20ff5602887957aea361e6c7e57f3e5d8544a1efbf9eeddb8e6ace236468c\": container with ID starting with 1ad20ff5602887957aea361e6c7e57f3e5d8544a1efbf9eeddb8e6ace236468c not found: ID does not exist" containerID="1ad20ff5602887957aea361e6c7e57f3e5d8544a1efbf9eeddb8e6ace236468c" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.968912 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ad20ff5602887957aea361e6c7e57f3e5d8544a1efbf9eeddb8e6ace236468c"} err="failed to get container status \"1ad20ff5602887957aea361e6c7e57f3e5d8544a1efbf9eeddb8e6ace236468c\": rpc error: code = NotFound desc = could not find container \"1ad20ff5602887957aea361e6c7e57f3e5d8544a1efbf9eeddb8e6ace236468c\": container with ID starting with 1ad20ff5602887957aea361e6c7e57f3e5d8544a1efbf9eeddb8e6ace236468c not found: ID does not exist" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.968943 4962 scope.go:117] "RemoveContainer" containerID="d232e4bc30df660af52975b6ea9fe11ef1630883cc63947c2efb71dc69dd8c5f" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.970859 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-mzhb4" podStartSLOduration=1.970845373 podStartE2EDuration="1.970845373s" podCreationTimestamp="2026-02-20 10:00:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:00:31.967837348 +0000 UTC m=+323.550309204" watchObservedRunningTime="2026-02-20 10:00:31.970845373 +0000 UTC m=+323.553317239" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.983444 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-758rq"] Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.983577 4962 scope.go:117] "RemoveContainer" containerID="d232e4bc30df660af52975b6ea9fe11ef1630883cc63947c2efb71dc69dd8c5f" Feb 20 10:00:31 crc kubenswrapper[4962]: E0220 10:00:31.984160 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d232e4bc30df660af52975b6ea9fe11ef1630883cc63947c2efb71dc69dd8c5f\": container with ID starting with d232e4bc30df660af52975b6ea9fe11ef1630883cc63947c2efb71dc69dd8c5f not found: ID does not exist" containerID="d232e4bc30df660af52975b6ea9fe11ef1630883cc63947c2efb71dc69dd8c5f" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.984235 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d232e4bc30df660af52975b6ea9fe11ef1630883cc63947c2efb71dc69dd8c5f"} err="failed to get container 
status \"d232e4bc30df660af52975b6ea9fe11ef1630883cc63947c2efb71dc69dd8c5f\": rpc error: code = NotFound desc = could not find container \"d232e4bc30df660af52975b6ea9fe11ef1630883cc63947c2efb71dc69dd8c5f\": container with ID starting with d232e4bc30df660af52975b6ea9fe11ef1630883cc63947c2efb71dc69dd8c5f not found: ID does not exist" Feb 20 10:00:31 crc kubenswrapper[4962]: I0220 10:00:31.985635 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-758rq"] Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.000569 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-szbwm"] Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.009359 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-szbwm"] Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.442301 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d74b7c87d-xgxtr"] Feb 20 10:00:32 crc kubenswrapper[4962]: E0220 10:00:32.442753 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8da2028c-f296-4f44-b010-b3abec9f6b98" containerName="route-controller-manager" Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.442781 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="8da2028c-f296-4f44-b010-b3abec9f6b98" containerName="route-controller-manager" Feb 20 10:00:32 crc kubenswrapper[4962]: E0220 10:00:32.442819 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6adbe475-48f9-4ba3-82bd-b36bcd939168" containerName="controller-manager" Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.442833 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="6adbe475-48f9-4ba3-82bd-b36bcd939168" containerName="controller-manager" Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.442996 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="8da2028c-f296-4f44-b010-b3abec9f6b98" containerName="route-controller-manager" Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.443021 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="6adbe475-48f9-4ba3-82bd-b36bcd939168" containerName="controller-manager" Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.443762 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5d74b7c87d-xgxtr" Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.447367 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.447658 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.447898 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.448138 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-86787f5dd8-8s4p2"] Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.448630 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.448958 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-86787f5dd8-8s4p2" Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.452994 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.453124 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.453242 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.453321 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.453627 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.455734 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.456129 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.463267 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.469057 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.471123 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d74b7c87d-xgxtr"] Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.475451 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-86787f5dd8-8s4p2"] Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.580134 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/2c2178fa-96db-4c48-bbb2-b4533bb86944-config\") pod \"route-controller-manager-5d74b7c87d-xgxtr\" (UID: \"2c2178fa-96db-4c48-bbb2-b4533bb86944\") " pod="openshift-route-controller-manager/route-controller-manager-5d74b7c87d-xgxtr" Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.580200 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2c2178fa-96db-4c48-bbb2-b4533bb86944-client-ca\") pod \"route-controller-manager-5d74b7c87d-xgxtr\" (UID: \"2c2178fa-96db-4c48-bbb2-b4533bb86944\") " pod="openshift-route-controller-manager/route-controller-manager-5d74b7c87d-xgxtr" Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.580389 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/801d9c9e-28d3-49dc-9db6-9818197a563a-proxy-ca-bundles\") pod \"controller-manager-86787f5dd8-8s4p2\" (UID: \"801d9c9e-28d3-49dc-9db6-9818197a563a\") " pod="openshift-controller-manager/controller-manager-86787f5dd8-8s4p2" Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.580637 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtp6w\" (UniqueName: \"kubernetes.io/projected/801d9c9e-28d3-49dc-9db6-9818197a563a-kube-api-access-jtp6w\") pod \"controller-manager-86787f5dd8-8s4p2\" (UID: \"801d9c9e-28d3-49dc-9db6-9818197a563a\") " pod="openshift-controller-manager/controller-manager-86787f5dd8-8s4p2" Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.580691 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c2178fa-96db-4c48-bbb2-b4533bb86944-serving-cert\") pod \"route-controller-manager-5d74b7c87d-xgxtr\" (UID: \"2c2178fa-96db-4c48-bbb2-b4533bb86944\") " pod="openshift-route-controller-manager/route-controller-manager-5d74b7c87d-xgxtr" Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.580751 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/801d9c9e-28d3-49dc-9db6-9818197a563a-serving-cert\") pod \"controller-manager-86787f5dd8-8s4p2\" (UID: \"801d9c9e-28d3-49dc-9db6-9818197a563a\") " pod="openshift-controller-manager/controller-manager-86787f5dd8-8s4p2" Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.580805 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/801d9c9e-28d3-49dc-9db6-9818197a563a-config\") pod \"controller-manager-86787f5dd8-8s4p2\" (UID: \"801d9c9e-28d3-49dc-9db6-9818197a563a\") " pod="openshift-controller-manager/controller-manager-86787f5dd8-8s4p2" Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.580926 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hv9rk\" (UniqueName: \"kubernetes.io/projected/2c2178fa-96db-4c48-bbb2-b4533bb86944-kube-api-access-hv9rk\") pod \"route-controller-manager-5d74b7c87d-xgxtr\" (UID: \"2c2178fa-96db-4c48-bbb2-b4533bb86944\") " pod="openshift-route-controller-manager/route-controller-manager-5d74b7c87d-xgxtr" Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.580945 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"client-ca\" (UniqueName: \"kubernetes.io/configmap/801d9c9e-28d3-49dc-9db6-9818197a563a-client-ca\") pod \"controller-manager-86787f5dd8-8s4p2\" (UID: \"801d9c9e-28d3-49dc-9db6-9818197a563a\") " pod="openshift-controller-manager/controller-manager-86787f5dd8-8s4p2" Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.688379 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hv9rk\" (UniqueName: \"kubernetes.io/projected/2c2178fa-96db-4c48-bbb2-b4533bb86944-kube-api-access-hv9rk\") pod \"route-controller-manager-5d74b7c87d-xgxtr\" (UID: \"2c2178fa-96db-4c48-bbb2-b4533bb86944\") " pod="openshift-route-controller-manager/route-controller-manager-5d74b7c87d-xgxtr" Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.688451 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/801d9c9e-28d3-49dc-9db6-9818197a563a-client-ca\") pod \"controller-manager-86787f5dd8-8s4p2\" (UID: \"801d9c9e-28d3-49dc-9db6-9818197a563a\") " pod="openshift-controller-manager/controller-manager-86787f5dd8-8s4p2" Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.688496 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c2178fa-96db-4c48-bbb2-b4533bb86944-config\") pod \"route-controller-manager-5d74b7c87d-xgxtr\" (UID: \"2c2178fa-96db-4c48-bbb2-b4533bb86944\") " pod="openshift-route-controller-manager/route-controller-manager-5d74b7c87d-xgxtr" Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.688518 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2c2178fa-96db-4c48-bbb2-b4533bb86944-client-ca\") pod \"route-controller-manager-5d74b7c87d-xgxtr\" (UID: \"2c2178fa-96db-4c48-bbb2-b4533bb86944\") " pod="openshift-route-controller-manager/route-controller-manager-5d74b7c87d-xgxtr" Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.688543 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/801d9c9e-28d3-49dc-9db6-9818197a563a-proxy-ca-bundles\") pod \"controller-manager-86787f5dd8-8s4p2\" (UID: \"801d9c9e-28d3-49dc-9db6-9818197a563a\") " pod="openshift-controller-manager/controller-manager-86787f5dd8-8s4p2" Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.688644 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtp6w\" (UniqueName: \"kubernetes.io/projected/801d9c9e-28d3-49dc-9db6-9818197a563a-kube-api-access-jtp6w\") pod \"controller-manager-86787f5dd8-8s4p2\" (UID: \"801d9c9e-28d3-49dc-9db6-9818197a563a\") " pod="openshift-controller-manager/controller-manager-86787f5dd8-8s4p2" Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.688681 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c2178fa-96db-4c48-bbb2-b4533bb86944-serving-cert\") pod \"route-controller-manager-5d74b7c87d-xgxtr\" (UID: \"2c2178fa-96db-4c48-bbb2-b4533bb86944\") " pod="openshift-route-controller-manager/route-controller-manager-5d74b7c87d-xgxtr" Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.688718 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/801d9c9e-28d3-49dc-9db6-9818197a563a-serving-cert\") pod 
\"controller-manager-86787f5dd8-8s4p2\" (UID: \"801d9c9e-28d3-49dc-9db6-9818197a563a\") " pod="openshift-controller-manager/controller-manager-86787f5dd8-8s4p2" Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.688886 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/801d9c9e-28d3-49dc-9db6-9818197a563a-config\") pod \"controller-manager-86787f5dd8-8s4p2\" (UID: \"801d9c9e-28d3-49dc-9db6-9818197a563a\") " pod="openshift-controller-manager/controller-manager-86787f5dd8-8s4p2" Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.690650 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/801d9c9e-28d3-49dc-9db6-9818197a563a-config\") pod \"controller-manager-86787f5dd8-8s4p2\" (UID: \"801d9c9e-28d3-49dc-9db6-9818197a563a\") " pod="openshift-controller-manager/controller-manager-86787f5dd8-8s4p2" Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.691068 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2c2178fa-96db-4c48-bbb2-b4533bb86944-client-ca\") pod \"route-controller-manager-5d74b7c87d-xgxtr\" (UID: \"2c2178fa-96db-4c48-bbb2-b4533bb86944\") " pod="openshift-route-controller-manager/route-controller-manager-5d74b7c87d-xgxtr" Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.691020 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c2178fa-96db-4c48-bbb2-b4533bb86944-config\") pod \"route-controller-manager-5d74b7c87d-xgxtr\" (UID: \"2c2178fa-96db-4c48-bbb2-b4533bb86944\") " pod="openshift-route-controller-manager/route-controller-manager-5d74b7c87d-xgxtr" Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.693982 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/801d9c9e-28d3-49dc-9db6-9818197a563a-client-ca\") pod \"controller-manager-86787f5dd8-8s4p2\" (UID: \"801d9c9e-28d3-49dc-9db6-9818197a563a\") " pod="openshift-controller-manager/controller-manager-86787f5dd8-8s4p2" Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.696150 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/801d9c9e-28d3-49dc-9db6-9818197a563a-proxy-ca-bundles\") pod \"controller-manager-86787f5dd8-8s4p2\" (UID: \"801d9c9e-28d3-49dc-9db6-9818197a563a\") " pod="openshift-controller-manager/controller-manager-86787f5dd8-8s4p2" Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.697079 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/801d9c9e-28d3-49dc-9db6-9818197a563a-serving-cert\") pod \"controller-manager-86787f5dd8-8s4p2\" (UID: \"801d9c9e-28d3-49dc-9db6-9818197a563a\") " pod="openshift-controller-manager/controller-manager-86787f5dd8-8s4p2" Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.698834 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c2178fa-96db-4c48-bbb2-b4533bb86944-serving-cert\") pod \"route-controller-manager-5d74b7c87d-xgxtr\" (UID: \"2c2178fa-96db-4c48-bbb2-b4533bb86944\") " pod="openshift-route-controller-manager/route-controller-manager-5d74b7c87d-xgxtr" Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.707729 4962 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-hv9rk\" (UniqueName: \"kubernetes.io/projected/2c2178fa-96db-4c48-bbb2-b4533bb86944-kube-api-access-hv9rk\") pod \"route-controller-manager-5d74b7c87d-xgxtr\" (UID: \"2c2178fa-96db-4c48-bbb2-b4533bb86944\") " pod="openshift-route-controller-manager/route-controller-manager-5d74b7c87d-xgxtr" Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.708102 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtp6w\" (UniqueName: \"kubernetes.io/projected/801d9c9e-28d3-49dc-9db6-9818197a563a-kube-api-access-jtp6w\") pod \"controller-manager-86787f5dd8-8s4p2\" (UID: \"801d9c9e-28d3-49dc-9db6-9818197a563a\") " pod="openshift-controller-manager/controller-manager-86787f5dd8-8s4p2" Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.774655 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5d74b7c87d-xgxtr" Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.784809 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-86787f5dd8-8s4p2" Feb 20 10:00:32 crc kubenswrapper[4962]: I0220 10:00:32.960742 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-mzhb4" Feb 20 10:00:33 crc kubenswrapper[4962]: I0220 10:00:33.096575 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-86787f5dd8-8s4p2"] Feb 20 10:00:33 crc kubenswrapper[4962]: I0220 10:00:33.150452 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6adbe475-48f9-4ba3-82bd-b36bcd939168" path="/var/lib/kubelet/pods/6adbe475-48f9-4ba3-82bd-b36bcd939168/volumes" Feb 20 10:00:33 crc kubenswrapper[4962]: I0220 10:00:33.151640 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8da2028c-f296-4f44-b010-b3abec9f6b98" path="/var/lib/kubelet/pods/8da2028c-f296-4f44-b010-b3abec9f6b98/volumes" Feb 20 10:00:33 crc kubenswrapper[4962]: I0220 10:00:33.250315 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d74b7c87d-xgxtr"] Feb 20 10:00:33 crc kubenswrapper[4962]: W0220 10:00:33.255310 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c2178fa_96db_4c48_bbb2_b4533bb86944.slice/crio-d2adb538934c40a54615b86e80a9725c5a492f27096c6f2895982f89652fdbc8 WatchSource:0}: Error finding container d2adb538934c40a54615b86e80a9725c5a492f27096c6f2895982f89652fdbc8: Status 404 returned error can't find the container with id d2adb538934c40a54615b86e80a9725c5a492f27096c6f2895982f89652fdbc8 Feb 20 10:00:33 crc kubenswrapper[4962]: I0220 10:00:33.260113 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526360-m2h52" Feb 20 10:00:33 crc kubenswrapper[4962]: I0220 10:00:33.398134 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e-secret-volume\") pod \"d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e\" (UID: \"d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e\") " Feb 20 10:00:33 crc kubenswrapper[4962]: I0220 10:00:33.398252 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279kj\" (UniqueName: \"kubernetes.io/projected/d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e-kube-api-access-279kj\") pod \"d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e\" (UID: \"d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e\") " Feb 20 10:00:33 crc kubenswrapper[4962]: I0220 10:00:33.398301 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e-config-volume\") pod \"d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e\" (UID: \"d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e\") " Feb 20 10:00:33 crc kubenswrapper[4962]: I0220 10:00:33.399081 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e-config-volume" (OuterVolumeSpecName: "config-volume") pod "d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e" (UID: "d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:00:33 crc kubenswrapper[4962]: I0220 10:00:33.405355 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e" (UID: "d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:00:33 crc kubenswrapper[4962]: I0220 10:00:33.405793 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e-kube-api-access-279kj" (OuterVolumeSpecName: "kube-api-access-279kj") pod "d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e" (UID: "d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e"). InnerVolumeSpecName "kube-api-access-279kj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:00:33 crc kubenswrapper[4962]: I0220 10:00:33.499744 4962 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 20 10:00:33 crc kubenswrapper[4962]: I0220 10:00:33.499792 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279kj\" (UniqueName: \"kubernetes.io/projected/d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e-kube-api-access-279kj\") on node \"crc\" DevicePath \"\"" Feb 20 10:00:33 crc kubenswrapper[4962]: I0220 10:00:33.499808 4962 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e-config-volume\") on node \"crc\" DevicePath \"\"" Feb 20 10:00:33 crc kubenswrapper[4962]: I0220 10:00:33.960719 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5d74b7c87d-xgxtr" event={"ID":"2c2178fa-96db-4c48-bbb2-b4533bb86944","Type":"ContainerStarted","Data":"4a00abb89a1826b90b36968959f8f06817d338450f47c0747609b7ea230e6ab5"} Feb 20 10:00:33 crc kubenswrapper[4962]: I0220 10:00:33.961286 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5d74b7c87d-xgxtr" event={"ID":"2c2178fa-96db-4c48-bbb2-b4533bb86944","Type":"ContainerStarted","Data":"d2adb538934c40a54615b86e80a9725c5a492f27096c6f2895982f89652fdbc8"} Feb 20 10:00:33 crc kubenswrapper[4962]: I0220 10:00:33.962752 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5d74b7c87d-xgxtr" Feb 20 10:00:33 crc kubenswrapper[4962]: I0220 10:00:33.963915 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-86787f5dd8-8s4p2" event={"ID":"801d9c9e-28d3-49dc-9db6-9818197a563a","Type":"ContainerStarted","Data":"eedc504647388e8e5117efb163c5340fa6883694c3aad3dcb9044b734dac65f4"} Feb 20 10:00:33 crc kubenswrapper[4962]: I0220 10:00:33.963944 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-86787f5dd8-8s4p2" event={"ID":"801d9c9e-28d3-49dc-9db6-9818197a563a","Type":"ContainerStarted","Data":"6a10639fda43ab810b743ff2fde0f7850126d9a967a46f22e12d36e472e668d8"} Feb 20 10:00:33 crc kubenswrapper[4962]: I0220 10:00:33.964472 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-86787f5dd8-8s4p2" Feb 20 10:00:33 crc kubenswrapper[4962]: I0220 10:00:33.968214 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526360-m2h52" event={"ID":"d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e","Type":"ContainerDied","Data":"68941433de598e4b48b623f275a057620d1331d86c5d69b02bb668abde79b1fd"} Feb 20 10:00:33 crc kubenswrapper[4962]: I0220 10:00:33.968253 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68941433de598e4b48b623f275a057620d1331d86c5d69b02bb668abde79b1fd" Feb 20 10:00:33 crc kubenswrapper[4962]: I0220 10:00:33.968283 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526360-m2h52" Feb 20 10:00:33 crc kubenswrapper[4962]: I0220 10:00:33.971957 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-86787f5dd8-8s4p2" Feb 20 10:00:33 crc kubenswrapper[4962]: I0220 10:00:33.972521 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5d74b7c87d-xgxtr" Feb 20 10:00:33 crc kubenswrapper[4962]: I0220 10:00:33.983841 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5d74b7c87d-xgxtr" podStartSLOduration=3.983811706 podStartE2EDuration="3.983811706s" podCreationTimestamp="2026-02-20 10:00:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:00:33.982880166 +0000 UTC m=+325.565352012" watchObservedRunningTime="2026-02-20 10:00:33.983811706 +0000 UTC m=+325.566283562" Feb 20 10:00:34 crc kubenswrapper[4962]: I0220 10:00:34.061443 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-86787f5dd8-8s4p2" podStartSLOduration=4.061411144 podStartE2EDuration="4.061411144s" podCreationTimestamp="2026-02-20 10:00:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:00:34.058176352 +0000 UTC m=+325.640648198" watchObservedRunningTime="2026-02-20 10:00:34.061411144 +0000 UTC m=+325.643883000" Feb 20 10:00:34 crc kubenswrapper[4962]: I0220 10:00:34.755388 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-86787f5dd8-8s4p2"] Feb 20 10:00:35 crc kubenswrapper[4962]: I0220 10:00:35.983902 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-86787f5dd8-8s4p2" podUID="801d9c9e-28d3-49dc-9db6-9818197a563a" containerName="controller-manager" containerID="cri-o://eedc504647388e8e5117efb163c5340fa6883694c3aad3dcb9044b734dac65f4" gracePeriod=30 Feb 20 10:00:36 crc kubenswrapper[4962]: I0220 10:00:36.452248 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-86787f5dd8-8s4p2" Feb 20 10:00:36 crc kubenswrapper[4962]: I0220 10:00:36.497457 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-66f4cf4fb6-jv7mg"] Feb 20 10:00:36 crc kubenswrapper[4962]: E0220 10:00:36.497844 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e" containerName="collect-profiles" Feb 20 10:00:36 crc kubenswrapper[4962]: I0220 10:00:36.497867 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e" containerName="collect-profiles" Feb 20 10:00:36 crc kubenswrapper[4962]: E0220 10:00:36.497906 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="801d9c9e-28d3-49dc-9db6-9818197a563a" containerName="controller-manager" Feb 20 10:00:36 crc kubenswrapper[4962]: I0220 10:00:36.497921 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="801d9c9e-28d3-49dc-9db6-9818197a563a" containerName="controller-manager" Feb 20 10:00:36 crc kubenswrapper[4962]: I0220 10:00:36.498096 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="801d9c9e-28d3-49dc-9db6-9818197a563a" containerName="controller-manager" Feb 20 10:00:36 crc kubenswrapper[4962]: I0220 10:00:36.498118 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e" containerName="collect-profiles" Feb 20 10:00:36 crc kubenswrapper[4962]: I0220 10:00:36.499005 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-66f4cf4fb6-jv7mg" Feb 20 10:00:36 crc kubenswrapper[4962]: I0220 10:00:36.509233 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-66f4cf4fb6-jv7mg"] Feb 20 10:00:36 crc kubenswrapper[4962]: I0220 10:00:36.562968 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/801d9c9e-28d3-49dc-9db6-9818197a563a-config\") pod \"801d9c9e-28d3-49dc-9db6-9818197a563a\" (UID: \"801d9c9e-28d3-49dc-9db6-9818197a563a\") " Feb 20 10:00:36 crc kubenswrapper[4962]: I0220 10:00:36.563050 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/801d9c9e-28d3-49dc-9db6-9818197a563a-serving-cert\") pod \"801d9c9e-28d3-49dc-9db6-9818197a563a\" (UID: \"801d9c9e-28d3-49dc-9db6-9818197a563a\") " Feb 20 10:00:36 crc kubenswrapper[4962]: I0220 10:00:36.563095 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jtp6w\" (UniqueName: \"kubernetes.io/projected/801d9c9e-28d3-49dc-9db6-9818197a563a-kube-api-access-jtp6w\") pod \"801d9c9e-28d3-49dc-9db6-9818197a563a\" (UID: \"801d9c9e-28d3-49dc-9db6-9818197a563a\") " Feb 20 10:00:36 crc kubenswrapper[4962]: I0220 10:00:36.563238 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/801d9c9e-28d3-49dc-9db6-9818197a563a-client-ca\") pod \"801d9c9e-28d3-49dc-9db6-9818197a563a\" (UID: \"801d9c9e-28d3-49dc-9db6-9818197a563a\") " Feb 20 10:00:36 crc kubenswrapper[4962]: I0220 10:00:36.563268 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/801d9c9e-28d3-49dc-9db6-9818197a563a-proxy-ca-bundles\") 
pod \"801d9c9e-28d3-49dc-9db6-9818197a563a\" (UID: \"801d9c9e-28d3-49dc-9db6-9818197a563a\") " Feb 20 10:00:36 crc kubenswrapper[4962]: I0220 10:00:36.564395 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/801d9c9e-28d3-49dc-9db6-9818197a563a-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "801d9c9e-28d3-49dc-9db6-9818197a563a" (UID: "801d9c9e-28d3-49dc-9db6-9818197a563a"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:00:36 crc kubenswrapper[4962]: I0220 10:00:36.564838 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/801d9c9e-28d3-49dc-9db6-9818197a563a-client-ca" (OuterVolumeSpecName: "client-ca") pod "801d9c9e-28d3-49dc-9db6-9818197a563a" (UID: "801d9c9e-28d3-49dc-9db6-9818197a563a"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:00:36 crc kubenswrapper[4962]: I0220 10:00:36.565089 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/801d9c9e-28d3-49dc-9db6-9818197a563a-config" (OuterVolumeSpecName: "config") pod "801d9c9e-28d3-49dc-9db6-9818197a563a" (UID: "801d9c9e-28d3-49dc-9db6-9818197a563a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:00:36 crc kubenswrapper[4962]: I0220 10:00:36.573200 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/801d9c9e-28d3-49dc-9db6-9818197a563a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "801d9c9e-28d3-49dc-9db6-9818197a563a" (UID: "801d9c9e-28d3-49dc-9db6-9818197a563a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:00:36 crc kubenswrapper[4962]: I0220 10:00:36.573800 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/801d9c9e-28d3-49dc-9db6-9818197a563a-kube-api-access-jtp6w" (OuterVolumeSpecName: "kube-api-access-jtp6w") pod "801d9c9e-28d3-49dc-9db6-9818197a563a" (UID: "801d9c9e-28d3-49dc-9db6-9818197a563a"). InnerVolumeSpecName "kube-api-access-jtp6w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:00:36 crc kubenswrapper[4962]: I0220 10:00:36.665297 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sj5x5\" (UniqueName: \"kubernetes.io/projected/3d9d341c-6cc4-41ce-9d8c-2765a8950237-kube-api-access-sj5x5\") pod \"controller-manager-66f4cf4fb6-jv7mg\" (UID: \"3d9d341c-6cc4-41ce-9d8c-2765a8950237\") " pod="openshift-controller-manager/controller-manager-66f4cf4fb6-jv7mg" Feb 20 10:00:36 crc kubenswrapper[4962]: I0220 10:00:36.665504 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3d9d341c-6cc4-41ce-9d8c-2765a8950237-client-ca\") pod \"controller-manager-66f4cf4fb6-jv7mg\" (UID: \"3d9d341c-6cc4-41ce-9d8c-2765a8950237\") " pod="openshift-controller-manager/controller-manager-66f4cf4fb6-jv7mg" Feb 20 10:00:36 crc kubenswrapper[4962]: I0220 10:00:36.665566 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3d9d341c-6cc4-41ce-9d8c-2765a8950237-proxy-ca-bundles\") pod \"controller-manager-66f4cf4fb6-jv7mg\" (UID: \"3d9d341c-6cc4-41ce-9d8c-2765a8950237\") " pod="openshift-controller-manager/controller-manager-66f4cf4fb6-jv7mg" Feb 20 10:00:36 crc kubenswrapper[4962]: I0220 10:00:36.665649 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d9d341c-6cc4-41ce-9d8c-2765a8950237-config\") pod \"controller-manager-66f4cf4fb6-jv7mg\" (UID: \"3d9d341c-6cc4-41ce-9d8c-2765a8950237\") " pod="openshift-controller-manager/controller-manager-66f4cf4fb6-jv7mg" Feb 20 10:00:36 crc kubenswrapper[4962]: I0220 10:00:36.666073 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3d9d341c-6cc4-41ce-9d8c-2765a8950237-serving-cert\") pod \"controller-manager-66f4cf4fb6-jv7mg\" (UID: \"3d9d341c-6cc4-41ce-9d8c-2765a8950237\") " pod="openshift-controller-manager/controller-manager-66f4cf4fb6-jv7mg" Feb 20 10:00:36 crc kubenswrapper[4962]: I0220 10:00:36.666343 4962 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/801d9c9e-28d3-49dc-9db6-9818197a563a-client-ca\") on node \"crc\" DevicePath \"\"" Feb 20 10:00:36 crc kubenswrapper[4962]: I0220 10:00:36.666374 4962 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/801d9c9e-28d3-49dc-9db6-9818197a563a-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 20 10:00:36 crc kubenswrapper[4962]: I0220 10:00:36.666404 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/801d9c9e-28d3-49dc-9db6-9818197a563a-config\") on node \"crc\" DevicePath \"\"" Feb 20 10:00:36 crc kubenswrapper[4962]: I0220 10:00:36.666423 4962 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/801d9c9e-28d3-49dc-9db6-9818197a563a-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 10:00:36 crc kubenswrapper[4962]: I0220 10:00:36.666442 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jtp6w\" (UniqueName: \"kubernetes.io/projected/801d9c9e-28d3-49dc-9db6-9818197a563a-kube-api-access-jtp6w\") on node \"crc\" 
DevicePath \"\"" Feb 20 10:00:36 crc kubenswrapper[4962]: I0220 10:00:36.767382 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3d9d341c-6cc4-41ce-9d8c-2765a8950237-client-ca\") pod \"controller-manager-66f4cf4fb6-jv7mg\" (UID: \"3d9d341c-6cc4-41ce-9d8c-2765a8950237\") " pod="openshift-controller-manager/controller-manager-66f4cf4fb6-jv7mg" Feb 20 10:00:36 crc kubenswrapper[4962]: I0220 10:00:36.767439 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3d9d341c-6cc4-41ce-9d8c-2765a8950237-proxy-ca-bundles\") pod \"controller-manager-66f4cf4fb6-jv7mg\" (UID: \"3d9d341c-6cc4-41ce-9d8c-2765a8950237\") " pod="openshift-controller-manager/controller-manager-66f4cf4fb6-jv7mg" Feb 20 10:00:36 crc kubenswrapper[4962]: I0220 10:00:36.767469 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d9d341c-6cc4-41ce-9d8c-2765a8950237-config\") pod \"controller-manager-66f4cf4fb6-jv7mg\" (UID: \"3d9d341c-6cc4-41ce-9d8c-2765a8950237\") " pod="openshift-controller-manager/controller-manager-66f4cf4fb6-jv7mg" Feb 20 10:00:36 crc kubenswrapper[4962]: I0220 10:00:36.767514 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3d9d341c-6cc4-41ce-9d8c-2765a8950237-serving-cert\") pod \"controller-manager-66f4cf4fb6-jv7mg\" (UID: \"3d9d341c-6cc4-41ce-9d8c-2765a8950237\") " pod="openshift-controller-manager/controller-manager-66f4cf4fb6-jv7mg" Feb 20 10:00:36 crc kubenswrapper[4962]: I0220 10:00:36.767547 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sj5x5\" (UniqueName: \"kubernetes.io/projected/3d9d341c-6cc4-41ce-9d8c-2765a8950237-kube-api-access-sj5x5\") pod \"controller-manager-66f4cf4fb6-jv7mg\" (UID: \"3d9d341c-6cc4-41ce-9d8c-2765a8950237\") " pod="openshift-controller-manager/controller-manager-66f4cf4fb6-jv7mg" Feb 20 10:00:36 crc kubenswrapper[4962]: I0220 10:00:36.769158 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3d9d341c-6cc4-41ce-9d8c-2765a8950237-proxy-ca-bundles\") pod \"controller-manager-66f4cf4fb6-jv7mg\" (UID: \"3d9d341c-6cc4-41ce-9d8c-2765a8950237\") " pod="openshift-controller-manager/controller-manager-66f4cf4fb6-jv7mg" Feb 20 10:00:36 crc kubenswrapper[4962]: I0220 10:00:36.769761 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3d9d341c-6cc4-41ce-9d8c-2765a8950237-client-ca\") pod \"controller-manager-66f4cf4fb6-jv7mg\" (UID: \"3d9d341c-6cc4-41ce-9d8c-2765a8950237\") " pod="openshift-controller-manager/controller-manager-66f4cf4fb6-jv7mg" Feb 20 10:00:36 crc kubenswrapper[4962]: I0220 10:00:36.770128 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d9d341c-6cc4-41ce-9d8c-2765a8950237-config\") pod \"controller-manager-66f4cf4fb6-jv7mg\" (UID: \"3d9d341c-6cc4-41ce-9d8c-2765a8950237\") " pod="openshift-controller-manager/controller-manager-66f4cf4fb6-jv7mg" Feb 20 10:00:36 crc kubenswrapper[4962]: I0220 10:00:36.773262 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/3d9d341c-6cc4-41ce-9d8c-2765a8950237-serving-cert\") pod \"controller-manager-66f4cf4fb6-jv7mg\" (UID: \"3d9d341c-6cc4-41ce-9d8c-2765a8950237\") " pod="openshift-controller-manager/controller-manager-66f4cf4fb6-jv7mg" Feb 20 10:00:36 crc kubenswrapper[4962]: I0220 10:00:36.788781 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sj5x5\" (UniqueName: \"kubernetes.io/projected/3d9d341c-6cc4-41ce-9d8c-2765a8950237-kube-api-access-sj5x5\") pod \"controller-manager-66f4cf4fb6-jv7mg\" (UID: \"3d9d341c-6cc4-41ce-9d8c-2765a8950237\") " pod="openshift-controller-manager/controller-manager-66f4cf4fb6-jv7mg" Feb 20 10:00:36 crc kubenswrapper[4962]: I0220 10:00:36.824915 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-66f4cf4fb6-jv7mg" Feb 20 10:00:36 crc kubenswrapper[4962]: I0220 10:00:36.992777 4962 generic.go:334] "Generic (PLEG): container finished" podID="801d9c9e-28d3-49dc-9db6-9818197a563a" containerID="eedc504647388e8e5117efb163c5340fa6883694c3aad3dcb9044b734dac65f4" exitCode=0 Feb 20 10:00:36 crc kubenswrapper[4962]: I0220 10:00:36.992833 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-86787f5dd8-8s4p2" event={"ID":"801d9c9e-28d3-49dc-9db6-9818197a563a","Type":"ContainerDied","Data":"eedc504647388e8e5117efb163c5340fa6883694c3aad3dcb9044b734dac65f4"} Feb 20 10:00:36 crc kubenswrapper[4962]: I0220 10:00:36.992866 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-86787f5dd8-8s4p2" event={"ID":"801d9c9e-28d3-49dc-9db6-9818197a563a","Type":"ContainerDied","Data":"6a10639fda43ab810b743ff2fde0f7850126d9a967a46f22e12d36e472e668d8"} Feb 20 10:00:36 crc kubenswrapper[4962]: I0220 10:00:36.992885 4962 scope.go:117] "RemoveContainer" containerID="eedc504647388e8e5117efb163c5340fa6883694c3aad3dcb9044b734dac65f4" Feb 20 10:00:36 crc kubenswrapper[4962]: I0220 10:00:36.993013 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-86787f5dd8-8s4p2" Feb 20 10:00:37 crc kubenswrapper[4962]: I0220 10:00:37.042557 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-86787f5dd8-8s4p2"] Feb 20 10:00:37 crc kubenswrapper[4962]: I0220 10:00:37.046563 4962 scope.go:117] "RemoveContainer" containerID="eedc504647388e8e5117efb163c5340fa6883694c3aad3dcb9044b734dac65f4" Feb 20 10:00:37 crc kubenswrapper[4962]: I0220 10:00:37.047815 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-86787f5dd8-8s4p2"] Feb 20 10:00:37 crc kubenswrapper[4962]: E0220 10:00:37.048516 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eedc504647388e8e5117efb163c5340fa6883694c3aad3dcb9044b734dac65f4\": container with ID starting with eedc504647388e8e5117efb163c5340fa6883694c3aad3dcb9044b734dac65f4 not found: ID does not exist" containerID="eedc504647388e8e5117efb163c5340fa6883694c3aad3dcb9044b734dac65f4" Feb 20 10:00:37 crc kubenswrapper[4962]: I0220 10:00:37.048555 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eedc504647388e8e5117efb163c5340fa6883694c3aad3dcb9044b734dac65f4"} err="failed to get container status \"eedc504647388e8e5117efb163c5340fa6883694c3aad3dcb9044b734dac65f4\": rpc error: code = NotFound desc = could not find container \"eedc504647388e8e5117efb163c5340fa6883694c3aad3dcb9044b734dac65f4\": container with ID starting with eedc504647388e8e5117efb163c5340fa6883694c3aad3dcb9044b734dac65f4 not found: ID does not exist" Feb 20 10:00:37 crc kubenswrapper[4962]: I0220 10:00:37.148051 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="801d9c9e-28d3-49dc-9db6-9818197a563a" path="/var/lib/kubelet/pods/801d9c9e-28d3-49dc-9db6-9818197a563a/volumes" Feb 20 10:00:37 crc kubenswrapper[4962]: I0220 10:00:37.342714 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-66f4cf4fb6-jv7mg"] Feb 20 10:00:37 crc kubenswrapper[4962]: W0220 10:00:37.351385 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d9d341c_6cc4_41ce_9d8c_2765a8950237.slice/crio-966cbcd3db2e39b12a65ccae5ec7b69d63f96007c60e5e8653fe2d9ec63cddec WatchSource:0}: Error finding container 966cbcd3db2e39b12a65ccae5ec7b69d63f96007c60e5e8653fe2d9ec63cddec: Status 404 returned error can't find the container with id 966cbcd3db2e39b12a65ccae5ec7b69d63f96007c60e5e8653fe2d9ec63cddec Feb 20 10:00:38 crc kubenswrapper[4962]: I0220 10:00:38.001010 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-66f4cf4fb6-jv7mg" event={"ID":"3d9d341c-6cc4-41ce-9d8c-2765a8950237","Type":"ContainerStarted","Data":"b9a969d7ecc361cbf8df0410aa0c35284370b33497d064c68a5a69fd6cf61662"} Feb 20 10:00:38 crc kubenswrapper[4962]: I0220 10:00:38.001095 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-66f4cf4fb6-jv7mg" event={"ID":"3d9d341c-6cc4-41ce-9d8c-2765a8950237","Type":"ContainerStarted","Data":"966cbcd3db2e39b12a65ccae5ec7b69d63f96007c60e5e8653fe2d9ec63cddec"} Feb 20 10:00:38 crc kubenswrapper[4962]: I0220 10:00:38.003658 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-controller-manager/controller-manager-66f4cf4fb6-jv7mg" Feb 20 10:00:38 crc kubenswrapper[4962]: I0220 10:00:38.013508 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-66f4cf4fb6-jv7mg" Feb 20 10:00:38 crc kubenswrapper[4962]: I0220 10:00:38.029408 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-66f4cf4fb6-jv7mg" podStartSLOduration=4.029387548 podStartE2EDuration="4.029387548s" podCreationTimestamp="2026-02-20 10:00:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:00:38.025958149 +0000 UTC m=+329.608429995" watchObservedRunningTime="2026-02-20 10:00:38.029387548 +0000 UTC m=+329.611859394" Feb 20 10:00:38 crc kubenswrapper[4962]: I0220 10:00:38.230390 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-lllvp"] Feb 20 10:00:38 crc kubenswrapper[4962]: I0220 10:00:38.231258 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-lllvp" Feb 20 10:00:38 crc kubenswrapper[4962]: I0220 10:00:38.252021 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-lllvp"] Feb 20 10:00:38 crc kubenswrapper[4962]: I0220 10:00:38.391680 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e32b10f1-4fdb-4320-a7d9-6f70bbdc0929-registry-tls\") pod \"image-registry-66df7c8f76-lllvp\" (UID: \"e32b10f1-4fdb-4320-a7d9-6f70bbdc0929\") " pod="openshift-image-registry/image-registry-66df7c8f76-lllvp" Feb 20 10:00:38 crc kubenswrapper[4962]: I0220 10:00:38.391750 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e32b10f1-4fdb-4320-a7d9-6f70bbdc0929-registry-certificates\") pod \"image-registry-66df7c8f76-lllvp\" (UID: \"e32b10f1-4fdb-4320-a7d9-6f70bbdc0929\") " pod="openshift-image-registry/image-registry-66df7c8f76-lllvp" Feb 20 10:00:38 crc kubenswrapper[4962]: I0220 10:00:38.391777 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e32b10f1-4fdb-4320-a7d9-6f70bbdc0929-ca-trust-extracted\") pod \"image-registry-66df7c8f76-lllvp\" (UID: \"e32b10f1-4fdb-4320-a7d9-6f70bbdc0929\") " pod="openshift-image-registry/image-registry-66df7c8f76-lllvp" Feb 20 10:00:38 crc kubenswrapper[4962]: I0220 10:00:38.391809 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-lllvp\" (UID: \"e32b10f1-4fdb-4320-a7d9-6f70bbdc0929\") " pod="openshift-image-registry/image-registry-66df7c8f76-lllvp" Feb 20 10:00:38 crc kubenswrapper[4962]: I0220 10:00:38.391830 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e32b10f1-4fdb-4320-a7d9-6f70bbdc0929-bound-sa-token\") pod \"image-registry-66df7c8f76-lllvp\" (UID: 
\"e32b10f1-4fdb-4320-a7d9-6f70bbdc0929\") " pod="openshift-image-registry/image-registry-66df7c8f76-lllvp" Feb 20 10:00:38 crc kubenswrapper[4962]: I0220 10:00:38.391903 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gq4b4\" (UniqueName: \"kubernetes.io/projected/e32b10f1-4fdb-4320-a7d9-6f70bbdc0929-kube-api-access-gq4b4\") pod \"image-registry-66df7c8f76-lllvp\" (UID: \"e32b10f1-4fdb-4320-a7d9-6f70bbdc0929\") " pod="openshift-image-registry/image-registry-66df7c8f76-lllvp" Feb 20 10:00:38 crc kubenswrapper[4962]: I0220 10:00:38.391923 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e32b10f1-4fdb-4320-a7d9-6f70bbdc0929-installation-pull-secrets\") pod \"image-registry-66df7c8f76-lllvp\" (UID: \"e32b10f1-4fdb-4320-a7d9-6f70bbdc0929\") " pod="openshift-image-registry/image-registry-66df7c8f76-lllvp" Feb 20 10:00:38 crc kubenswrapper[4962]: I0220 10:00:38.391959 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e32b10f1-4fdb-4320-a7d9-6f70bbdc0929-trusted-ca\") pod \"image-registry-66df7c8f76-lllvp\" (UID: \"e32b10f1-4fdb-4320-a7d9-6f70bbdc0929\") " pod="openshift-image-registry/image-registry-66df7c8f76-lllvp" Feb 20 10:00:38 crc kubenswrapper[4962]: I0220 10:00:38.417769 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-lllvp\" (UID: \"e32b10f1-4fdb-4320-a7d9-6f70bbdc0929\") " pod="openshift-image-registry/image-registry-66df7c8f76-lllvp" Feb 20 10:00:38 crc kubenswrapper[4962]: I0220 10:00:38.493689 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e32b10f1-4fdb-4320-a7d9-6f70bbdc0929-ca-trust-extracted\") pod \"image-registry-66df7c8f76-lllvp\" (UID: \"e32b10f1-4fdb-4320-a7d9-6f70bbdc0929\") " pod="openshift-image-registry/image-registry-66df7c8f76-lllvp" Feb 20 10:00:38 crc kubenswrapper[4962]: I0220 10:00:38.494231 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e32b10f1-4fdb-4320-a7d9-6f70bbdc0929-bound-sa-token\") pod \"image-registry-66df7c8f76-lllvp\" (UID: \"e32b10f1-4fdb-4320-a7d9-6f70bbdc0929\") " pod="openshift-image-registry/image-registry-66df7c8f76-lllvp" Feb 20 10:00:38 crc kubenswrapper[4962]: I0220 10:00:38.494288 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gq4b4\" (UniqueName: \"kubernetes.io/projected/e32b10f1-4fdb-4320-a7d9-6f70bbdc0929-kube-api-access-gq4b4\") pod \"image-registry-66df7c8f76-lllvp\" (UID: \"e32b10f1-4fdb-4320-a7d9-6f70bbdc0929\") " pod="openshift-image-registry/image-registry-66df7c8f76-lllvp" Feb 20 10:00:38 crc kubenswrapper[4962]: I0220 10:00:38.494315 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e32b10f1-4fdb-4320-a7d9-6f70bbdc0929-installation-pull-secrets\") pod \"image-registry-66df7c8f76-lllvp\" (UID: \"e32b10f1-4fdb-4320-a7d9-6f70bbdc0929\") " pod="openshift-image-registry/image-registry-66df7c8f76-lllvp" Feb 20 
10:00:38 crc kubenswrapper[4962]: I0220 10:00:38.494375 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e32b10f1-4fdb-4320-a7d9-6f70bbdc0929-trusted-ca\") pod \"image-registry-66df7c8f76-lllvp\" (UID: \"e32b10f1-4fdb-4320-a7d9-6f70bbdc0929\") " pod="openshift-image-registry/image-registry-66df7c8f76-lllvp" Feb 20 10:00:38 crc kubenswrapper[4962]: I0220 10:00:38.494419 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e32b10f1-4fdb-4320-a7d9-6f70bbdc0929-registry-tls\") pod \"image-registry-66df7c8f76-lllvp\" (UID: \"e32b10f1-4fdb-4320-a7d9-6f70bbdc0929\") " pod="openshift-image-registry/image-registry-66df7c8f76-lllvp" Feb 20 10:00:38 crc kubenswrapper[4962]: I0220 10:00:38.494460 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e32b10f1-4fdb-4320-a7d9-6f70bbdc0929-registry-certificates\") pod \"image-registry-66df7c8f76-lllvp\" (UID: \"e32b10f1-4fdb-4320-a7d9-6f70bbdc0929\") " pod="openshift-image-registry/image-registry-66df7c8f76-lllvp" Feb 20 10:00:38 crc kubenswrapper[4962]: I0220 10:00:38.494684 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e32b10f1-4fdb-4320-a7d9-6f70bbdc0929-ca-trust-extracted\") pod \"image-registry-66df7c8f76-lllvp\" (UID: \"e32b10f1-4fdb-4320-a7d9-6f70bbdc0929\") " pod="openshift-image-registry/image-registry-66df7c8f76-lllvp" Feb 20 10:00:38 crc kubenswrapper[4962]: I0220 10:00:38.495938 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e32b10f1-4fdb-4320-a7d9-6f70bbdc0929-trusted-ca\") pod \"image-registry-66df7c8f76-lllvp\" (UID: \"e32b10f1-4fdb-4320-a7d9-6f70bbdc0929\") " pod="openshift-image-registry/image-registry-66df7c8f76-lllvp" Feb 20 10:00:38 crc kubenswrapper[4962]: I0220 10:00:38.496200 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e32b10f1-4fdb-4320-a7d9-6f70bbdc0929-registry-certificates\") pod \"image-registry-66df7c8f76-lllvp\" (UID: \"e32b10f1-4fdb-4320-a7d9-6f70bbdc0929\") " pod="openshift-image-registry/image-registry-66df7c8f76-lllvp" Feb 20 10:00:38 crc kubenswrapper[4962]: I0220 10:00:38.502300 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e32b10f1-4fdb-4320-a7d9-6f70bbdc0929-registry-tls\") pod \"image-registry-66df7c8f76-lllvp\" (UID: \"e32b10f1-4fdb-4320-a7d9-6f70bbdc0929\") " pod="openshift-image-registry/image-registry-66df7c8f76-lllvp" Feb 20 10:00:38 crc kubenswrapper[4962]: I0220 10:00:38.502997 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e32b10f1-4fdb-4320-a7d9-6f70bbdc0929-installation-pull-secrets\") pod \"image-registry-66df7c8f76-lllvp\" (UID: \"e32b10f1-4fdb-4320-a7d9-6f70bbdc0929\") " pod="openshift-image-registry/image-registry-66df7c8f76-lllvp" Feb 20 10:00:38 crc kubenswrapper[4962]: I0220 10:00:38.515096 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e32b10f1-4fdb-4320-a7d9-6f70bbdc0929-bound-sa-token\") pod \"image-registry-66df7c8f76-lllvp\" 
(UID: \"e32b10f1-4fdb-4320-a7d9-6f70bbdc0929\") " pod="openshift-image-registry/image-registry-66df7c8f76-lllvp" Feb 20 10:00:38 crc kubenswrapper[4962]: I0220 10:00:38.522495 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gq4b4\" (UniqueName: \"kubernetes.io/projected/e32b10f1-4fdb-4320-a7d9-6f70bbdc0929-kube-api-access-gq4b4\") pod \"image-registry-66df7c8f76-lllvp\" (UID: \"e32b10f1-4fdb-4320-a7d9-6f70bbdc0929\") " pod="openshift-image-registry/image-registry-66df7c8f76-lllvp" Feb 20 10:00:38 crc kubenswrapper[4962]: I0220 10:00:38.594693 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-lllvp" Feb 20 10:00:39 crc kubenswrapper[4962]: I0220 10:00:39.053338 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-lllvp"] Feb 20 10:00:39 crc kubenswrapper[4962]: W0220 10:00:39.062798 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode32b10f1_4fdb_4320_a7d9_6f70bbdc0929.slice/crio-63d799e1986d6c1df11ddfabeed19035418b8a3437eb932b329718727863cfb5 WatchSource:0}: Error finding container 63d799e1986d6c1df11ddfabeed19035418b8a3437eb932b329718727863cfb5: Status 404 returned error can't find the container with id 63d799e1986d6c1df11ddfabeed19035418b8a3437eb932b329718727863cfb5 Feb 20 10:00:40 crc kubenswrapper[4962]: I0220 10:00:40.020673 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-lllvp" event={"ID":"e32b10f1-4fdb-4320-a7d9-6f70bbdc0929","Type":"ContainerStarted","Data":"59cb24cb15e07642645b817d540415fe3edf712b8073107df5e2dc5eac04ab0e"} Feb 20 10:00:40 crc kubenswrapper[4962]: I0220 10:00:40.021163 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-lllvp" Feb 20 10:00:40 crc kubenswrapper[4962]: I0220 10:00:40.021181 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-lllvp" event={"ID":"e32b10f1-4fdb-4320-a7d9-6f70bbdc0929","Type":"ContainerStarted","Data":"63d799e1986d6c1df11ddfabeed19035418b8a3437eb932b329718727863cfb5"} Feb 20 10:00:40 crc kubenswrapper[4962]: I0220 10:00:40.070250 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-lllvp" podStartSLOduration=2.070213669 podStartE2EDuration="2.070213669s" podCreationTimestamp="2026-02-20 10:00:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:00:40.063295969 +0000 UTC m=+331.645767835" watchObservedRunningTime="2026-02-20 10:00:40.070213669 +0000 UTC m=+331.652685555" Feb 20 10:00:41 crc kubenswrapper[4962]: I0220 10:00:41.508553 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 10:00:41 crc kubenswrapper[4962]: I0220 10:00:41.509243 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 10:00:50 crc kubenswrapper[4962]: I0220 10:00:50.780830 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-sl4km"] Feb 20 10:00:50 crc kubenswrapper[4962]: I0220 10:00:50.782184 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sl4km" Feb 20 10:00:50 crc kubenswrapper[4962]: I0220 10:00:50.784865 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 20 10:00:50 crc kubenswrapper[4962]: I0220 10:00:50.797695 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sl4km"] Feb 20 10:00:50 crc kubenswrapper[4962]: I0220 10:00:50.903476 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e92c119-6503-4fc1-b607-0d41d821e8fe-utilities\") pod \"redhat-marketplace-sl4km\" (UID: \"0e92c119-6503-4fc1-b607-0d41d821e8fe\") " pod="openshift-marketplace/redhat-marketplace-sl4km" Feb 20 10:00:50 crc kubenswrapper[4962]: I0220 10:00:50.903759 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hrfs\" (UniqueName: \"kubernetes.io/projected/0e92c119-6503-4fc1-b607-0d41d821e8fe-kube-api-access-6hrfs\") pod \"redhat-marketplace-sl4km\" (UID: \"0e92c119-6503-4fc1-b607-0d41d821e8fe\") " pod="openshift-marketplace/redhat-marketplace-sl4km" Feb 20 10:00:50 crc kubenswrapper[4962]: I0220 10:00:50.904068 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e92c119-6503-4fc1-b607-0d41d821e8fe-catalog-content\") pod \"redhat-marketplace-sl4km\" (UID: \"0e92c119-6503-4fc1-b607-0d41d821e8fe\") " pod="openshift-marketplace/redhat-marketplace-sl4km" Feb 20 10:00:50 crc kubenswrapper[4962]: I0220 10:00:50.981624 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-j9hxw"] Feb 20 10:00:50 crc kubenswrapper[4962]: I0220 10:00:50.982952 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-j9hxw" Feb 20 10:00:50 crc kubenswrapper[4962]: I0220 10:00:50.986812 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 20 10:00:50 crc kubenswrapper[4962]: I0220 10:00:50.994548 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-j9hxw"] Feb 20 10:00:51 crc kubenswrapper[4962]: I0220 10:00:51.005045 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e92c119-6503-4fc1-b607-0d41d821e8fe-catalog-content\") pod \"redhat-marketplace-sl4km\" (UID: \"0e92c119-6503-4fc1-b607-0d41d821e8fe\") " pod="openshift-marketplace/redhat-marketplace-sl4km" Feb 20 10:00:51 crc kubenswrapper[4962]: I0220 10:00:51.005105 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e92c119-6503-4fc1-b607-0d41d821e8fe-utilities\") pod \"redhat-marketplace-sl4km\" (UID: \"0e92c119-6503-4fc1-b607-0d41d821e8fe\") " pod="openshift-marketplace/redhat-marketplace-sl4km" Feb 20 10:00:51 crc kubenswrapper[4962]: I0220 10:00:51.005147 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hrfs\" (UniqueName: \"kubernetes.io/projected/0e92c119-6503-4fc1-b607-0d41d821e8fe-kube-api-access-6hrfs\") pod \"redhat-marketplace-sl4km\" (UID: \"0e92c119-6503-4fc1-b607-0d41d821e8fe\") " pod="openshift-marketplace/redhat-marketplace-sl4km" Feb 20 10:00:51 crc kubenswrapper[4962]: I0220 10:00:51.005965 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e92c119-6503-4fc1-b607-0d41d821e8fe-catalog-content\") pod \"redhat-marketplace-sl4km\" (UID: \"0e92c119-6503-4fc1-b607-0d41d821e8fe\") " pod="openshift-marketplace/redhat-marketplace-sl4km" Feb 20 10:00:51 crc kubenswrapper[4962]: I0220 10:00:51.006080 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e92c119-6503-4fc1-b607-0d41d821e8fe-utilities\") pod \"redhat-marketplace-sl4km\" (UID: \"0e92c119-6503-4fc1-b607-0d41d821e8fe\") " pod="openshift-marketplace/redhat-marketplace-sl4km" Feb 20 10:00:51 crc kubenswrapper[4962]: I0220 10:00:51.033937 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hrfs\" (UniqueName: \"kubernetes.io/projected/0e92c119-6503-4fc1-b607-0d41d821e8fe-kube-api-access-6hrfs\") pod \"redhat-marketplace-sl4km\" (UID: \"0e92c119-6503-4fc1-b607-0d41d821e8fe\") " pod="openshift-marketplace/redhat-marketplace-sl4km" Feb 20 10:00:51 crc kubenswrapper[4962]: I0220 10:00:51.106737 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82f8db6b-4715-42f3-a705-821af9e03156-utilities\") pod \"redhat-operators-j9hxw\" (UID: \"82f8db6b-4715-42f3-a705-821af9e03156\") " pod="openshift-marketplace/redhat-operators-j9hxw" Feb 20 10:00:51 crc kubenswrapper[4962]: I0220 10:00:51.106787 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82f8db6b-4715-42f3-a705-821af9e03156-catalog-content\") pod \"redhat-operators-j9hxw\" (UID: \"82f8db6b-4715-42f3-a705-821af9e03156\") " 
pod="openshift-marketplace/redhat-operators-j9hxw" Feb 20 10:00:51 crc kubenswrapper[4962]: I0220 10:00:51.106813 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lw5bk\" (UniqueName: \"kubernetes.io/projected/82f8db6b-4715-42f3-a705-821af9e03156-kube-api-access-lw5bk\") pod \"redhat-operators-j9hxw\" (UID: \"82f8db6b-4715-42f3-a705-821af9e03156\") " pod="openshift-marketplace/redhat-operators-j9hxw" Feb 20 10:00:51 crc kubenswrapper[4962]: I0220 10:00:51.143876 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sl4km" Feb 20 10:00:51 crc kubenswrapper[4962]: I0220 10:00:51.209099 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82f8db6b-4715-42f3-a705-821af9e03156-utilities\") pod \"redhat-operators-j9hxw\" (UID: \"82f8db6b-4715-42f3-a705-821af9e03156\") " pod="openshift-marketplace/redhat-operators-j9hxw" Feb 20 10:00:51 crc kubenswrapper[4962]: I0220 10:00:51.209188 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82f8db6b-4715-42f3-a705-821af9e03156-catalog-content\") pod \"redhat-operators-j9hxw\" (UID: \"82f8db6b-4715-42f3-a705-821af9e03156\") " pod="openshift-marketplace/redhat-operators-j9hxw" Feb 20 10:00:51 crc kubenswrapper[4962]: I0220 10:00:51.209249 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lw5bk\" (UniqueName: \"kubernetes.io/projected/82f8db6b-4715-42f3-a705-821af9e03156-kube-api-access-lw5bk\") pod \"redhat-operators-j9hxw\" (UID: \"82f8db6b-4715-42f3-a705-821af9e03156\") " pod="openshift-marketplace/redhat-operators-j9hxw" Feb 20 10:00:51 crc kubenswrapper[4962]: I0220 10:00:51.211089 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82f8db6b-4715-42f3-a705-821af9e03156-utilities\") pod \"redhat-operators-j9hxw\" (UID: \"82f8db6b-4715-42f3-a705-821af9e03156\") " pod="openshift-marketplace/redhat-operators-j9hxw" Feb 20 10:00:51 crc kubenswrapper[4962]: I0220 10:00:51.211109 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82f8db6b-4715-42f3-a705-821af9e03156-catalog-content\") pod \"redhat-operators-j9hxw\" (UID: \"82f8db6b-4715-42f3-a705-821af9e03156\") " pod="openshift-marketplace/redhat-operators-j9hxw" Feb 20 10:00:51 crc kubenswrapper[4962]: I0220 10:00:51.240126 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lw5bk\" (UniqueName: \"kubernetes.io/projected/82f8db6b-4715-42f3-a705-821af9e03156-kube-api-access-lw5bk\") pod \"redhat-operators-j9hxw\" (UID: \"82f8db6b-4715-42f3-a705-821af9e03156\") " pod="openshift-marketplace/redhat-operators-j9hxw" Feb 20 10:00:51 crc kubenswrapper[4962]: I0220 10:00:51.297259 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-j9hxw" Feb 20 10:00:51 crc kubenswrapper[4962]: I0220 10:00:51.597527 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sl4km"] Feb 20 10:00:51 crc kubenswrapper[4962]: W0220 10:00:51.602912 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e92c119_6503_4fc1_b607_0d41d821e8fe.slice/crio-1857f27633858ee0c5e1a842d7a2fe0ec51450fe6fc9bd8681879479cae311e4 WatchSource:0}: Error finding container 1857f27633858ee0c5e1a842d7a2fe0ec51450fe6fc9bd8681879479cae311e4: Status 404 returned error can't find the container with id 1857f27633858ee0c5e1a842d7a2fe0ec51450fe6fc9bd8681879479cae311e4 Feb 20 10:00:51 crc kubenswrapper[4962]: I0220 10:00:51.701111 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-j9hxw"] Feb 20 10:00:51 crc kubenswrapper[4962]: W0220 10:00:51.707941 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82f8db6b_4715_42f3_a705_821af9e03156.slice/crio-4426c63fbc54fbb5e8817d622150574fccbd11ccca180ef6d535d5f7774da999 WatchSource:0}: Error finding container 4426c63fbc54fbb5e8817d622150574fccbd11ccca180ef6d535d5f7774da999: Status 404 returned error can't find the container with id 4426c63fbc54fbb5e8817d622150574fccbd11ccca180ef6d535d5f7774da999 Feb 20 10:00:52 crc kubenswrapper[4962]: I0220 10:00:52.107729 4962 generic.go:334] "Generic (PLEG): container finished" podID="0e92c119-6503-4fc1-b607-0d41d821e8fe" containerID="75b1b6573c381433cde8a7b8d56c45183daa17d6959a43902cbb5b72476056fc" exitCode=0 Feb 20 10:00:52 crc kubenswrapper[4962]: I0220 10:00:52.107796 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sl4km" event={"ID":"0e92c119-6503-4fc1-b607-0d41d821e8fe","Type":"ContainerDied","Data":"75b1b6573c381433cde8a7b8d56c45183daa17d6959a43902cbb5b72476056fc"} Feb 20 10:00:52 crc kubenswrapper[4962]: I0220 10:00:52.108138 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sl4km" event={"ID":"0e92c119-6503-4fc1-b607-0d41d821e8fe","Type":"ContainerStarted","Data":"1857f27633858ee0c5e1a842d7a2fe0ec51450fe6fc9bd8681879479cae311e4"} Feb 20 10:00:52 crc kubenswrapper[4962]: I0220 10:00:52.112783 4962 generic.go:334] "Generic (PLEG): container finished" podID="82f8db6b-4715-42f3-a705-821af9e03156" containerID="62f673ffb2ae5bb54aa9b1b375a914a7b13750b5fd842a4abf93abeb1bbb0f43" exitCode=0 Feb 20 10:00:52 crc kubenswrapper[4962]: I0220 10:00:52.112831 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j9hxw" event={"ID":"82f8db6b-4715-42f3-a705-821af9e03156","Type":"ContainerDied","Data":"62f673ffb2ae5bb54aa9b1b375a914a7b13750b5fd842a4abf93abeb1bbb0f43"} Feb 20 10:00:52 crc kubenswrapper[4962]: I0220 10:00:52.112867 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j9hxw" event={"ID":"82f8db6b-4715-42f3-a705-821af9e03156","Type":"ContainerStarted","Data":"4426c63fbc54fbb5e8817d622150574fccbd11ccca180ef6d535d5f7774da999"} Feb 20 10:00:53 crc kubenswrapper[4962]: I0220 10:00:53.121166 4962 generic.go:334] "Generic (PLEG): container finished" podID="0e92c119-6503-4fc1-b607-0d41d821e8fe" containerID="768ffc49e7fd2e11d9b986aa928149b738fe25633e0327c75baf58497f044efd" exitCode=0 Feb 20 
10:00:53 crc kubenswrapper[4962]: I0220 10:00:53.121747 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sl4km" event={"ID":"0e92c119-6503-4fc1-b607-0d41d821e8fe","Type":"ContainerDied","Data":"768ffc49e7fd2e11d9b986aa928149b738fe25633e0327c75baf58497f044efd"} Feb 20 10:00:53 crc kubenswrapper[4962]: I0220 10:00:53.176040 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j9hxw" event={"ID":"82f8db6b-4715-42f3-a705-821af9e03156","Type":"ContainerStarted","Data":"5b3dc93152a6aa3e4d4bdd5da4281efd77087dffbfeb56667dea0c10ca907dd6"} Feb 20 10:00:53 crc kubenswrapper[4962]: I0220 10:00:53.194425 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-v7zlm"] Feb 20 10:00:53 crc kubenswrapper[4962]: I0220 10:00:53.195618 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-v7zlm" Feb 20 10:00:53 crc kubenswrapper[4962]: I0220 10:00:53.210089 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 20 10:00:53 crc kubenswrapper[4962]: I0220 10:00:53.229845 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-v7zlm"] Feb 20 10:00:53 crc kubenswrapper[4962]: I0220 10:00:53.345871 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66gmr\" (UniqueName: \"kubernetes.io/projected/a63e8904-d4b9-405f-94a1-f44cb565b3e7-kube-api-access-66gmr\") pod \"certified-operators-v7zlm\" (UID: \"a63e8904-d4b9-405f-94a1-f44cb565b3e7\") " pod="openshift-marketplace/certified-operators-v7zlm" Feb 20 10:00:53 crc kubenswrapper[4962]: I0220 10:00:53.345987 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a63e8904-d4b9-405f-94a1-f44cb565b3e7-utilities\") pod \"certified-operators-v7zlm\" (UID: \"a63e8904-d4b9-405f-94a1-f44cb565b3e7\") " pod="openshift-marketplace/certified-operators-v7zlm" Feb 20 10:00:53 crc kubenswrapper[4962]: I0220 10:00:53.346043 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a63e8904-d4b9-405f-94a1-f44cb565b3e7-catalog-content\") pod \"certified-operators-v7zlm\" (UID: \"a63e8904-d4b9-405f-94a1-f44cb565b3e7\") " pod="openshift-marketplace/certified-operators-v7zlm" Feb 20 10:00:53 crc kubenswrapper[4962]: I0220 10:00:53.387739 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lsx57"] Feb 20 10:00:53 crc kubenswrapper[4962]: I0220 10:00:53.389173 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lsx57" Feb 20 10:00:53 crc kubenswrapper[4962]: I0220 10:00:53.391921 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 20 10:00:53 crc kubenswrapper[4962]: I0220 10:00:53.393021 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lsx57"] Feb 20 10:00:53 crc kubenswrapper[4962]: I0220 10:00:53.447557 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a63e8904-d4b9-405f-94a1-f44cb565b3e7-utilities\") pod \"certified-operators-v7zlm\" (UID: \"a63e8904-d4b9-405f-94a1-f44cb565b3e7\") " pod="openshift-marketplace/certified-operators-v7zlm" Feb 20 10:00:53 crc kubenswrapper[4962]: I0220 10:00:53.447638 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a63e8904-d4b9-405f-94a1-f44cb565b3e7-catalog-content\") pod \"certified-operators-v7zlm\" (UID: \"a63e8904-d4b9-405f-94a1-f44cb565b3e7\") " pod="openshift-marketplace/certified-operators-v7zlm" Feb 20 10:00:53 crc kubenswrapper[4962]: I0220 10:00:53.447715 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66gmr\" (UniqueName: \"kubernetes.io/projected/a63e8904-d4b9-405f-94a1-f44cb565b3e7-kube-api-access-66gmr\") pod \"certified-operators-v7zlm\" (UID: \"a63e8904-d4b9-405f-94a1-f44cb565b3e7\") " pod="openshift-marketplace/certified-operators-v7zlm" Feb 20 10:00:53 crc kubenswrapper[4962]: I0220 10:00:53.448017 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a63e8904-d4b9-405f-94a1-f44cb565b3e7-utilities\") pod \"certified-operators-v7zlm\" (UID: \"a63e8904-d4b9-405f-94a1-f44cb565b3e7\") " pod="openshift-marketplace/certified-operators-v7zlm" Feb 20 10:00:53 crc kubenswrapper[4962]: I0220 10:00:53.449067 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a63e8904-d4b9-405f-94a1-f44cb565b3e7-catalog-content\") pod \"certified-operators-v7zlm\" (UID: \"a63e8904-d4b9-405f-94a1-f44cb565b3e7\") " pod="openshift-marketplace/certified-operators-v7zlm" Feb 20 10:00:53 crc kubenswrapper[4962]: I0220 10:00:53.468176 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66gmr\" (UniqueName: \"kubernetes.io/projected/a63e8904-d4b9-405f-94a1-f44cb565b3e7-kube-api-access-66gmr\") pod \"certified-operators-v7zlm\" (UID: \"a63e8904-d4b9-405f-94a1-f44cb565b3e7\") " pod="openshift-marketplace/certified-operators-v7zlm" Feb 20 10:00:53 crc kubenswrapper[4962]: I0220 10:00:53.537640 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-v7zlm" Feb 20 10:00:53 crc kubenswrapper[4962]: I0220 10:00:53.550351 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16b4ee5b-87f1-4b91-abd0-d2a7eb56e7bd-utilities\") pod \"community-operators-lsx57\" (UID: \"16b4ee5b-87f1-4b91-abd0-d2a7eb56e7bd\") " pod="openshift-marketplace/community-operators-lsx57" Feb 20 10:00:53 crc kubenswrapper[4962]: I0220 10:00:53.550411 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bg7bf\" (UniqueName: \"kubernetes.io/projected/16b4ee5b-87f1-4b91-abd0-d2a7eb56e7bd-kube-api-access-bg7bf\") pod \"community-operators-lsx57\" (UID: \"16b4ee5b-87f1-4b91-abd0-d2a7eb56e7bd\") " pod="openshift-marketplace/community-operators-lsx57" Feb 20 10:00:53 crc kubenswrapper[4962]: I0220 10:00:53.550762 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16b4ee5b-87f1-4b91-abd0-d2a7eb56e7bd-catalog-content\") pod \"community-operators-lsx57\" (UID: \"16b4ee5b-87f1-4b91-abd0-d2a7eb56e7bd\") " pod="openshift-marketplace/community-operators-lsx57" Feb 20 10:00:53 crc kubenswrapper[4962]: I0220 10:00:53.652623 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16b4ee5b-87f1-4b91-abd0-d2a7eb56e7bd-catalog-content\") pod \"community-operators-lsx57\" (UID: \"16b4ee5b-87f1-4b91-abd0-d2a7eb56e7bd\") " pod="openshift-marketplace/community-operators-lsx57" Feb 20 10:00:53 crc kubenswrapper[4962]: I0220 10:00:53.653064 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16b4ee5b-87f1-4b91-abd0-d2a7eb56e7bd-utilities\") pod \"community-operators-lsx57\" (UID: \"16b4ee5b-87f1-4b91-abd0-d2a7eb56e7bd\") " pod="openshift-marketplace/community-operators-lsx57" Feb 20 10:00:53 crc kubenswrapper[4962]: I0220 10:00:53.653092 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bg7bf\" (UniqueName: \"kubernetes.io/projected/16b4ee5b-87f1-4b91-abd0-d2a7eb56e7bd-kube-api-access-bg7bf\") pod \"community-operators-lsx57\" (UID: \"16b4ee5b-87f1-4b91-abd0-d2a7eb56e7bd\") " pod="openshift-marketplace/community-operators-lsx57" Feb 20 10:00:53 crc kubenswrapper[4962]: I0220 10:00:53.653474 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16b4ee5b-87f1-4b91-abd0-d2a7eb56e7bd-catalog-content\") pod \"community-operators-lsx57\" (UID: \"16b4ee5b-87f1-4b91-abd0-d2a7eb56e7bd\") " pod="openshift-marketplace/community-operators-lsx57" Feb 20 10:00:53 crc kubenswrapper[4962]: I0220 10:00:53.653952 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16b4ee5b-87f1-4b91-abd0-d2a7eb56e7bd-utilities\") pod \"community-operators-lsx57\" (UID: \"16b4ee5b-87f1-4b91-abd0-d2a7eb56e7bd\") " pod="openshift-marketplace/community-operators-lsx57" Feb 20 10:00:53 crc kubenswrapper[4962]: I0220 10:00:53.672660 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bg7bf\" (UniqueName: \"kubernetes.io/projected/16b4ee5b-87f1-4b91-abd0-d2a7eb56e7bd-kube-api-access-bg7bf\") pod 
\"community-operators-lsx57\" (UID: \"16b4ee5b-87f1-4b91-abd0-d2a7eb56e7bd\") " pod="openshift-marketplace/community-operators-lsx57" Feb 20 10:00:53 crc kubenswrapper[4962]: I0220 10:00:53.810830 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lsx57" Feb 20 10:00:53 crc kubenswrapper[4962]: I0220 10:00:53.985028 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-v7zlm"] Feb 20 10:00:53 crc kubenswrapper[4962]: W0220 10:00:53.990320 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda63e8904_d4b9_405f_94a1_f44cb565b3e7.slice/crio-8a31164fd2d499337255d6b6c8ff41059d39c8ca1c0d9b36dc4180dcc63a2f70 WatchSource:0}: Error finding container 8a31164fd2d499337255d6b6c8ff41059d39c8ca1c0d9b36dc4180dcc63a2f70: Status 404 returned error can't find the container with id 8a31164fd2d499337255d6b6c8ff41059d39c8ca1c0d9b36dc4180dcc63a2f70 Feb 20 10:00:54 crc kubenswrapper[4962]: I0220 10:00:54.185791 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sl4km" event={"ID":"0e92c119-6503-4fc1-b607-0d41d821e8fe","Type":"ContainerStarted","Data":"f151c6c6ae9163db4285b46a590a250b44e12af14e3ee33e2dacf43e63c1f999"} Feb 20 10:00:54 crc kubenswrapper[4962]: I0220 10:00:54.187508 4962 generic.go:334] "Generic (PLEG): container finished" podID="82f8db6b-4715-42f3-a705-821af9e03156" containerID="5b3dc93152a6aa3e4d4bdd5da4281efd77087dffbfeb56667dea0c10ca907dd6" exitCode=0 Feb 20 10:00:54 crc kubenswrapper[4962]: I0220 10:00:54.187594 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j9hxw" event={"ID":"82f8db6b-4715-42f3-a705-821af9e03156","Type":"ContainerDied","Data":"5b3dc93152a6aa3e4d4bdd5da4281efd77087dffbfeb56667dea0c10ca907dd6"} Feb 20 10:00:54 crc kubenswrapper[4962]: I0220 10:00:54.189470 4962 generic.go:334] "Generic (PLEG): container finished" podID="a63e8904-d4b9-405f-94a1-f44cb565b3e7" containerID="0dadcfbc9540e03300cad39a3785cea06d52c326b71fdd2b10314f009df918de" exitCode=0 Feb 20 10:00:54 crc kubenswrapper[4962]: I0220 10:00:54.189511 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v7zlm" event={"ID":"a63e8904-d4b9-405f-94a1-f44cb565b3e7","Type":"ContainerDied","Data":"0dadcfbc9540e03300cad39a3785cea06d52c326b71fdd2b10314f009df918de"} Feb 20 10:00:54 crc kubenswrapper[4962]: I0220 10:00:54.189541 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v7zlm" event={"ID":"a63e8904-d4b9-405f-94a1-f44cb565b3e7","Type":"ContainerStarted","Data":"8a31164fd2d499337255d6b6c8ff41059d39c8ca1c0d9b36dc4180dcc63a2f70"} Feb 20 10:00:54 crc kubenswrapper[4962]: I0220 10:00:54.214519 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-sl4km" podStartSLOduration=2.713011942 podStartE2EDuration="4.214493209s" podCreationTimestamp="2026-02-20 10:00:50 +0000 UTC" firstStartedPulling="2026-02-20 10:00:52.10919236 +0000 UTC m=+343.691664226" lastFinishedPulling="2026-02-20 10:00:53.610673647 +0000 UTC m=+345.193145493" observedRunningTime="2026-02-20 10:00:54.211524304 +0000 UTC m=+345.793996160" watchObservedRunningTime="2026-02-20 10:00:54.214493209 +0000 UTC m=+345.796965065" Feb 20 10:00:54 crc kubenswrapper[4962]: I0220 10:00:54.241785 4962 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lsx57"] Feb 20 10:00:54 crc kubenswrapper[4962]: W0220 10:00:54.245951 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16b4ee5b_87f1_4b91_abd0_d2a7eb56e7bd.slice/crio-f5ac4dc3c6b8ac793d2e417c77fdfae4619ef7c03650d5b8771f8bfb483d0c5b WatchSource:0}: Error finding container f5ac4dc3c6b8ac793d2e417c77fdfae4619ef7c03650d5b8771f8bfb483d0c5b: Status 404 returned error can't find the container with id f5ac4dc3c6b8ac793d2e417c77fdfae4619ef7c03650d5b8771f8bfb483d0c5b Feb 20 10:00:55 crc kubenswrapper[4962]: I0220 10:00:55.200450 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j9hxw" event={"ID":"82f8db6b-4715-42f3-a705-821af9e03156","Type":"ContainerStarted","Data":"63089f43c76a2f21cbce4a8383f7399bc851fc4bd66badd6f54366ce687b26c8"} Feb 20 10:00:55 crc kubenswrapper[4962]: I0220 10:00:55.203476 4962 generic.go:334] "Generic (PLEG): container finished" podID="16b4ee5b-87f1-4b91-abd0-d2a7eb56e7bd" containerID="05f8b028cb076a04d157dea166964f8cd1ad3e2d4cb618244e94ea9a474f4dbd" exitCode=0 Feb 20 10:00:55 crc kubenswrapper[4962]: I0220 10:00:55.203570 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lsx57" event={"ID":"16b4ee5b-87f1-4b91-abd0-d2a7eb56e7bd","Type":"ContainerDied","Data":"05f8b028cb076a04d157dea166964f8cd1ad3e2d4cb618244e94ea9a474f4dbd"} Feb 20 10:00:55 crc kubenswrapper[4962]: I0220 10:00:55.203657 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lsx57" event={"ID":"16b4ee5b-87f1-4b91-abd0-d2a7eb56e7bd","Type":"ContainerStarted","Data":"f5ac4dc3c6b8ac793d2e417c77fdfae4619ef7c03650d5b8771f8bfb483d0c5b"} Feb 20 10:00:55 crc kubenswrapper[4962]: I0220 10:00:55.218645 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-j9hxw" podStartSLOduration=2.7043437150000003 podStartE2EDuration="5.218626336s" podCreationTimestamp="2026-02-20 10:00:50 +0000 UTC" firstStartedPulling="2026-02-20 10:00:52.115625304 +0000 UTC m=+343.698097150" lastFinishedPulling="2026-02-20 10:00:54.629907925 +0000 UTC m=+346.212379771" observedRunningTime="2026-02-20 10:00:55.216982625 +0000 UTC m=+346.799454471" watchObservedRunningTime="2026-02-20 10:00:55.218626336 +0000 UTC m=+346.801098182" Feb 20 10:00:56 crc kubenswrapper[4962]: I0220 10:00:56.211736 4962 generic.go:334] "Generic (PLEG): container finished" podID="16b4ee5b-87f1-4b91-abd0-d2a7eb56e7bd" containerID="3cd190a0c2fc6a4735fdad7f6ba7fdb0f7749184990d46ef96f36b2f1568c407" exitCode=0 Feb 20 10:00:56 crc kubenswrapper[4962]: I0220 10:00:56.211815 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lsx57" event={"ID":"16b4ee5b-87f1-4b91-abd0-d2a7eb56e7bd","Type":"ContainerDied","Data":"3cd190a0c2fc6a4735fdad7f6ba7fdb0f7749184990d46ef96f36b2f1568c407"} Feb 20 10:00:56 crc kubenswrapper[4962]: I0220 10:00:56.213877 4962 generic.go:334] "Generic (PLEG): container finished" podID="a63e8904-d4b9-405f-94a1-f44cb565b3e7" containerID="4a3a18f226977c3365d615b122597c40567fbf4037342852763a1652d9c44e94" exitCode=0 Feb 20 10:00:56 crc kubenswrapper[4962]: I0220 10:00:56.213935 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v7zlm" 
event={"ID":"a63e8904-d4b9-405f-94a1-f44cb565b3e7","Type":"ContainerDied","Data":"4a3a18f226977c3365d615b122597c40567fbf4037342852763a1652d9c44e94"} Feb 20 10:00:57 crc kubenswrapper[4962]: I0220 10:00:57.219954 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v7zlm" event={"ID":"a63e8904-d4b9-405f-94a1-f44cb565b3e7","Type":"ContainerStarted","Data":"89e68b25becf346a76d704d66f2fa088754a410ce947cc372fb175cd6ff921ab"} Feb 20 10:00:57 crc kubenswrapper[4962]: I0220 10:00:57.223020 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lsx57" event={"ID":"16b4ee5b-87f1-4b91-abd0-d2a7eb56e7bd","Type":"ContainerStarted","Data":"7137b7222dc34e33b4b934560f10e0f159557bbaac8ab7183eaa47b62b1b2dd5"} Feb 20 10:00:57 crc kubenswrapper[4962]: I0220 10:00:57.245124 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-v7zlm" podStartSLOduration=1.842846029 podStartE2EDuration="4.245103951s" podCreationTimestamp="2026-02-20 10:00:53 +0000 UTC" firstStartedPulling="2026-02-20 10:00:54.190718323 +0000 UTC m=+345.773190179" lastFinishedPulling="2026-02-20 10:00:56.592976255 +0000 UTC m=+348.175448101" observedRunningTime="2026-02-20 10:00:57.2409529 +0000 UTC m=+348.823424746" watchObservedRunningTime="2026-02-20 10:00:57.245103951 +0000 UTC m=+348.827575797" Feb 20 10:00:57 crc kubenswrapper[4962]: I0220 10:00:57.263817 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lsx57" podStartSLOduration=2.860559978 podStartE2EDuration="4.263798895s" podCreationTimestamp="2026-02-20 10:00:53 +0000 UTC" firstStartedPulling="2026-02-20 10:00:55.204969503 +0000 UTC m=+346.787441349" lastFinishedPulling="2026-02-20 10:00:56.60820842 +0000 UTC m=+348.190680266" observedRunningTime="2026-02-20 10:00:57.260417998 +0000 UTC m=+348.842889844" watchObservedRunningTime="2026-02-20 10:00:57.263798895 +0000 UTC m=+348.846270741" Feb 20 10:00:58 crc kubenswrapper[4962]: I0220 10:00:58.602791 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-lllvp" Feb 20 10:00:58 crc kubenswrapper[4962]: I0220 10:00:58.678441 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-8pks8"] Feb 20 10:01:01 crc kubenswrapper[4962]: I0220 10:01:01.145230 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-sl4km" Feb 20 10:01:01 crc kubenswrapper[4962]: I0220 10:01:01.145597 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-sl4km" Feb 20 10:01:01 crc kubenswrapper[4962]: I0220 10:01:01.181412 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-sl4km" Feb 20 10:01:01 crc kubenswrapper[4962]: I0220 10:01:01.292521 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-sl4km" Feb 20 10:01:01 crc kubenswrapper[4962]: I0220 10:01:01.298440 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-j9hxw" Feb 20 10:01:01 crc kubenswrapper[4962]: I0220 10:01:01.298481 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-operators-j9hxw" Feb 20 10:01:01 crc kubenswrapper[4962]: I0220 10:01:01.348427 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-j9hxw" Feb 20 10:01:02 crc kubenswrapper[4962]: I0220 10:01:02.313107 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-j9hxw" Feb 20 10:01:03 crc kubenswrapper[4962]: I0220 10:01:03.538053 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-v7zlm" Feb 20 10:01:03 crc kubenswrapper[4962]: I0220 10:01:03.538481 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-v7zlm" Feb 20 10:01:03 crc kubenswrapper[4962]: I0220 10:01:03.612417 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-v7zlm" Feb 20 10:01:03 crc kubenswrapper[4962]: I0220 10:01:03.811732 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lsx57" Feb 20 10:01:03 crc kubenswrapper[4962]: I0220 10:01:03.811791 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lsx57" Feb 20 10:01:03 crc kubenswrapper[4962]: I0220 10:01:03.876946 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lsx57" Feb 20 10:01:04 crc kubenswrapper[4962]: I0220 10:01:04.320942 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-v7zlm" Feb 20 10:01:04 crc kubenswrapper[4962]: I0220 10:01:04.329835 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lsx57" Feb 20 10:01:11 crc kubenswrapper[4962]: I0220 10:01:11.508266 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 10:01:11 crc kubenswrapper[4962]: I0220 10:01:11.508742 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 10:01:23 crc kubenswrapper[4962]: I0220 10:01:23.721516 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" podUID="b4ad1819-20e1-406b-8499-5a73780c0a0c" containerName="registry" containerID="cri-o://c741eb823ccf8c4784adbc060b958f08884c73108a1831a19813a3f4b3898e46" gracePeriod=30 Feb 20 10:01:24 crc kubenswrapper[4962]: I0220 10:01:24.239939 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 10:01:24 crc kubenswrapper[4962]: I0220 10:01:24.333100 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b4ad1819-20e1-406b-8499-5a73780c0a0c-registry-tls\") pod \"b4ad1819-20e1-406b-8499-5a73780c0a0c\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " Feb 20 10:01:24 crc kubenswrapper[4962]: I0220 10:01:24.333238 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"b4ad1819-20e1-406b-8499-5a73780c0a0c\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " Feb 20 10:01:24 crc kubenswrapper[4962]: I0220 10:01:24.333320 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b4ad1819-20e1-406b-8499-5a73780c0a0c-trusted-ca\") pod \"b4ad1819-20e1-406b-8499-5a73780c0a0c\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " Feb 20 10:01:24 crc kubenswrapper[4962]: I0220 10:01:24.333369 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b4ad1819-20e1-406b-8499-5a73780c0a0c-registry-certificates\") pod \"b4ad1819-20e1-406b-8499-5a73780c0a0c\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " Feb 20 10:01:24 crc kubenswrapper[4962]: I0220 10:01:24.333385 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gmtxm\" (UniqueName: \"kubernetes.io/projected/b4ad1819-20e1-406b-8499-5a73780c0a0c-kube-api-access-gmtxm\") pod \"b4ad1819-20e1-406b-8499-5a73780c0a0c\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " Feb 20 10:01:24 crc kubenswrapper[4962]: I0220 10:01:24.333472 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b4ad1819-20e1-406b-8499-5a73780c0a0c-ca-trust-extracted\") pod \"b4ad1819-20e1-406b-8499-5a73780c0a0c\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " Feb 20 10:01:24 crc kubenswrapper[4962]: I0220 10:01:24.334120 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b4ad1819-20e1-406b-8499-5a73780c0a0c-installation-pull-secrets\") pod \"b4ad1819-20e1-406b-8499-5a73780c0a0c\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " Feb 20 10:01:24 crc kubenswrapper[4962]: I0220 10:01:24.334151 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b4ad1819-20e1-406b-8499-5a73780c0a0c-bound-sa-token\") pod \"b4ad1819-20e1-406b-8499-5a73780c0a0c\" (UID: \"b4ad1819-20e1-406b-8499-5a73780c0a0c\") " Feb 20 10:01:24 crc kubenswrapper[4962]: I0220 10:01:24.334392 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4ad1819-20e1-406b-8499-5a73780c0a0c-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "b4ad1819-20e1-406b-8499-5a73780c0a0c" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:01:24 crc kubenswrapper[4962]: I0220 10:01:24.335141 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4ad1819-20e1-406b-8499-5a73780c0a0c-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "b4ad1819-20e1-406b-8499-5a73780c0a0c" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:01:24 crc kubenswrapper[4962]: I0220 10:01:24.339363 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4ad1819-20e1-406b-8499-5a73780c0a0c-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "b4ad1819-20e1-406b-8499-5a73780c0a0c" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:01:24 crc kubenswrapper[4962]: I0220 10:01:24.340105 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4ad1819-20e1-406b-8499-5a73780c0a0c-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "b4ad1819-20e1-406b-8499-5a73780c0a0c" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:01:24 crc kubenswrapper[4962]: I0220 10:01:24.340664 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4ad1819-20e1-406b-8499-5a73780c0a0c-kube-api-access-gmtxm" (OuterVolumeSpecName: "kube-api-access-gmtxm") pod "b4ad1819-20e1-406b-8499-5a73780c0a0c" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c"). InnerVolumeSpecName "kube-api-access-gmtxm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:01:24 crc kubenswrapper[4962]: I0220 10:01:24.340957 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4ad1819-20e1-406b-8499-5a73780c0a0c-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "b4ad1819-20e1-406b-8499-5a73780c0a0c" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:01:24 crc kubenswrapper[4962]: I0220 10:01:24.347798 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "b4ad1819-20e1-406b-8499-5a73780c0a0c" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 20 10:01:24 crc kubenswrapper[4962]: I0220 10:01:24.360759 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4ad1819-20e1-406b-8499-5a73780c0a0c-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "b4ad1819-20e1-406b-8499-5a73780c0a0c" (UID: "b4ad1819-20e1-406b-8499-5a73780c0a0c"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:01:24 crc kubenswrapper[4962]: I0220 10:01:24.373747 4962 generic.go:334] "Generic (PLEG): container finished" podID="b4ad1819-20e1-406b-8499-5a73780c0a0c" containerID="c741eb823ccf8c4784adbc060b958f08884c73108a1831a19813a3f4b3898e46" exitCode=0 Feb 20 10:01:24 crc kubenswrapper[4962]: I0220 10:01:24.373797 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" Feb 20 10:01:24 crc kubenswrapper[4962]: I0220 10:01:24.373818 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" event={"ID":"b4ad1819-20e1-406b-8499-5a73780c0a0c","Type":"ContainerDied","Data":"c741eb823ccf8c4784adbc060b958f08884c73108a1831a19813a3f4b3898e46"} Feb 20 10:01:24 crc kubenswrapper[4962]: I0220 10:01:24.374143 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-8pks8" event={"ID":"b4ad1819-20e1-406b-8499-5a73780c0a0c","Type":"ContainerDied","Data":"5004d974da71f7174ba7d6f42652143c4f7cb0b752e3647e653cb9e55b56d9b3"} Feb 20 10:01:24 crc kubenswrapper[4962]: I0220 10:01:24.374176 4962 scope.go:117] "RemoveContainer" containerID="c741eb823ccf8c4784adbc060b958f08884c73108a1831a19813a3f4b3898e46" Feb 20 10:01:24 crc kubenswrapper[4962]: I0220 10:01:24.402639 4962 scope.go:117] "RemoveContainer" containerID="c741eb823ccf8c4784adbc060b958f08884c73108a1831a19813a3f4b3898e46" Feb 20 10:01:24 crc kubenswrapper[4962]: E0220 10:01:24.403551 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c741eb823ccf8c4784adbc060b958f08884c73108a1831a19813a3f4b3898e46\": container with ID starting with c741eb823ccf8c4784adbc060b958f08884c73108a1831a19813a3f4b3898e46 not found: ID does not exist" containerID="c741eb823ccf8c4784adbc060b958f08884c73108a1831a19813a3f4b3898e46" Feb 20 10:01:24 crc kubenswrapper[4962]: I0220 10:01:24.403803 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c741eb823ccf8c4784adbc060b958f08884c73108a1831a19813a3f4b3898e46"} err="failed to get container status \"c741eb823ccf8c4784adbc060b958f08884c73108a1831a19813a3f4b3898e46\": rpc error: code = NotFound desc = could not find container \"c741eb823ccf8c4784adbc060b958f08884c73108a1831a19813a3f4b3898e46\": container with ID starting with c741eb823ccf8c4784adbc060b958f08884c73108a1831a19813a3f4b3898e46 not found: ID does not exist" Feb 20 10:01:24 crc kubenswrapper[4962]: I0220 10:01:24.418149 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-8pks8"] Feb 20 10:01:24 crc kubenswrapper[4962]: I0220 10:01:24.424961 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-8pks8"] Feb 20 10:01:24 crc kubenswrapper[4962]: I0220 10:01:24.435910 4962 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b4ad1819-20e1-406b-8499-5a73780c0a0c-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 20 10:01:24 crc kubenswrapper[4962]: I0220 10:01:24.436068 4962 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b4ad1819-20e1-406b-8499-5a73780c0a0c-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 20 10:01:24 crc 
kubenswrapper[4962]: I0220 10:01:24.436177 4962 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b4ad1819-20e1-406b-8499-5a73780c0a0c-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 20 10:01:24 crc kubenswrapper[4962]: I0220 10:01:24.436312 4962 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b4ad1819-20e1-406b-8499-5a73780c0a0c-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 20 10:01:24 crc kubenswrapper[4962]: I0220 10:01:24.436404 4962 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b4ad1819-20e1-406b-8499-5a73780c0a0c-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 20 10:01:24 crc kubenswrapper[4962]: I0220 10:01:24.436490 4962 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b4ad1819-20e1-406b-8499-5a73780c0a0c-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 20 10:01:24 crc kubenswrapper[4962]: I0220 10:01:24.436623 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gmtxm\" (UniqueName: \"kubernetes.io/projected/b4ad1819-20e1-406b-8499-5a73780c0a0c-kube-api-access-gmtxm\") on node \"crc\" DevicePath \"\"" Feb 20 10:01:25 crc kubenswrapper[4962]: I0220 10:01:25.153591 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4ad1819-20e1-406b-8499-5a73780c0a0c" path="/var/lib/kubelet/pods/b4ad1819-20e1-406b-8499-5a73780c0a0c/volumes" Feb 20 10:01:34 crc kubenswrapper[4962]: I0220 10:01:34.736287 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d74b7c87d-xgxtr"] Feb 20 10:01:34 crc kubenswrapper[4962]: I0220 10:01:34.737499 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5d74b7c87d-xgxtr" podUID="2c2178fa-96db-4c48-bbb2-b4533bb86944" containerName="route-controller-manager" containerID="cri-o://4a00abb89a1826b90b36968959f8f06817d338450f47c0747609b7ea230e6ab5" gracePeriod=30 Feb 20 10:01:35 crc kubenswrapper[4962]: I0220 10:01:35.138146 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5d74b7c87d-xgxtr" Feb 20 10:01:35 crc kubenswrapper[4962]: I0220 10:01:35.314699 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2c2178fa-96db-4c48-bbb2-b4533bb86944-client-ca\") pod \"2c2178fa-96db-4c48-bbb2-b4533bb86944\" (UID: \"2c2178fa-96db-4c48-bbb2-b4533bb86944\") " Feb 20 10:01:35 crc kubenswrapper[4962]: I0220 10:01:35.314925 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hv9rk\" (UniqueName: \"kubernetes.io/projected/2c2178fa-96db-4c48-bbb2-b4533bb86944-kube-api-access-hv9rk\") pod \"2c2178fa-96db-4c48-bbb2-b4533bb86944\" (UID: \"2c2178fa-96db-4c48-bbb2-b4533bb86944\") " Feb 20 10:01:35 crc kubenswrapper[4962]: I0220 10:01:35.315066 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c2178fa-96db-4c48-bbb2-b4533bb86944-config\") pod \"2c2178fa-96db-4c48-bbb2-b4533bb86944\" (UID: \"2c2178fa-96db-4c48-bbb2-b4533bb86944\") " Feb 20 10:01:35 crc kubenswrapper[4962]: I0220 10:01:35.315829 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c2178fa-96db-4c48-bbb2-b4533bb86944-serving-cert\") pod \"2c2178fa-96db-4c48-bbb2-b4533bb86944\" (UID: \"2c2178fa-96db-4c48-bbb2-b4533bb86944\") " Feb 20 10:01:35 crc kubenswrapper[4962]: I0220 10:01:35.316703 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c2178fa-96db-4c48-bbb2-b4533bb86944-client-ca" (OuterVolumeSpecName: "client-ca") pod "2c2178fa-96db-4c48-bbb2-b4533bb86944" (UID: "2c2178fa-96db-4c48-bbb2-b4533bb86944"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:01:35 crc kubenswrapper[4962]: I0220 10:01:35.316731 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c2178fa-96db-4c48-bbb2-b4533bb86944-config" (OuterVolumeSpecName: "config") pod "2c2178fa-96db-4c48-bbb2-b4533bb86944" (UID: "2c2178fa-96db-4c48-bbb2-b4533bb86944"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:01:35 crc kubenswrapper[4962]: I0220 10:01:35.321719 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c2178fa-96db-4c48-bbb2-b4533bb86944-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2c2178fa-96db-4c48-bbb2-b4533bb86944" (UID: "2c2178fa-96db-4c48-bbb2-b4533bb86944"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:01:35 crc kubenswrapper[4962]: I0220 10:01:35.322624 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c2178fa-96db-4c48-bbb2-b4533bb86944-kube-api-access-hv9rk" (OuterVolumeSpecName: "kube-api-access-hv9rk") pod "2c2178fa-96db-4c48-bbb2-b4533bb86944" (UID: "2c2178fa-96db-4c48-bbb2-b4533bb86944"). InnerVolumeSpecName "kube-api-access-hv9rk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:01:35 crc kubenswrapper[4962]: I0220 10:01:35.417944 4962 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c2178fa-96db-4c48-bbb2-b4533bb86944-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 10:01:35 crc kubenswrapper[4962]: I0220 10:01:35.418028 4962 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2c2178fa-96db-4c48-bbb2-b4533bb86944-client-ca\") on node \"crc\" DevicePath \"\"" Feb 20 10:01:35 crc kubenswrapper[4962]: I0220 10:01:35.418051 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hv9rk\" (UniqueName: \"kubernetes.io/projected/2c2178fa-96db-4c48-bbb2-b4533bb86944-kube-api-access-hv9rk\") on node \"crc\" DevicePath \"\"" Feb 20 10:01:35 crc kubenswrapper[4962]: I0220 10:01:35.418077 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c2178fa-96db-4c48-bbb2-b4533bb86944-config\") on node \"crc\" DevicePath \"\"" Feb 20 10:01:35 crc kubenswrapper[4962]: I0220 10:01:35.447123 4962 generic.go:334] "Generic (PLEG): container finished" podID="2c2178fa-96db-4c48-bbb2-b4533bb86944" containerID="4a00abb89a1826b90b36968959f8f06817d338450f47c0747609b7ea230e6ab5" exitCode=0 Feb 20 10:01:35 crc kubenswrapper[4962]: I0220 10:01:35.447206 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5d74b7c87d-xgxtr" event={"ID":"2c2178fa-96db-4c48-bbb2-b4533bb86944","Type":"ContainerDied","Data":"4a00abb89a1826b90b36968959f8f06817d338450f47c0747609b7ea230e6ab5"} Feb 20 10:01:35 crc kubenswrapper[4962]: I0220 10:01:35.447315 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5d74b7c87d-xgxtr" Feb 20 10:01:35 crc kubenswrapper[4962]: I0220 10:01:35.447347 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5d74b7c87d-xgxtr" event={"ID":"2c2178fa-96db-4c48-bbb2-b4533bb86944","Type":"ContainerDied","Data":"d2adb538934c40a54615b86e80a9725c5a492f27096c6f2895982f89652fdbc8"} Feb 20 10:01:35 crc kubenswrapper[4962]: I0220 10:01:35.447399 4962 scope.go:117] "RemoveContainer" containerID="4a00abb89a1826b90b36968959f8f06817d338450f47c0747609b7ea230e6ab5" Feb 20 10:01:35 crc kubenswrapper[4962]: I0220 10:01:35.478444 4962 scope.go:117] "RemoveContainer" containerID="4a00abb89a1826b90b36968959f8f06817d338450f47c0747609b7ea230e6ab5" Feb 20 10:01:35 crc kubenswrapper[4962]: E0220 10:01:35.479216 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a00abb89a1826b90b36968959f8f06817d338450f47c0747609b7ea230e6ab5\": container with ID starting with 4a00abb89a1826b90b36968959f8f06817d338450f47c0747609b7ea230e6ab5 not found: ID does not exist" containerID="4a00abb89a1826b90b36968959f8f06817d338450f47c0747609b7ea230e6ab5" Feb 20 10:01:35 crc kubenswrapper[4962]: I0220 10:01:35.479312 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a00abb89a1826b90b36968959f8f06817d338450f47c0747609b7ea230e6ab5"} err="failed to get container status \"4a00abb89a1826b90b36968959f8f06817d338450f47c0747609b7ea230e6ab5\": rpc error: code = NotFound desc = could not find container \"4a00abb89a1826b90b36968959f8f06817d338450f47c0747609b7ea230e6ab5\": container with ID starting with 4a00abb89a1826b90b36968959f8f06817d338450f47c0747609b7ea230e6ab5 not found: ID does not exist" Feb 20 10:01:35 crc kubenswrapper[4962]: I0220 10:01:35.496504 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d74b7c87d-xgxtr"] Feb 20 10:01:35 crc kubenswrapper[4962]: I0220 10:01:35.503121 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d74b7c87d-xgxtr"] Feb 20 10:01:36 crc kubenswrapper[4962]: I0220 10:01:36.484433 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7489547554-4cvq8"] Feb 20 10:01:36 crc kubenswrapper[4962]: E0220 10:01:36.484690 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4ad1819-20e1-406b-8499-5a73780c0a0c" containerName="registry" Feb 20 10:01:36 crc kubenswrapper[4962]: I0220 10:01:36.484706 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4ad1819-20e1-406b-8499-5a73780c0a0c" containerName="registry" Feb 20 10:01:36 crc kubenswrapper[4962]: E0220 10:01:36.484730 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c2178fa-96db-4c48-bbb2-b4533bb86944" containerName="route-controller-manager" Feb 20 10:01:36 crc kubenswrapper[4962]: I0220 10:01:36.484737 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c2178fa-96db-4c48-bbb2-b4533bb86944" containerName="route-controller-manager" Feb 20 10:01:36 crc kubenswrapper[4962]: I0220 10:01:36.484851 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c2178fa-96db-4c48-bbb2-b4533bb86944" containerName="route-controller-manager" Feb 20 10:01:36 crc kubenswrapper[4962]: I0220 10:01:36.484859 4962 
memory_manager.go:354] "RemoveStaleState removing state" podUID="b4ad1819-20e1-406b-8499-5a73780c0a0c" containerName="registry" Feb 20 10:01:36 crc kubenswrapper[4962]: I0220 10:01:36.485231 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7489547554-4cvq8" Feb 20 10:01:36 crc kubenswrapper[4962]: I0220 10:01:36.488060 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 20 10:01:36 crc kubenswrapper[4962]: I0220 10:01:36.488791 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 20 10:01:36 crc kubenswrapper[4962]: I0220 10:01:36.488982 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 20 10:01:36 crc kubenswrapper[4962]: I0220 10:01:36.489065 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 20 10:01:36 crc kubenswrapper[4962]: I0220 10:01:36.488989 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 20 10:01:36 crc kubenswrapper[4962]: I0220 10:01:36.491109 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 20 10:01:36 crc kubenswrapper[4962]: I0220 10:01:36.498616 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7489547554-4cvq8"] Feb 20 10:01:36 crc kubenswrapper[4962]: I0220 10:01:36.634208 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7a393401-cb35-4a65-9be1-cb3956d6b44a-client-ca\") pod \"route-controller-manager-7489547554-4cvq8\" (UID: \"7a393401-cb35-4a65-9be1-cb3956d6b44a\") " pod="openshift-route-controller-manager/route-controller-manager-7489547554-4cvq8" Feb 20 10:01:36 crc kubenswrapper[4962]: I0220 10:01:36.634372 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7a393401-cb35-4a65-9be1-cb3956d6b44a-serving-cert\") pod \"route-controller-manager-7489547554-4cvq8\" (UID: \"7a393401-cb35-4a65-9be1-cb3956d6b44a\") " pod="openshift-route-controller-manager/route-controller-manager-7489547554-4cvq8" Feb 20 10:01:36 crc kubenswrapper[4962]: I0220 10:01:36.634472 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a393401-cb35-4a65-9be1-cb3956d6b44a-config\") pod \"route-controller-manager-7489547554-4cvq8\" (UID: \"7a393401-cb35-4a65-9be1-cb3956d6b44a\") " pod="openshift-route-controller-manager/route-controller-manager-7489547554-4cvq8" Feb 20 10:01:36 crc kubenswrapper[4962]: I0220 10:01:36.634537 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zs5g2\" (UniqueName: \"kubernetes.io/projected/7a393401-cb35-4a65-9be1-cb3956d6b44a-kube-api-access-zs5g2\") pod \"route-controller-manager-7489547554-4cvq8\" (UID: \"7a393401-cb35-4a65-9be1-cb3956d6b44a\") " pod="openshift-route-controller-manager/route-controller-manager-7489547554-4cvq8" Feb 20 10:01:36 crc kubenswrapper[4962]: 
I0220 10:01:36.736872 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zs5g2\" (UniqueName: \"kubernetes.io/projected/7a393401-cb35-4a65-9be1-cb3956d6b44a-kube-api-access-zs5g2\") pod \"route-controller-manager-7489547554-4cvq8\" (UID: \"7a393401-cb35-4a65-9be1-cb3956d6b44a\") " pod="openshift-route-controller-manager/route-controller-manager-7489547554-4cvq8" Feb 20 10:01:36 crc kubenswrapper[4962]: I0220 10:01:36.736991 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7a393401-cb35-4a65-9be1-cb3956d6b44a-client-ca\") pod \"route-controller-manager-7489547554-4cvq8\" (UID: \"7a393401-cb35-4a65-9be1-cb3956d6b44a\") " pod="openshift-route-controller-manager/route-controller-manager-7489547554-4cvq8" Feb 20 10:01:36 crc kubenswrapper[4962]: I0220 10:01:36.737059 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7a393401-cb35-4a65-9be1-cb3956d6b44a-serving-cert\") pod \"route-controller-manager-7489547554-4cvq8\" (UID: \"7a393401-cb35-4a65-9be1-cb3956d6b44a\") " pod="openshift-route-controller-manager/route-controller-manager-7489547554-4cvq8" Feb 20 10:01:36 crc kubenswrapper[4962]: I0220 10:01:36.737120 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a393401-cb35-4a65-9be1-cb3956d6b44a-config\") pod \"route-controller-manager-7489547554-4cvq8\" (UID: \"7a393401-cb35-4a65-9be1-cb3956d6b44a\") " pod="openshift-route-controller-manager/route-controller-manager-7489547554-4cvq8" Feb 20 10:01:36 crc kubenswrapper[4962]: I0220 10:01:36.739161 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7a393401-cb35-4a65-9be1-cb3956d6b44a-client-ca\") pod \"route-controller-manager-7489547554-4cvq8\" (UID: \"7a393401-cb35-4a65-9be1-cb3956d6b44a\") " pod="openshift-route-controller-manager/route-controller-manager-7489547554-4cvq8" Feb 20 10:01:36 crc kubenswrapper[4962]: I0220 10:01:36.739280 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a393401-cb35-4a65-9be1-cb3956d6b44a-config\") pod \"route-controller-manager-7489547554-4cvq8\" (UID: \"7a393401-cb35-4a65-9be1-cb3956d6b44a\") " pod="openshift-route-controller-manager/route-controller-manager-7489547554-4cvq8" Feb 20 10:01:36 crc kubenswrapper[4962]: I0220 10:01:36.752847 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7a393401-cb35-4a65-9be1-cb3956d6b44a-serving-cert\") pod \"route-controller-manager-7489547554-4cvq8\" (UID: \"7a393401-cb35-4a65-9be1-cb3956d6b44a\") " pod="openshift-route-controller-manager/route-controller-manager-7489547554-4cvq8" Feb 20 10:01:36 crc kubenswrapper[4962]: I0220 10:01:36.770203 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zs5g2\" (UniqueName: \"kubernetes.io/projected/7a393401-cb35-4a65-9be1-cb3956d6b44a-kube-api-access-zs5g2\") pod \"route-controller-manager-7489547554-4cvq8\" (UID: \"7a393401-cb35-4a65-9be1-cb3956d6b44a\") " pod="openshift-route-controller-manager/route-controller-manager-7489547554-4cvq8" Feb 20 10:01:36 crc kubenswrapper[4962]: I0220 10:01:36.810214 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7489547554-4cvq8" Feb 20 10:01:37 crc kubenswrapper[4962]: I0220 10:01:37.081933 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7489547554-4cvq8"] Feb 20 10:01:37 crc kubenswrapper[4962]: W0220 10:01:37.093133 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a393401_cb35_4a65_9be1_cb3956d6b44a.slice/crio-2b98ebe61556d954740f52fcd08e56f5013a6769dc73cf8badb3f9129c05b9d6 WatchSource:0}: Error finding container 2b98ebe61556d954740f52fcd08e56f5013a6769dc73cf8badb3f9129c05b9d6: Status 404 returned error can't find the container with id 2b98ebe61556d954740f52fcd08e56f5013a6769dc73cf8badb3f9129c05b9d6 Feb 20 10:01:37 crc kubenswrapper[4962]: I0220 10:01:37.146342 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c2178fa-96db-4c48-bbb2-b4533bb86944" path="/var/lib/kubelet/pods/2c2178fa-96db-4c48-bbb2-b4533bb86944/volumes" Feb 20 10:01:37 crc kubenswrapper[4962]: I0220 10:01:37.471826 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7489547554-4cvq8" event={"ID":"7a393401-cb35-4a65-9be1-cb3956d6b44a","Type":"ContainerStarted","Data":"057587bac126991d0fab9a799fe34d5777fe6a028016633e8281ad0b5a6efe21"} Feb 20 10:01:37 crc kubenswrapper[4962]: I0220 10:01:37.471871 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7489547554-4cvq8" event={"ID":"7a393401-cb35-4a65-9be1-cb3956d6b44a","Type":"ContainerStarted","Data":"2b98ebe61556d954740f52fcd08e56f5013a6769dc73cf8badb3f9129c05b9d6"} Feb 20 10:01:37 crc kubenswrapper[4962]: I0220 10:01:37.472185 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7489547554-4cvq8" Feb 20 10:01:37 crc kubenswrapper[4962]: I0220 10:01:37.497340 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7489547554-4cvq8" podStartSLOduration=3.497321296 podStartE2EDuration="3.497321296s" podCreationTimestamp="2026-02-20 10:01:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:01:37.496364236 +0000 UTC m=+389.078836122" watchObservedRunningTime="2026-02-20 10:01:37.497321296 +0000 UTC m=+389.079793142" Feb 20 10:01:37 crc kubenswrapper[4962]: I0220 10:01:37.623633 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7489547554-4cvq8" Feb 20 10:01:41 crc kubenswrapper[4962]: I0220 10:01:41.508239 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 10:01:41 crc kubenswrapper[4962]: I0220 10:01:41.509092 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Feb 20 10:01:41 crc kubenswrapper[4962]: I0220 10:01:41.510416 4962 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" Feb 20 10:01:41 crc kubenswrapper[4962]: I0220 10:01:41.511773 4962 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"28f57d6af11459adc6bb1afb41198ef7b8d5795fd383a2c166570a156f5d42fa"} pod="openshift-machine-config-operator/machine-config-daemon-m9d46" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 10:01:41 crc kubenswrapper[4962]: I0220 10:01:41.511886 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" containerID="cri-o://28f57d6af11459adc6bb1afb41198ef7b8d5795fd383a2c166570a156f5d42fa" gracePeriod=600 Feb 20 10:01:42 crc kubenswrapper[4962]: I0220 10:01:42.515117 4962 generic.go:334] "Generic (PLEG): container finished" podID="751d5e0b-919c-4777-8475-ed7214f7647f" containerID="28f57d6af11459adc6bb1afb41198ef7b8d5795fd383a2c166570a156f5d42fa" exitCode=0 Feb 20 10:01:42 crc kubenswrapper[4962]: I0220 10:01:42.515244 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" event={"ID":"751d5e0b-919c-4777-8475-ed7214f7647f","Type":"ContainerDied","Data":"28f57d6af11459adc6bb1afb41198ef7b8d5795fd383a2c166570a156f5d42fa"} Feb 20 10:01:42 crc kubenswrapper[4962]: I0220 10:01:42.515792 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" event={"ID":"751d5e0b-919c-4777-8475-ed7214f7647f","Type":"ContainerStarted","Data":"1b8acb71d346ae4db2a885f82208122a00bcf52e171aaee5f30a374f13e64838"} Feb 20 10:01:42 crc kubenswrapper[4962]: I0220 10:01:42.515831 4962 scope.go:117] "RemoveContainer" containerID="dffe20e45069ebd85e9ad49e365ac5180c1e43d02340190615a38900f527e432" Feb 20 10:03:41 crc kubenswrapper[4962]: I0220 10:03:41.508722 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 10:03:41 crc kubenswrapper[4962]: I0220 10:03:41.509828 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 10:04:11 crc kubenswrapper[4962]: I0220 10:04:11.508842 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 10:04:11 crc kubenswrapper[4962]: I0220 10:04:11.509534 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 10:04:41 crc kubenswrapper[4962]: I0220 10:04:41.508863 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 10:04:41 crc kubenswrapper[4962]: I0220 10:04:41.510880 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 10:04:41 crc kubenswrapper[4962]: I0220 10:04:41.510974 4962 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" Feb 20 10:04:41 crc kubenswrapper[4962]: I0220 10:04:41.511620 4962 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1b8acb71d346ae4db2a885f82208122a00bcf52e171aaee5f30a374f13e64838"} pod="openshift-machine-config-operator/machine-config-daemon-m9d46" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 10:04:41 crc kubenswrapper[4962]: I0220 10:04:41.511680 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" containerID="cri-o://1b8acb71d346ae4db2a885f82208122a00bcf52e171aaee5f30a374f13e64838" gracePeriod=600 Feb 20 10:04:41 crc kubenswrapper[4962]: I0220 10:04:41.770579 4962 generic.go:334] "Generic (PLEG): container finished" podID="751d5e0b-919c-4777-8475-ed7214f7647f" containerID="1b8acb71d346ae4db2a885f82208122a00bcf52e171aaee5f30a374f13e64838" exitCode=0 Feb 20 10:04:41 crc kubenswrapper[4962]: I0220 10:04:41.770642 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" event={"ID":"751d5e0b-919c-4777-8475-ed7214f7647f","Type":"ContainerDied","Data":"1b8acb71d346ae4db2a885f82208122a00bcf52e171aaee5f30a374f13e64838"} Feb 20 10:04:41 crc kubenswrapper[4962]: I0220 10:04:41.770676 4962 scope.go:117] "RemoveContainer" containerID="28f57d6af11459adc6bb1afb41198ef7b8d5795fd383a2c166570a156f5d42fa" Feb 20 10:04:42 crc kubenswrapper[4962]: I0220 10:04:42.778769 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" event={"ID":"751d5e0b-919c-4777-8475-ed7214f7647f","Type":"ContainerStarted","Data":"f2df44fd178e1ec428f4f1c5bbae3c8b24f98950b6fec19e9719325e0843ea14"} Feb 20 10:06:19 crc kubenswrapper[4962]: I0220 10:06:19.900186 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-99b2s"] Feb 20 10:06:19 crc kubenswrapper[4962]: I0220 10:06:19.901581 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" containerName="ovn-controller" 
containerID="cri-o://2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117" gracePeriod=30 Feb 20 10:06:19 crc kubenswrapper[4962]: I0220 10:06:19.901787 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" containerName="northd" containerID="cri-o://582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584" gracePeriod=30 Feb 20 10:06:19 crc kubenswrapper[4962]: I0220 10:06:19.901735 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" containerName="nbdb" containerID="cri-o://50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378" gracePeriod=30 Feb 20 10:06:19 crc kubenswrapper[4962]: I0220 10:06:19.901843 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f" gracePeriod=30 Feb 20 10:06:19 crc kubenswrapper[4962]: I0220 10:06:19.901895 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" containerName="kube-rbac-proxy-node" containerID="cri-o://03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f" gracePeriod=30 Feb 20 10:06:19 crc kubenswrapper[4962]: I0220 10:06:19.901939 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" containerName="ovn-acl-logging" containerID="cri-o://36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a" gracePeriod=30 Feb 20 10:06:19 crc kubenswrapper[4962]: I0220 10:06:19.901853 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" containerName="sbdb" containerID="cri-o://195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8" gracePeriod=30 Feb 20 10:06:19 crc kubenswrapper[4962]: I0220 10:06:19.947054 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" containerName="ovnkube-controller" containerID="cri-o://632f598a6bf02b19952a264fbef27b389b05daf66577d38a14dbb23054d239af" gracePeriod=30 Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.257211 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-99b2s_2abd2b70-bb78-49a0-b930-cd066384e803/ovnkube-controller/3.log" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.259871 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-99b2s_2abd2b70-bb78-49a0-b930-cd066384e803/ovn-acl-logging/0.log" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.260239 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-99b2s_2abd2b70-bb78-49a0-b930-cd066384e803/ovn-controller/0.log" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.260574 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.330778 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-z6xc2"] Feb 20 10:06:20 crc kubenswrapper[4962]: E0220 10:06:20.331292 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" containerName="nbdb" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.331372 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" containerName="nbdb" Feb 20 10:06:20 crc kubenswrapper[4962]: E0220 10:06:20.331432 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" containerName="ovnkube-controller" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.331480 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" containerName="ovnkube-controller" Feb 20 10:06:20 crc kubenswrapper[4962]: E0220 10:06:20.331523 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" containerName="ovn-controller" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.331570 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" containerName="ovn-controller" Feb 20 10:06:20 crc kubenswrapper[4962]: E0220 10:06:20.331649 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" containerName="kube-rbac-proxy-node" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.331701 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" containerName="kube-rbac-proxy-node" Feb 20 10:06:20 crc kubenswrapper[4962]: E0220 10:06:20.331769 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" containerName="ovnkube-controller" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.331844 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" containerName="ovnkube-controller" Feb 20 10:06:20 crc kubenswrapper[4962]: E0220 10:06:20.331897 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" containerName="ovn-acl-logging" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.331946 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" containerName="ovn-acl-logging" Feb 20 10:06:20 crc kubenswrapper[4962]: E0220 10:06:20.332000 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" containerName="kubecfg-setup" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.332055 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" containerName="kubecfg-setup" Feb 20 10:06:20 crc kubenswrapper[4962]: E0220 10:06:20.332105 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" containerName="northd" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.332154 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" containerName="northd" Feb 20 10:06:20 crc kubenswrapper[4962]: E0220 10:06:20.332280 4962 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="2abd2b70-bb78-49a0-b930-cd066384e803" containerName="kube-rbac-proxy-ovn-metrics" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.332336 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" containerName="kube-rbac-proxy-ovn-metrics" Feb 20 10:06:20 crc kubenswrapper[4962]: E0220 10:06:20.332684 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" containerName="sbdb" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.332755 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" containerName="sbdb" Feb 20 10:06:20 crc kubenswrapper[4962]: E0220 10:06:20.332811 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" containerName="ovnkube-controller" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.332860 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" containerName="ovnkube-controller" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.333100 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" containerName="kube-rbac-proxy-node" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.333160 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" containerName="sbdb" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.333212 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" containerName="ovnkube-controller" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.333257 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" containerName="ovn-acl-logging" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.333307 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" containerName="ovnkube-controller" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.333359 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" containerName="nbdb" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.333410 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" containerName="ovnkube-controller" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.333482 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" containerName="northd" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.333630 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" containerName="kube-rbac-proxy-ovn-metrics" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.333690 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" containerName="ovnkube-controller" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.333738 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" containerName="ovnkube-controller" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.333799 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" containerName="ovn-controller" Feb 20 10:06:20 crc kubenswrapper[4962]: E0220 
10:06:20.333961 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" containerName="ovnkube-controller" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.334015 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" containerName="ovnkube-controller" Feb 20 10:06:20 crc kubenswrapper[4962]: E0220 10:06:20.334243 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" containerName="ovnkube-controller" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.334297 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" containerName="ovnkube-controller" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.335907 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.361484 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e12256fd-84a5-4a79-b750-20b5a64bd4c9-ovn-node-metrics-cert\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.361539 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgsdk\" (UniqueName: \"kubernetes.io/projected/e12256fd-84a5-4a79-b750-20b5a64bd4c9-kube-api-access-jgsdk\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.361579 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e12256fd-84a5-4a79-b750-20b5a64bd4c9-ovnkube-script-lib\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.361639 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e12256fd-84a5-4a79-b750-20b5a64bd4c9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.361672 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e12256fd-84a5-4a79-b750-20b5a64bd4c9-run-ovn\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.361698 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e12256fd-84a5-4a79-b750-20b5a64bd4c9-host-run-netns\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.361730 4962 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e12256fd-84a5-4a79-b750-20b5a64bd4c9-host-slash\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.361776 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e12256fd-84a5-4a79-b750-20b5a64bd4c9-env-overrides\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.361812 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e12256fd-84a5-4a79-b750-20b5a64bd4c9-ovnkube-config\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.361936 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e12256fd-84a5-4a79-b750-20b5a64bd4c9-systemd-units\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.361994 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e12256fd-84a5-4a79-b750-20b5a64bd4c9-etc-openvswitch\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.362119 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e12256fd-84a5-4a79-b750-20b5a64bd4c9-run-openvswitch\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.362173 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e12256fd-84a5-4a79-b750-20b5a64bd4c9-host-cni-bin\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.362207 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e12256fd-84a5-4a79-b750-20b5a64bd4c9-host-run-ovn-kubernetes\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.362265 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e12256fd-84a5-4a79-b750-20b5a64bd4c9-host-cni-netd\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 
crc kubenswrapper[4962]: I0220 10:06:20.362326 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e12256fd-84a5-4a79-b750-20b5a64bd4c9-run-systemd\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.362403 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e12256fd-84a5-4a79-b750-20b5a64bd4c9-log-socket\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.362451 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e12256fd-84a5-4a79-b750-20b5a64bd4c9-host-kubelet\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.362500 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e12256fd-84a5-4a79-b750-20b5a64bd4c9-var-lib-openvswitch\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.362548 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e12256fd-84a5-4a79-b750-20b5a64bd4c9-node-log\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.458239 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wqwgj_1957ac70-30f9-48c2-a82b-72aa3b7a883a/kube-multus/2.log" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.459005 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wqwgj_1957ac70-30f9-48c2-a82b-72aa3b7a883a/kube-multus/1.log" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.459093 4962 generic.go:334] "Generic (PLEG): container finished" podID="1957ac70-30f9-48c2-a82b-72aa3b7a883a" containerID="1bcd3b5d415fdd3c80c493728dbec002cdd2c25c6bba4eb1580552f0bd623cef" exitCode=2 Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.459194 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wqwgj" event={"ID":"1957ac70-30f9-48c2-a82b-72aa3b7a883a","Type":"ContainerDied","Data":"1bcd3b5d415fdd3c80c493728dbec002cdd2c25c6bba4eb1580552f0bd623cef"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.459261 4962 scope.go:117] "RemoveContainer" containerID="330fcac483de40973468483bb1e7d1a3978f3e5fb4144bc0efaa58cf02e30e67" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.460015 4962 scope.go:117] "RemoveContainer" containerID="1bcd3b5d415fdd3c80c493728dbec002cdd2c25c6bba4eb1580552f0bd623cef" Feb 20 10:06:20 crc kubenswrapper[4962]: E0220 10:06:20.460311 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed 
container=kube-multus pod=multus-wqwgj_openshift-multus(1957ac70-30f9-48c2-a82b-72aa3b7a883a)\"" pod="openshift-multus/multus-wqwgj" podUID="1957ac70-30f9-48c2-a82b-72aa3b7a883a" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.461710 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-99b2s_2abd2b70-bb78-49a0-b930-cd066384e803/ovnkube-controller/3.log" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.462880 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-systemd-units\") pod \"2abd2b70-bb78-49a0-b930-cd066384e803\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.462928 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-host-cni-bin\") pod \"2abd2b70-bb78-49a0-b930-cd066384e803\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.462990 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2abd2b70-bb78-49a0-b930-cd066384e803-ovnkube-script-lib\") pod \"2abd2b70-bb78-49a0-b930-cd066384e803\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.463026 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-run-openvswitch\") pod \"2abd2b70-bb78-49a0-b930-cd066384e803\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.463073 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-run-systemd\") pod \"2abd2b70-bb78-49a0-b930-cd066384e803\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.463110 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-host-run-netns\") pod \"2abd2b70-bb78-49a0-b930-cd066384e803\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.463517 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "2abd2b70-bb78-49a0-b930-cd066384e803" (UID: "2abd2b70-bb78-49a0-b930-cd066384e803"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.463902 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "2abd2b70-bb78-49a0-b930-cd066384e803" (UID: "2abd2b70-bb78-49a0-b930-cd066384e803"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.463991 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2abd2b70-bb78-49a0-b930-cd066384e803-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "2abd2b70-bb78-49a0-b930-cd066384e803" (UID: "2abd2b70-bb78-49a0-b930-cd066384e803"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.463972 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "2abd2b70-bb78-49a0-b930-cd066384e803" (UID: "2abd2b70-bb78-49a0-b930-cd066384e803"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.464107 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "2abd2b70-bb78-49a0-b930-cd066384e803" (UID: "2abd2b70-bb78-49a0-b930-cd066384e803"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.464794 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2abd2b70-bb78-49a0-b930-cd066384e803-ovnkube-config\") pod \"2abd2b70-bb78-49a0-b930-cd066384e803\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.465716 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85mbt\" (UniqueName: \"kubernetes.io/projected/2abd2b70-bb78-49a0-b930-cd066384e803-kube-api-access-85mbt\") pod \"2abd2b70-bb78-49a0-b930-cd066384e803\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.465810 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2abd2b70-bb78-49a0-b930-cd066384e803-ovn-node-metrics-cert\") pod \"2abd2b70-bb78-49a0-b930-cd066384e803\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.465857 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-run-ovn\") pod \"2abd2b70-bb78-49a0-b930-cd066384e803\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.465908 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-host-run-ovn-kubernetes\") pod \"2abd2b70-bb78-49a0-b930-cd066384e803\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.465947 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2abd2b70-bb78-49a0-b930-cd066384e803-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "2abd2b70-bb78-49a0-b930-cd066384e803" (UID: "2abd2b70-bb78-49a0-b930-cd066384e803"). 
InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.466003 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-host-slash\") pod \"2abd2b70-bb78-49a0-b930-cd066384e803\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.466036 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-host-cni-netd\") pod \"2abd2b70-bb78-49a0-b930-cd066384e803\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.466104 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-host-var-lib-cni-networks-ovn-kubernetes\") pod \"2abd2b70-bb78-49a0-b930-cd066384e803\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.466125 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "2abd2b70-bb78-49a0-b930-cd066384e803" (UID: "2abd2b70-bb78-49a0-b930-cd066384e803"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.466172 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-node-log\") pod \"2abd2b70-bb78-49a0-b930-cd066384e803\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.466217 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-host-kubelet\") pod \"2abd2b70-bb78-49a0-b930-cd066384e803\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.466233 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-host-slash" (OuterVolumeSpecName: "host-slash") pod "2abd2b70-bb78-49a0-b930-cd066384e803" (UID: "2abd2b70-bb78-49a0-b930-cd066384e803"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.466271 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2abd2b70-bb78-49a0-b930-cd066384e803-env-overrides\") pod \"2abd2b70-bb78-49a0-b930-cd066384e803\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.466314 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-log-socket\") pod \"2abd2b70-bb78-49a0-b930-cd066384e803\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.466367 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-etc-openvswitch\") pod \"2abd2b70-bb78-49a0-b930-cd066384e803\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.466404 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-var-lib-openvswitch\") pod \"2abd2b70-bb78-49a0-b930-cd066384e803\" (UID: \"2abd2b70-bb78-49a0-b930-cd066384e803\") " Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.466681 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e12256fd-84a5-4a79-b750-20b5a64bd4c9-env-overrides\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.466732 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e12256fd-84a5-4a79-b750-20b5a64bd4c9-ovnkube-config\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.466804 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e12256fd-84a5-4a79-b750-20b5a64bd4c9-systemd-units\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.466885 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e12256fd-84a5-4a79-b750-20b5a64bd4c9-etc-openvswitch\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.466949 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e12256fd-84a5-4a79-b750-20b5a64bd4c9-run-openvswitch\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.467070 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" 
(UniqueName: \"kubernetes.io/host-path/e12256fd-84a5-4a79-b750-20b5a64bd4c9-host-cni-bin\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.467115 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e12256fd-84a5-4a79-b750-20b5a64bd4c9-host-run-ovn-kubernetes\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.467186 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e12256fd-84a5-4a79-b750-20b5a64bd4c9-host-cni-netd\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.467226 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e12256fd-84a5-4a79-b750-20b5a64bd4c9-run-systemd\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.467269 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e12256fd-84a5-4a79-b750-20b5a64bd4c9-log-socket\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.467307 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e12256fd-84a5-4a79-b750-20b5a64bd4c9-host-kubelet\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.467341 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e12256fd-84a5-4a79-b750-20b5a64bd4c9-var-lib-openvswitch\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.467408 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e12256fd-84a5-4a79-b750-20b5a64bd4c9-node-log\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.467455 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e12256fd-84a5-4a79-b750-20b5a64bd4c9-ovn-node-metrics-cert\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.467489 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgsdk\" (UniqueName: 
\"kubernetes.io/projected/e12256fd-84a5-4a79-b750-20b5a64bd4c9-kube-api-access-jgsdk\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.467526 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e12256fd-84a5-4a79-b750-20b5a64bd4c9-ovnkube-script-lib\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.467619 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e12256fd-84a5-4a79-b750-20b5a64bd4c9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.467658 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e12256fd-84a5-4a79-b750-20b5a64bd4c9-host-run-netns\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.467691 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e12256fd-84a5-4a79-b750-20b5a64bd4c9-run-ovn\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.467728 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e12256fd-84a5-4a79-b750-20b5a64bd4c9-host-slash\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.467812 4962 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-systemd-units\") on node \"crc\" DevicePath \"\"" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.467833 4962 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-host-cni-bin\") on node \"crc\" DevicePath \"\"" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.467853 4962 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2abd2b70-bb78-49a0-b930-cd066384e803-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.467874 4962 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-run-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.467894 4962 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-host-run-netns\") on node \"crc\" DevicePath \"\"" Feb 20 10:06:20 crc 
kubenswrapper[4962]: I0220 10:06:20.467913 4962 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2abd2b70-bb78-49a0-b930-cd066384e803-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.467933 4962 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.467951 4962 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-host-slash\") on node \"crc\" DevicePath \"\"" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.468031 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e12256fd-84a5-4a79-b750-20b5a64bd4c9-host-slash\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.468733 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e12256fd-84a5-4a79-b750-20b5a64bd4c9-host-cni-bin\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.468823 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e12256fd-84a5-4a79-b750-20b5a64bd4c9-systemd-units\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.468995 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-99b2s_2abd2b70-bb78-49a0-b930-cd066384e803/ovn-acl-logging/0.log" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.469173 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e12256fd-84a5-4a79-b750-20b5a64bd4c9-run-openvswitch\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.469362 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e12256fd-84a5-4a79-b750-20b5a64bd4c9-etc-openvswitch\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.469481 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e12256fd-84a5-4a79-b750-20b5a64bd4c9-host-run-ovn-kubernetes\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.469547 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e12256fd-84a5-4a79-b750-20b5a64bd4c9-host-cni-netd\") pod \"ovnkube-node-z6xc2\" (UID: 
\"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.469563 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e12256fd-84a5-4a79-b750-20b5a64bd4c9-ovnkube-config\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.469624 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e12256fd-84a5-4a79-b750-20b5a64bd4c9-run-systemd\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.469671 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e12256fd-84a5-4a79-b750-20b5a64bd4c9-log-socket\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.469714 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e12256fd-84a5-4a79-b750-20b5a64bd4c9-host-kubelet\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.469758 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e12256fd-84a5-4a79-b750-20b5a64bd4c9-var-lib-openvswitch\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.469445 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e12256fd-84a5-4a79-b750-20b5a64bd4c9-node-log\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.470145 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e12256fd-84a5-4a79-b750-20b5a64bd4c9-env-overrides\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.470163 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-99b2s_2abd2b70-bb78-49a0-b930-cd066384e803/ovn-controller/0.log" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.466293 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "2abd2b70-bb78-49a0-b930-cd066384e803" (UID: "2abd2b70-bb78-49a0-b930-cd066384e803"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.468104 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "2abd2b70-bb78-49a0-b930-cd066384e803" (UID: "2abd2b70-bb78-49a0-b930-cd066384e803"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.468137 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "2abd2b70-bb78-49a0-b930-cd066384e803" (UID: "2abd2b70-bb78-49a0-b930-cd066384e803"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.468167 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-node-log" (OuterVolumeSpecName: "node-log") pod "2abd2b70-bb78-49a0-b930-cd066384e803" (UID: "2abd2b70-bb78-49a0-b930-cd066384e803"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.468195 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "2abd2b70-bb78-49a0-b930-cd066384e803" (UID: "2abd2b70-bb78-49a0-b930-cd066384e803"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.469633 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-log-socket" (OuterVolumeSpecName: "log-socket") pod "2abd2b70-bb78-49a0-b930-cd066384e803" (UID: "2abd2b70-bb78-49a0-b930-cd066384e803"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.469658 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "2abd2b70-bb78-49a0-b930-cd066384e803" (UID: "2abd2b70-bb78-49a0-b930-cd066384e803"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.469680 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "2abd2b70-bb78-49a0-b930-cd066384e803" (UID: "2abd2b70-bb78-49a0-b930-cd066384e803"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.470336 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e12256fd-84a5-4a79-b750-20b5a64bd4c9-host-run-netns\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.470380 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e12256fd-84a5-4a79-b750-20b5a64bd4c9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.470409 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2abd2b70-bb78-49a0-b930-cd066384e803-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "2abd2b70-bb78-49a0-b930-cd066384e803" (UID: "2abd2b70-bb78-49a0-b930-cd066384e803"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.470487 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e12256fd-84a5-4a79-b750-20b5a64bd4c9-run-ovn\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.471863 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e12256fd-84a5-4a79-b750-20b5a64bd4c9-ovnkube-script-lib\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.472117 4962 generic.go:334] "Generic (PLEG): container finished" podID="2abd2b70-bb78-49a0-b930-cd066384e803" containerID="632f598a6bf02b19952a264fbef27b389b05daf66577d38a14dbb23054d239af" exitCode=0 Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.472161 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2abd2b70-bb78-49a0-b930-cd066384e803-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "2abd2b70-bb78-49a0-b930-cd066384e803" (UID: "2abd2b70-bb78-49a0-b930-cd066384e803"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.472195 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" event={"ID":"2abd2b70-bb78-49a0-b930-cd066384e803","Type":"ContainerDied","Data":"632f598a6bf02b19952a264fbef27b389b05daf66577d38a14dbb23054d239af"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.472170 4962 generic.go:334] "Generic (PLEG): container finished" podID="2abd2b70-bb78-49a0-b930-cd066384e803" containerID="195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8" exitCode=0 Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.472329 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" event={"ID":"2abd2b70-bb78-49a0-b930-cd066384e803","Type":"ContainerDied","Data":"195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.472370 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" event={"ID":"2abd2b70-bb78-49a0-b930-cd066384e803","Type":"ContainerDied","Data":"50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.472317 4962 generic.go:334] "Generic (PLEG): container finished" podID="2abd2b70-bb78-49a0-b930-cd066384e803" containerID="50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378" exitCode=0 Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.472422 4962 generic.go:334] "Generic (PLEG): container finished" podID="2abd2b70-bb78-49a0-b930-cd066384e803" containerID="582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584" exitCode=0 Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.472445 4962 generic.go:334] "Generic (PLEG): container finished" podID="2abd2b70-bb78-49a0-b930-cd066384e803" containerID="9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f" exitCode=0 Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.472464 4962 generic.go:334] "Generic (PLEG): container finished" podID="2abd2b70-bb78-49a0-b930-cd066384e803" containerID="03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f" exitCode=0 Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.472482 4962 generic.go:334] "Generic (PLEG): container finished" podID="2abd2b70-bb78-49a0-b930-cd066384e803" containerID="36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a" exitCode=143 Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.472533 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" event={"ID":"2abd2b70-bb78-49a0-b930-cd066384e803","Type":"ContainerDied","Data":"582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.472549 4962 generic.go:334] "Generic (PLEG): container finished" podID="2abd2b70-bb78-49a0-b930-cd066384e803" containerID="2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117" exitCode=143 Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.472584 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" event={"ID":"2abd2b70-bb78-49a0-b930-cd066384e803","Type":"ContainerDied","Data":"9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.472633 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" event={"ID":"2abd2b70-bb78-49a0-b930-cd066384e803","Type":"ContainerDied","Data":"03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.472657 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"632f598a6bf02b19952a264fbef27b389b05daf66577d38a14dbb23054d239af"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.472677 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0ad31dcb6721b49faaa0b2df2afa2f3d990f410ad4c80b3b5ebb18c428603add"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.472691 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.472703 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.472715 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.472726 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.472738 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.472749 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.472760 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.472772 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.472790 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" event={"ID":"2abd2b70-bb78-49a0-b930-cd066384e803","Type":"ContainerDied","Data":"36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.472807 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"632f598a6bf02b19952a264fbef27b389b05daf66577d38a14dbb23054d239af"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.472821 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0ad31dcb6721b49faaa0b2df2afa2f3d990f410ad4c80b3b5ebb18c428603add"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 
10:06:20.472832 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.472843 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.472855 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.472866 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.472877 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.472891 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.472904 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.472915 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.472931 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" event={"ID":"2abd2b70-bb78-49a0-b930-cd066384e803","Type":"ContainerDied","Data":"2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.472948 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"632f598a6bf02b19952a264fbef27b389b05daf66577d38a14dbb23054d239af"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.472961 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0ad31dcb6721b49faaa0b2df2afa2f3d990f410ad4c80b3b5ebb18c428603add"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.472973 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.472985 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.472996 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 
10:06:20.473007 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.473019 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.473030 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.473041 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.473055 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.473071 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" event={"ID":"2abd2b70-bb78-49a0-b930-cd066384e803","Type":"ContainerDied","Data":"30d1769bf1e4a85341ca0d75e37166ad7a768dbf64ad246e32c8fde99616e4b7"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.473091 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"632f598a6bf02b19952a264fbef27b389b05daf66577d38a14dbb23054d239af"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.473105 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0ad31dcb6721b49faaa0b2df2afa2f3d990f410ad4c80b3b5ebb18c428603add"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.473117 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.473128 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.473140 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.473151 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.473162 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.473173 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 
10:06:20.473185 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.473196 4962 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe"} Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.473899 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e12256fd-84a5-4a79-b750-20b5a64bd4c9-ovn-node-metrics-cert\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.476400 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-99b2s" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.478686 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2abd2b70-bb78-49a0-b930-cd066384e803-kube-api-access-85mbt" (OuterVolumeSpecName: "kube-api-access-85mbt") pod "2abd2b70-bb78-49a0-b930-cd066384e803" (UID: "2abd2b70-bb78-49a0-b930-cd066384e803"). InnerVolumeSpecName "kube-api-access-85mbt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.497870 4962 scope.go:117] "RemoveContainer" containerID="632f598a6bf02b19952a264fbef27b389b05daf66577d38a14dbb23054d239af" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.504432 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgsdk\" (UniqueName: \"kubernetes.io/projected/e12256fd-84a5-4a79-b750-20b5a64bd4c9-kube-api-access-jgsdk\") pod \"ovnkube-node-z6xc2\" (UID: \"e12256fd-84a5-4a79-b750-20b5a64bd4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.504515 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "2abd2b70-bb78-49a0-b930-cd066384e803" (UID: "2abd2b70-bb78-49a0-b930-cd066384e803"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.526783 4962 scope.go:117] "RemoveContainer" containerID="0ad31dcb6721b49faaa0b2df2afa2f3d990f410ad4c80b3b5ebb18c428603add" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.547335 4962 scope.go:117] "RemoveContainer" containerID="195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.569201 4962 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-run-systemd\") on node \"crc\" DevicePath \"\"" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.569451 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85mbt\" (UniqueName: \"kubernetes.io/projected/2abd2b70-bb78-49a0-b930-cd066384e803-kube-api-access-85mbt\") on node \"crc\" DevicePath \"\"" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.569618 4962 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2abd2b70-bb78-49a0-b930-cd066384e803-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.569721 4962 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.569838 4962 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-host-cni-netd\") on node \"crc\" DevicePath \"\"" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.569938 4962 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.569958 4962 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-node-log\") on node \"crc\" DevicePath \"\"" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.569974 4962 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-host-kubelet\") on node \"crc\" DevicePath \"\"" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.569989 4962 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2abd2b70-bb78-49a0-b930-cd066384e803-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.570004 4962 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-log-socket\") on node \"crc\" DevicePath \"\"" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.570016 4962 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.570029 4962 reconciler_common.go:293] "Volume detached for volume 
\"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2abd2b70-bb78-49a0-b930-cd066384e803-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.574154 4962 scope.go:117] "RemoveContainer" containerID="50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.600722 4962 scope.go:117] "RemoveContainer" containerID="582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.622420 4962 scope.go:117] "RemoveContainer" containerID="9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.646758 4962 scope.go:117] "RemoveContainer" containerID="03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.654395 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.665559 4962 scope.go:117] "RemoveContainer" containerID="36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.684987 4962 scope.go:117] "RemoveContainer" containerID="2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.715070 4962 scope.go:117] "RemoveContainer" containerID="1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.735676 4962 scope.go:117] "RemoveContainer" containerID="632f598a6bf02b19952a264fbef27b389b05daf66577d38a14dbb23054d239af" Feb 20 10:06:20 crc kubenswrapper[4962]: E0220 10:06:20.736226 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"632f598a6bf02b19952a264fbef27b389b05daf66577d38a14dbb23054d239af\": container with ID starting with 632f598a6bf02b19952a264fbef27b389b05daf66577d38a14dbb23054d239af not found: ID does not exist" containerID="632f598a6bf02b19952a264fbef27b389b05daf66577d38a14dbb23054d239af" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.736352 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"632f598a6bf02b19952a264fbef27b389b05daf66577d38a14dbb23054d239af"} err="failed to get container status \"632f598a6bf02b19952a264fbef27b389b05daf66577d38a14dbb23054d239af\": rpc error: code = NotFound desc = could not find container \"632f598a6bf02b19952a264fbef27b389b05daf66577d38a14dbb23054d239af\": container with ID starting with 632f598a6bf02b19952a264fbef27b389b05daf66577d38a14dbb23054d239af not found: ID does not exist" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.736454 4962 scope.go:117] "RemoveContainer" containerID="0ad31dcb6721b49faaa0b2df2afa2f3d990f410ad4c80b3b5ebb18c428603add" Feb 20 10:06:20 crc kubenswrapper[4962]: E0220 10:06:20.736970 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ad31dcb6721b49faaa0b2df2afa2f3d990f410ad4c80b3b5ebb18c428603add\": container with ID starting with 0ad31dcb6721b49faaa0b2df2afa2f3d990f410ad4c80b3b5ebb18c428603add not found: ID does not exist" containerID="0ad31dcb6721b49faaa0b2df2afa2f3d990f410ad4c80b3b5ebb18c428603add" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.737013 4962 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ad31dcb6721b49faaa0b2df2afa2f3d990f410ad4c80b3b5ebb18c428603add"} err="failed to get container status \"0ad31dcb6721b49faaa0b2df2afa2f3d990f410ad4c80b3b5ebb18c428603add\": rpc error: code = NotFound desc = could not find container \"0ad31dcb6721b49faaa0b2df2afa2f3d990f410ad4c80b3b5ebb18c428603add\": container with ID starting with 0ad31dcb6721b49faaa0b2df2afa2f3d990f410ad4c80b3b5ebb18c428603add not found: ID does not exist" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.737045 4962 scope.go:117] "RemoveContainer" containerID="195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8" Feb 20 10:06:20 crc kubenswrapper[4962]: E0220 10:06:20.737391 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8\": container with ID starting with 195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8 not found: ID does not exist" containerID="195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.737421 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8"} err="failed to get container status \"195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8\": rpc error: code = NotFound desc = could not find container \"195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8\": container with ID starting with 195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8 not found: ID does not exist" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.737442 4962 scope.go:117] "RemoveContainer" containerID="50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378" Feb 20 10:06:20 crc kubenswrapper[4962]: E0220 10:06:20.737822 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378\": container with ID starting with 50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378 not found: ID does not exist" containerID="50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.737884 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378"} err="failed to get container status \"50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378\": rpc error: code = NotFound desc = could not find container \"50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378\": container with ID starting with 50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378 not found: ID does not exist" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.737923 4962 scope.go:117] "RemoveContainer" containerID="582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584" Feb 20 10:06:20 crc kubenswrapper[4962]: E0220 10:06:20.738341 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584\": container with ID starting with 582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584 not found: ID does 
not exist" containerID="582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.738379 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584"} err="failed to get container status \"582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584\": rpc error: code = NotFound desc = could not find container \"582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584\": container with ID starting with 582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584 not found: ID does not exist" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.738402 4962 scope.go:117] "RemoveContainer" containerID="9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f" Feb 20 10:06:20 crc kubenswrapper[4962]: E0220 10:06:20.738747 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f\": container with ID starting with 9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f not found: ID does not exist" containerID="9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.738778 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f"} err="failed to get container status \"9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f\": rpc error: code = NotFound desc = could not find container \"9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f\": container with ID starting with 9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f not found: ID does not exist" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.738798 4962 scope.go:117] "RemoveContainer" containerID="03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f" Feb 20 10:06:20 crc kubenswrapper[4962]: E0220 10:06:20.739081 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f\": container with ID starting with 03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f not found: ID does not exist" containerID="03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.739109 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f"} err="failed to get container status \"03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f\": rpc error: code = NotFound desc = could not find container \"03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f\": container with ID starting with 03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f not found: ID does not exist" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.739126 4962 scope.go:117] "RemoveContainer" containerID="36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a" Feb 20 10:06:20 crc kubenswrapper[4962]: E0220 10:06:20.739404 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a\": container with ID starting with 36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a not found: ID does not exist" containerID="36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.739429 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a"} err="failed to get container status \"36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a\": rpc error: code = NotFound desc = could not find container \"36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a\": container with ID starting with 36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a not found: ID does not exist" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.739452 4962 scope.go:117] "RemoveContainer" containerID="2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117" Feb 20 10:06:20 crc kubenswrapper[4962]: E0220 10:06:20.739747 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117\": container with ID starting with 2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117 not found: ID does not exist" containerID="2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.739774 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117"} err="failed to get container status \"2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117\": rpc error: code = NotFound desc = could not find container \"2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117\": container with ID starting with 2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117 not found: ID does not exist" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.739792 4962 scope.go:117] "RemoveContainer" containerID="1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe" Feb 20 10:06:20 crc kubenswrapper[4962]: E0220 10:06:20.740049 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\": container with ID starting with 1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe not found: ID does not exist" containerID="1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.740077 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe"} err="failed to get container status \"1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\": rpc error: code = NotFound desc = could not find container \"1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\": container with ID starting with 1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe not found: ID does not exist" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.740095 4962 scope.go:117] "RemoveContainer" containerID="632f598a6bf02b19952a264fbef27b389b05daf66577d38a14dbb23054d239af" Feb 20 10:06:20 crc 
kubenswrapper[4962]: I0220 10:06:20.740306 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"632f598a6bf02b19952a264fbef27b389b05daf66577d38a14dbb23054d239af"} err="failed to get container status \"632f598a6bf02b19952a264fbef27b389b05daf66577d38a14dbb23054d239af\": rpc error: code = NotFound desc = could not find container \"632f598a6bf02b19952a264fbef27b389b05daf66577d38a14dbb23054d239af\": container with ID starting with 632f598a6bf02b19952a264fbef27b389b05daf66577d38a14dbb23054d239af not found: ID does not exist" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.740330 4962 scope.go:117] "RemoveContainer" containerID="0ad31dcb6721b49faaa0b2df2afa2f3d990f410ad4c80b3b5ebb18c428603add" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.740634 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ad31dcb6721b49faaa0b2df2afa2f3d990f410ad4c80b3b5ebb18c428603add"} err="failed to get container status \"0ad31dcb6721b49faaa0b2df2afa2f3d990f410ad4c80b3b5ebb18c428603add\": rpc error: code = NotFound desc = could not find container \"0ad31dcb6721b49faaa0b2df2afa2f3d990f410ad4c80b3b5ebb18c428603add\": container with ID starting with 0ad31dcb6721b49faaa0b2df2afa2f3d990f410ad4c80b3b5ebb18c428603add not found: ID does not exist" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.740741 4962 scope.go:117] "RemoveContainer" containerID="195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.741236 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8"} err="failed to get container status \"195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8\": rpc error: code = NotFound desc = could not find container \"195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8\": container with ID starting with 195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8 not found: ID does not exist" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.741321 4962 scope.go:117] "RemoveContainer" containerID="50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.741890 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378"} err="failed to get container status \"50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378\": rpc error: code = NotFound desc = could not find container \"50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378\": container with ID starting with 50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378 not found: ID does not exist" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.741936 4962 scope.go:117] "RemoveContainer" containerID="582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.742245 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584"} err="failed to get container status \"582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584\": rpc error: code = NotFound desc = could not find container \"582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584\": container with ID 
starting with 582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584 not found: ID does not exist" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.742277 4962 scope.go:117] "RemoveContainer" containerID="9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.742547 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f"} err="failed to get container status \"9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f\": rpc error: code = NotFound desc = could not find container \"9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f\": container with ID starting with 9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f not found: ID does not exist" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.742574 4962 scope.go:117] "RemoveContainer" containerID="03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.742922 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f"} err="failed to get container status \"03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f\": rpc error: code = NotFound desc = could not find container \"03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f\": container with ID starting with 03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f not found: ID does not exist" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.742953 4962 scope.go:117] "RemoveContainer" containerID="36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.743208 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a"} err="failed to get container status \"36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a\": rpc error: code = NotFound desc = could not find container \"36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a\": container with ID starting with 36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a not found: ID does not exist" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.743240 4962 scope.go:117] "RemoveContainer" containerID="2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.744314 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117"} err="failed to get container status \"2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117\": rpc error: code = NotFound desc = could not find container \"2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117\": container with ID starting with 2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117 not found: ID does not exist" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.744343 4962 scope.go:117] "RemoveContainer" containerID="1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.744909 4962 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe"} err="failed to get container status \"1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\": rpc error: code = NotFound desc = could not find container \"1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\": container with ID starting with 1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe not found: ID does not exist" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.744935 4962 scope.go:117] "RemoveContainer" containerID="632f598a6bf02b19952a264fbef27b389b05daf66577d38a14dbb23054d239af" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.745272 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"632f598a6bf02b19952a264fbef27b389b05daf66577d38a14dbb23054d239af"} err="failed to get container status \"632f598a6bf02b19952a264fbef27b389b05daf66577d38a14dbb23054d239af\": rpc error: code = NotFound desc = could not find container \"632f598a6bf02b19952a264fbef27b389b05daf66577d38a14dbb23054d239af\": container with ID starting with 632f598a6bf02b19952a264fbef27b389b05daf66577d38a14dbb23054d239af not found: ID does not exist" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.745496 4962 scope.go:117] "RemoveContainer" containerID="0ad31dcb6721b49faaa0b2df2afa2f3d990f410ad4c80b3b5ebb18c428603add" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.746031 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ad31dcb6721b49faaa0b2df2afa2f3d990f410ad4c80b3b5ebb18c428603add"} err="failed to get container status \"0ad31dcb6721b49faaa0b2df2afa2f3d990f410ad4c80b3b5ebb18c428603add\": rpc error: code = NotFound desc = could not find container \"0ad31dcb6721b49faaa0b2df2afa2f3d990f410ad4c80b3b5ebb18c428603add\": container with ID starting with 0ad31dcb6721b49faaa0b2df2afa2f3d990f410ad4c80b3b5ebb18c428603add not found: ID does not exist" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.746055 4962 scope.go:117] "RemoveContainer" containerID="195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.746413 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8"} err="failed to get container status \"195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8\": rpc error: code = NotFound desc = could not find container \"195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8\": container with ID starting with 195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8 not found: ID does not exist" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.746435 4962 scope.go:117] "RemoveContainer" containerID="50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.746745 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378"} err="failed to get container status \"50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378\": rpc error: code = NotFound desc = could not find container \"50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378\": container with ID starting with 50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378 not found: ID does not exist" Feb 
20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.746782 4962 scope.go:117] "RemoveContainer" containerID="582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.747070 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584"} err="failed to get container status \"582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584\": rpc error: code = NotFound desc = could not find container \"582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584\": container with ID starting with 582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584 not found: ID does not exist" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.747095 4962 scope.go:117] "RemoveContainer" containerID="9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.747370 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f"} err="failed to get container status \"9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f\": rpc error: code = NotFound desc = could not find container \"9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f\": container with ID starting with 9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f not found: ID does not exist" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.747402 4962 scope.go:117] "RemoveContainer" containerID="03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.747698 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f"} err="failed to get container status \"03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f\": rpc error: code = NotFound desc = could not find container \"03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f\": container with ID starting with 03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f not found: ID does not exist" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.747726 4962 scope.go:117] "RemoveContainer" containerID="36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.748036 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a"} err="failed to get container status \"36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a\": rpc error: code = NotFound desc = could not find container \"36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a\": container with ID starting with 36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a not found: ID does not exist" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.748068 4962 scope.go:117] "RemoveContainer" containerID="2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.748334 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117"} err="failed to get container status 
\"2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117\": rpc error: code = NotFound desc = could not find container \"2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117\": container with ID starting with 2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117 not found: ID does not exist" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.748359 4962 scope.go:117] "RemoveContainer" containerID="1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.748656 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe"} err="failed to get container status \"1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\": rpc error: code = NotFound desc = could not find container \"1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\": container with ID starting with 1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe not found: ID does not exist" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.748687 4962 scope.go:117] "RemoveContainer" containerID="632f598a6bf02b19952a264fbef27b389b05daf66577d38a14dbb23054d239af" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.749017 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"632f598a6bf02b19952a264fbef27b389b05daf66577d38a14dbb23054d239af"} err="failed to get container status \"632f598a6bf02b19952a264fbef27b389b05daf66577d38a14dbb23054d239af\": rpc error: code = NotFound desc = could not find container \"632f598a6bf02b19952a264fbef27b389b05daf66577d38a14dbb23054d239af\": container with ID starting with 632f598a6bf02b19952a264fbef27b389b05daf66577d38a14dbb23054d239af not found: ID does not exist" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.749047 4962 scope.go:117] "RemoveContainer" containerID="0ad31dcb6721b49faaa0b2df2afa2f3d990f410ad4c80b3b5ebb18c428603add" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.749408 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ad31dcb6721b49faaa0b2df2afa2f3d990f410ad4c80b3b5ebb18c428603add"} err="failed to get container status \"0ad31dcb6721b49faaa0b2df2afa2f3d990f410ad4c80b3b5ebb18c428603add\": rpc error: code = NotFound desc = could not find container \"0ad31dcb6721b49faaa0b2df2afa2f3d990f410ad4c80b3b5ebb18c428603add\": container with ID starting with 0ad31dcb6721b49faaa0b2df2afa2f3d990f410ad4c80b3b5ebb18c428603add not found: ID does not exist" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.749426 4962 scope.go:117] "RemoveContainer" containerID="195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.749719 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8"} err="failed to get container status \"195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8\": rpc error: code = NotFound desc = could not find container \"195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8\": container with ID starting with 195d7a165caeaa3ae9ff588a2fa45d125813007846433c5a210adf71499466f8 not found: ID does not exist" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.749849 4962 scope.go:117] "RemoveContainer" 
containerID="50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.750281 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378"} err="failed to get container status \"50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378\": rpc error: code = NotFound desc = could not find container \"50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378\": container with ID starting with 50d02d6589c7c8b51645cdde92eb98c7263c73298b515034911e9cc861cf6378 not found: ID does not exist" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.750304 4962 scope.go:117] "RemoveContainer" containerID="582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.750753 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584"} err="failed to get container status \"582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584\": rpc error: code = NotFound desc = could not find container \"582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584\": container with ID starting with 582f827df96e4207c46b33e6f9966b0adfaf7f2e064e57b578dc67dbe2abe584 not found: ID does not exist" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.750811 4962 scope.go:117] "RemoveContainer" containerID="9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.751181 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f"} err="failed to get container status \"9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f\": rpc error: code = NotFound desc = could not find container \"9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f\": container with ID starting with 9f427b069dd344b5d772161581d74ed679cb54790844757764b9b02b0240780f not found: ID does not exist" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.751214 4962 scope.go:117] "RemoveContainer" containerID="03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.751529 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f"} err="failed to get container status \"03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f\": rpc error: code = NotFound desc = could not find container \"03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f\": container with ID starting with 03a138202bee7644da4ceaa46329c91dc26d82c51d8c67a77584855fed55d37f not found: ID does not exist" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.751693 4962 scope.go:117] "RemoveContainer" containerID="36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.752107 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a"} err="failed to get container status \"36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a\": rpc error: code = NotFound desc = could not find 
container \"36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a\": container with ID starting with 36319c4e79c0c2e756fc98beec2365a16a77cd7e6af6682fbfad41a179d53a9a not found: ID does not exist" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.752133 4962 scope.go:117] "RemoveContainer" containerID="2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.752701 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117"} err="failed to get container status \"2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117\": rpc error: code = NotFound desc = could not find container \"2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117\": container with ID starting with 2cb864fc1ed2fdf7491683366bd27cc79ef9e78dfb19dab1d549f76145c66117 not found: ID does not exist" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.752749 4962 scope.go:117] "RemoveContainer" containerID="1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.753222 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe"} err="failed to get container status \"1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\": rpc error: code = NotFound desc = could not find container \"1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe\": container with ID starting with 1aad6b26d64b1b11cffc4ee0c31edbd6ca1839549c4a0afa77a33735dcd2e8fe not found: ID does not exist" Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.852101 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-99b2s"] Feb 20 10:06:20 crc kubenswrapper[4962]: I0220 10:06:20.862867 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-99b2s"] Feb 20 10:06:21 crc kubenswrapper[4962]: I0220 10:06:21.154644 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2abd2b70-bb78-49a0-b930-cd066384e803" path="/var/lib/kubelet/pods/2abd2b70-bb78-49a0-b930-cd066384e803/volumes" Feb 20 10:06:21 crc kubenswrapper[4962]: I0220 10:06:21.484301 4962 generic.go:334] "Generic (PLEG): container finished" podID="e12256fd-84a5-4a79-b750-20b5a64bd4c9" containerID="1e0c0d88dc5ba3cd8d61d4bf1920c494da947e6d2fe514fe1984e301582cff94" exitCode=0 Feb 20 10:06:21 crc kubenswrapper[4962]: I0220 10:06:21.485461 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" event={"ID":"e12256fd-84a5-4a79-b750-20b5a64bd4c9","Type":"ContainerDied","Data":"1e0c0d88dc5ba3cd8d61d4bf1920c494da947e6d2fe514fe1984e301582cff94"} Feb 20 10:06:21 crc kubenswrapper[4962]: I0220 10:06:21.485657 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" event={"ID":"e12256fd-84a5-4a79-b750-20b5a64bd4c9","Type":"ContainerStarted","Data":"a175345ba64c0874825a85220209a3a8247dff3c0e0beb24f2d075a628b1279a"} Feb 20 10:06:21 crc kubenswrapper[4962]: I0220 10:06:21.493043 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wqwgj_1957ac70-30f9-48c2-a82b-72aa3b7a883a/kube-multus/2.log" Feb 20 10:06:22 crc kubenswrapper[4962]: I0220 10:06:22.506401 4962 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" event={"ID":"e12256fd-84a5-4a79-b750-20b5a64bd4c9","Type":"ContainerStarted","Data":"92e3dde518304618e01ddb3cc717c779c80e50fc6b3751f36f735fae92c42114"} Feb 20 10:06:22 crc kubenswrapper[4962]: I0220 10:06:22.507779 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" event={"ID":"e12256fd-84a5-4a79-b750-20b5a64bd4c9","Type":"ContainerStarted","Data":"2f58ea068ce21b88343d88c9636ddb42f308bc3735b14e8567e338d309ce9d6a"} Feb 20 10:06:22 crc kubenswrapper[4962]: I0220 10:06:22.507820 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" event={"ID":"e12256fd-84a5-4a79-b750-20b5a64bd4c9","Type":"ContainerStarted","Data":"afebb2b6d25667fd01a6758a1584dec69db4671c42b4025177214e7234b98039"} Feb 20 10:06:22 crc kubenswrapper[4962]: I0220 10:06:22.507836 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" event={"ID":"e12256fd-84a5-4a79-b750-20b5a64bd4c9","Type":"ContainerStarted","Data":"32bfd6870b90b35716f3e8be00392c97ed3eaaba2cd7a5263cffd2fae9b90665"} Feb 20 10:06:22 crc kubenswrapper[4962]: I0220 10:06:22.507847 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" event={"ID":"e12256fd-84a5-4a79-b750-20b5a64bd4c9","Type":"ContainerStarted","Data":"5723db0484710231bd6b6e5a91643e47eb22ac3f7e4212e0f75b13cd108221f3"} Feb 20 10:06:23 crc kubenswrapper[4962]: I0220 10:06:23.521125 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" event={"ID":"e12256fd-84a5-4a79-b750-20b5a64bd4c9","Type":"ContainerStarted","Data":"ac789f0ba10acfdc335d55607b1373243f9a97868dcdf1e1c1b083181e306baf"} Feb 20 10:06:25 crc kubenswrapper[4962]: I0220 10:06:25.306740 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-9v9g5"] Feb 20 10:06:25 crc kubenswrapper[4962]: I0220 10:06:25.308555 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-9v9g5" Feb 20 10:06:25 crc kubenswrapper[4962]: I0220 10:06:25.312331 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Feb 20 10:06:25 crc kubenswrapper[4962]: I0220 10:06:25.316245 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Feb 20 10:06:25 crc kubenswrapper[4962]: I0220 10:06:25.317040 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Feb 20 10:06:25 crc kubenswrapper[4962]: I0220 10:06:25.316839 4962 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-bwxwq" Feb 20 10:06:25 crc kubenswrapper[4962]: I0220 10:06:25.358128 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6423ea5e-20ed-4977-a842-2bc521939341-crc-storage\") pod \"crc-storage-crc-9v9g5\" (UID: \"6423ea5e-20ed-4977-a842-2bc521939341\") " pod="crc-storage/crc-storage-crc-9v9g5" Feb 20 10:06:25 crc kubenswrapper[4962]: I0220 10:06:25.358236 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6423ea5e-20ed-4977-a842-2bc521939341-node-mnt\") pod \"crc-storage-crc-9v9g5\" (UID: \"6423ea5e-20ed-4977-a842-2bc521939341\") " pod="crc-storage/crc-storage-crc-9v9g5" Feb 20 10:06:25 crc kubenswrapper[4962]: I0220 10:06:25.358304 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zszw4\" (UniqueName: \"kubernetes.io/projected/6423ea5e-20ed-4977-a842-2bc521939341-kube-api-access-zszw4\") pod \"crc-storage-crc-9v9g5\" (UID: \"6423ea5e-20ed-4977-a842-2bc521939341\") " pod="crc-storage/crc-storage-crc-9v9g5" Feb 20 10:06:25 crc kubenswrapper[4962]: I0220 10:06:25.459710 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6423ea5e-20ed-4977-a842-2bc521939341-node-mnt\") pod \"crc-storage-crc-9v9g5\" (UID: \"6423ea5e-20ed-4977-a842-2bc521939341\") " pod="crc-storage/crc-storage-crc-9v9g5" Feb 20 10:06:25 crc kubenswrapper[4962]: I0220 10:06:25.459847 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zszw4\" (UniqueName: \"kubernetes.io/projected/6423ea5e-20ed-4977-a842-2bc521939341-kube-api-access-zszw4\") pod \"crc-storage-crc-9v9g5\" (UID: \"6423ea5e-20ed-4977-a842-2bc521939341\") " pod="crc-storage/crc-storage-crc-9v9g5" Feb 20 10:06:25 crc kubenswrapper[4962]: I0220 10:06:25.460043 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6423ea5e-20ed-4977-a842-2bc521939341-crc-storage\") pod \"crc-storage-crc-9v9g5\" (UID: \"6423ea5e-20ed-4977-a842-2bc521939341\") " pod="crc-storage/crc-storage-crc-9v9g5" Feb 20 10:06:25 crc kubenswrapper[4962]: I0220 10:06:25.460185 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6423ea5e-20ed-4977-a842-2bc521939341-node-mnt\") pod \"crc-storage-crc-9v9g5\" (UID: \"6423ea5e-20ed-4977-a842-2bc521939341\") " pod="crc-storage/crc-storage-crc-9v9g5" Feb 20 10:06:25 crc kubenswrapper[4962]: I0220 10:06:25.461054 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6423ea5e-20ed-4977-a842-2bc521939341-crc-storage\") pod \"crc-storage-crc-9v9g5\" (UID: \"6423ea5e-20ed-4977-a842-2bc521939341\") " pod="crc-storage/crc-storage-crc-9v9g5" Feb 20 10:06:25 crc kubenswrapper[4962]: I0220 10:06:25.501457 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zszw4\" (UniqueName: \"kubernetes.io/projected/6423ea5e-20ed-4977-a842-2bc521939341-kube-api-access-zszw4\") pod \"crc-storage-crc-9v9g5\" (UID: \"6423ea5e-20ed-4977-a842-2bc521939341\") " pod="crc-storage/crc-storage-crc-9v9g5" Feb 20 10:06:25 crc kubenswrapper[4962]: I0220 10:06:25.553462 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" event={"ID":"e12256fd-84a5-4a79-b750-20b5a64bd4c9","Type":"ContainerStarted","Data":"fe1ae8910f5f801a05a874e20d8dc09e0d5a8ff072bd234b90cdff966cd46572"} Feb 20 10:06:25 crc kubenswrapper[4962]: I0220 10:06:25.628726 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-9v9g5" Feb 20 10:06:25 crc kubenswrapper[4962]: E0220 10:06:25.671013 4962 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-9v9g5_crc-storage_6423ea5e-20ed-4977-a842-2bc521939341_0(40e31d86b4d5a1ae197fb07238a7f4f9a6b312119a93aaeffcb012d00970fee0): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 20 10:06:25 crc kubenswrapper[4962]: E0220 10:06:25.671145 4962 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-9v9g5_crc-storage_6423ea5e-20ed-4977-a842-2bc521939341_0(40e31d86b4d5a1ae197fb07238a7f4f9a6b312119a93aaeffcb012d00970fee0): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-9v9g5" Feb 20 10:06:25 crc kubenswrapper[4962]: E0220 10:06:25.671185 4962 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-9v9g5_crc-storage_6423ea5e-20ed-4977-a842-2bc521939341_0(40e31d86b4d5a1ae197fb07238a7f4f9a6b312119a93aaeffcb012d00970fee0): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-9v9g5" Feb 20 10:06:25 crc kubenswrapper[4962]: E0220 10:06:25.671281 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-9v9g5_crc-storage(6423ea5e-20ed-4977-a842-2bc521939341)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-9v9g5_crc-storage(6423ea5e-20ed-4977-a842-2bc521939341)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-9v9g5_crc-storage_6423ea5e-20ed-4977-a842-2bc521939341_0(40e31d86b4d5a1ae197fb07238a7f4f9a6b312119a93aaeffcb012d00970fee0): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="crc-storage/crc-storage-crc-9v9g5" podUID="6423ea5e-20ed-4977-a842-2bc521939341" Feb 20 10:06:27 crc kubenswrapper[4962]: I0220 10:06:27.582674 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" event={"ID":"e12256fd-84a5-4a79-b750-20b5a64bd4c9","Type":"ContainerStarted","Data":"7b825f72902849a1879d093d4836387732d06857ded48764e201dfdb9b927cc6"} Feb 20 10:06:27 crc kubenswrapper[4962]: I0220 10:06:27.583273 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:27 crc kubenswrapper[4962]: I0220 10:06:27.583298 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:27 crc kubenswrapper[4962]: I0220 10:06:27.583313 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:27 crc kubenswrapper[4962]: I0220 10:06:27.628165 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:27 crc kubenswrapper[4962]: I0220 10:06:27.633193 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" podStartSLOduration=7.633162822 podStartE2EDuration="7.633162822s" podCreationTimestamp="2026-02-20 10:06:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:06:27.621288642 +0000 UTC m=+679.203760508" watchObservedRunningTime="2026-02-20 10:06:27.633162822 +0000 UTC m=+679.215634668" Feb 20 10:06:27 crc kubenswrapper[4962]: I0220 10:06:27.657365 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:28 crc kubenswrapper[4962]: I0220 10:06:28.103477 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-9v9g5"] Feb 20 10:06:28 crc kubenswrapper[4962]: I0220 10:06:28.103649 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-9v9g5" Feb 20 10:06:28 crc kubenswrapper[4962]: I0220 10:06:28.104216 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-9v9g5" Feb 20 10:06:28 crc kubenswrapper[4962]: E0220 10:06:28.135125 4962 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-9v9g5_crc-storage_6423ea5e-20ed-4977-a842-2bc521939341_0(99e25b7d0b32fa32c7890472a0271039c39f35dccfb2b21357e2f277e54734e1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 20 10:06:28 crc kubenswrapper[4962]: E0220 10:06:28.135265 4962 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-9v9g5_crc-storage_6423ea5e-20ed-4977-a842-2bc521939341_0(99e25b7d0b32fa32c7890472a0271039c39f35dccfb2b21357e2f277e54734e1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="crc-storage/crc-storage-crc-9v9g5" Feb 20 10:06:28 crc kubenswrapper[4962]: E0220 10:06:28.135341 4962 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-9v9g5_crc-storage_6423ea5e-20ed-4977-a842-2bc521939341_0(99e25b7d0b32fa32c7890472a0271039c39f35dccfb2b21357e2f277e54734e1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-9v9g5" Feb 20 10:06:28 crc kubenswrapper[4962]: E0220 10:06:28.135445 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-9v9g5_crc-storage(6423ea5e-20ed-4977-a842-2bc521939341)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-9v9g5_crc-storage(6423ea5e-20ed-4977-a842-2bc521939341)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-9v9g5_crc-storage_6423ea5e-20ed-4977-a842-2bc521939341_0(99e25b7d0b32fa32c7890472a0271039c39f35dccfb2b21357e2f277e54734e1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-9v9g5" podUID="6423ea5e-20ed-4977-a842-2bc521939341" Feb 20 10:06:35 crc kubenswrapper[4962]: I0220 10:06:35.139189 4962 scope.go:117] "RemoveContainer" containerID="1bcd3b5d415fdd3c80c493728dbec002cdd2c25c6bba4eb1580552f0bd623cef" Feb 20 10:06:35 crc kubenswrapper[4962]: E0220 10:06:35.140175 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-wqwgj_openshift-multus(1957ac70-30f9-48c2-a82b-72aa3b7a883a)\"" pod="openshift-multus/multus-wqwgj" podUID="1957ac70-30f9-48c2-a82b-72aa3b7a883a" Feb 20 10:06:40 crc kubenswrapper[4962]: I0220 10:06:40.138859 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-9v9g5" Feb 20 10:06:40 crc kubenswrapper[4962]: I0220 10:06:40.139349 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-9v9g5" Feb 20 10:06:40 crc kubenswrapper[4962]: E0220 10:06:40.168914 4962 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-9v9g5_crc-storage_6423ea5e-20ed-4977-a842-2bc521939341_0(713cdb0df4b4a43cbc42a56189ad0095c3557594cb124045197252aa33cb76b6): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 20 10:06:40 crc kubenswrapper[4962]: E0220 10:06:40.169255 4962 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-9v9g5_crc-storage_6423ea5e-20ed-4977-a842-2bc521939341_0(713cdb0df4b4a43cbc42a56189ad0095c3557594cb124045197252aa33cb76b6): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="crc-storage/crc-storage-crc-9v9g5" Feb 20 10:06:40 crc kubenswrapper[4962]: E0220 10:06:40.169293 4962 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-9v9g5_crc-storage_6423ea5e-20ed-4977-a842-2bc521939341_0(713cdb0df4b4a43cbc42a56189ad0095c3557594cb124045197252aa33cb76b6): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-9v9g5" Feb 20 10:06:40 crc kubenswrapper[4962]: E0220 10:06:40.169359 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-9v9g5_crc-storage(6423ea5e-20ed-4977-a842-2bc521939341)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-9v9g5_crc-storage(6423ea5e-20ed-4977-a842-2bc521939341)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-9v9g5_crc-storage_6423ea5e-20ed-4977-a842-2bc521939341_0(713cdb0df4b4a43cbc42a56189ad0095c3557594cb124045197252aa33cb76b6): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-9v9g5" podUID="6423ea5e-20ed-4977-a842-2bc521939341" Feb 20 10:06:41 crc kubenswrapper[4962]: I0220 10:06:41.508227 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 10:06:41 crc kubenswrapper[4962]: I0220 10:06:41.510050 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 10:06:48 crc kubenswrapper[4962]: I0220 10:06:48.139340 4962 scope.go:117] "RemoveContainer" containerID="1bcd3b5d415fdd3c80c493728dbec002cdd2c25c6bba4eb1580552f0bd623cef" Feb 20 10:06:48 crc kubenswrapper[4962]: I0220 10:06:48.728212 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wqwgj_1957ac70-30f9-48c2-a82b-72aa3b7a883a/kube-multus/2.log" Feb 20 10:06:48 crc kubenswrapper[4962]: I0220 10:06:48.728746 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wqwgj" event={"ID":"1957ac70-30f9-48c2-a82b-72aa3b7a883a","Type":"ContainerStarted","Data":"6a05be812a6e15d39000d5ee5643496f369b2005317fcf7e7f04250bb5188bfc"} Feb 20 10:06:50 crc kubenswrapper[4962]: I0220 10:06:50.692384 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-z6xc2" Feb 20 10:06:52 crc kubenswrapper[4962]: I0220 10:06:52.138497 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-9v9g5" Feb 20 10:06:52 crc kubenswrapper[4962]: I0220 10:06:52.139681 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-9v9g5" Feb 20 10:06:52 crc kubenswrapper[4962]: I0220 10:06:52.907144 4962 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 20 10:06:52 crc kubenswrapper[4962]: I0220 10:06:52.907373 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-9v9g5"] Feb 20 10:06:53 crc kubenswrapper[4962]: I0220 10:06:53.865362 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-9v9g5" event={"ID":"6423ea5e-20ed-4977-a842-2bc521939341","Type":"ContainerStarted","Data":"24267bde3cf143a1f44599c25d38b0e2ca1b3bcf78870f4f856537b7cd68bd3f"} Feb 20 10:06:54 crc kubenswrapper[4962]: I0220 10:06:54.888486 4962 generic.go:334] "Generic (PLEG): container finished" podID="6423ea5e-20ed-4977-a842-2bc521939341" containerID="dd81866a8883595a9a43e5321d2a1e397058906782cb6839d22126aa9d907feb" exitCode=0 Feb 20 10:06:54 crc kubenswrapper[4962]: I0220 10:06:54.888577 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-9v9g5" event={"ID":"6423ea5e-20ed-4977-a842-2bc521939341","Type":"ContainerDied","Data":"dd81866a8883595a9a43e5321d2a1e397058906782cb6839d22126aa9d907feb"} Feb 20 10:06:56 crc kubenswrapper[4962]: I0220 10:06:56.232993 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-9v9g5" Feb 20 10:06:56 crc kubenswrapper[4962]: I0220 10:06:56.308306 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6423ea5e-20ed-4977-a842-2bc521939341-node-mnt\") pod \"6423ea5e-20ed-4977-a842-2bc521939341\" (UID: \"6423ea5e-20ed-4977-a842-2bc521939341\") " Feb 20 10:06:56 crc kubenswrapper[4962]: I0220 10:06:56.308375 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zszw4\" (UniqueName: \"kubernetes.io/projected/6423ea5e-20ed-4977-a842-2bc521939341-kube-api-access-zszw4\") pod \"6423ea5e-20ed-4977-a842-2bc521939341\" (UID: \"6423ea5e-20ed-4977-a842-2bc521939341\") " Feb 20 10:06:56 crc kubenswrapper[4962]: I0220 10:06:56.308412 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6423ea5e-20ed-4977-a842-2bc521939341-crc-storage\") pod \"6423ea5e-20ed-4977-a842-2bc521939341\" (UID: \"6423ea5e-20ed-4977-a842-2bc521939341\") " Feb 20 10:06:56 crc kubenswrapper[4962]: I0220 10:06:56.308449 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6423ea5e-20ed-4977-a842-2bc521939341-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "6423ea5e-20ed-4977-a842-2bc521939341" (UID: "6423ea5e-20ed-4977-a842-2bc521939341"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 10:06:56 crc kubenswrapper[4962]: I0220 10:06:56.308705 4962 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6423ea5e-20ed-4977-a842-2bc521939341-node-mnt\") on node \"crc\" DevicePath \"\"" Feb 20 10:06:56 crc kubenswrapper[4962]: I0220 10:06:56.313292 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6423ea5e-20ed-4977-a842-2bc521939341-kube-api-access-zszw4" (OuterVolumeSpecName: "kube-api-access-zszw4") pod "6423ea5e-20ed-4977-a842-2bc521939341" (UID: "6423ea5e-20ed-4977-a842-2bc521939341"). 
InnerVolumeSpecName "kube-api-access-zszw4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:06:56 crc kubenswrapper[4962]: I0220 10:06:56.330724 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6423ea5e-20ed-4977-a842-2bc521939341-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "6423ea5e-20ed-4977-a842-2bc521939341" (UID: "6423ea5e-20ed-4977-a842-2bc521939341"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:06:56 crc kubenswrapper[4962]: I0220 10:06:56.410533 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zszw4\" (UniqueName: \"kubernetes.io/projected/6423ea5e-20ed-4977-a842-2bc521939341-kube-api-access-zszw4\") on node \"crc\" DevicePath \"\"" Feb 20 10:06:56 crc kubenswrapper[4962]: I0220 10:06:56.410582 4962 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6423ea5e-20ed-4977-a842-2bc521939341-crc-storage\") on node \"crc\" DevicePath \"\"" Feb 20 10:06:56 crc kubenswrapper[4962]: I0220 10:06:56.908119 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-9v9g5" event={"ID":"6423ea5e-20ed-4977-a842-2bc521939341","Type":"ContainerDied","Data":"24267bde3cf143a1f44599c25d38b0e2ca1b3bcf78870f4f856537b7cd68bd3f"} Feb 20 10:06:56 crc kubenswrapper[4962]: I0220 10:06:56.908202 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="24267bde3cf143a1f44599c25d38b0e2ca1b3bcf78870f4f856537b7cd68bd3f" Feb 20 10:06:56 crc kubenswrapper[4962]: I0220 10:06:56.908246 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-9v9g5" Feb 20 10:07:04 crc kubenswrapper[4962]: I0220 10:07:04.095058 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca4n626"] Feb 20 10:07:04 crc kubenswrapper[4962]: E0220 10:07:04.096572 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6423ea5e-20ed-4977-a842-2bc521939341" containerName="storage" Feb 20 10:07:04 crc kubenswrapper[4962]: I0220 10:07:04.096699 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="6423ea5e-20ed-4977-a842-2bc521939341" containerName="storage" Feb 20 10:07:04 crc kubenswrapper[4962]: I0220 10:07:04.097409 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="6423ea5e-20ed-4977-a842-2bc521939341" containerName="storage" Feb 20 10:07:04 crc kubenswrapper[4962]: I0220 10:07:04.100692 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca4n626" Feb 20 10:07:04 crc kubenswrapper[4962]: I0220 10:07:04.104209 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 20 10:07:04 crc kubenswrapper[4962]: I0220 10:07:04.117357 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca4n626"] Feb 20 10:07:04 crc kubenswrapper[4962]: I0220 10:07:04.231877 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x496s\" (UniqueName: \"kubernetes.io/projected/650d9c53-94de-499d-8498-53afa3428c06-kube-api-access-x496s\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca4n626\" (UID: \"650d9c53-94de-499d-8498-53afa3428c06\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca4n626" Feb 20 10:07:04 crc kubenswrapper[4962]: I0220 10:07:04.231989 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/650d9c53-94de-499d-8498-53afa3428c06-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca4n626\" (UID: \"650d9c53-94de-499d-8498-53afa3428c06\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca4n626" Feb 20 10:07:04 crc kubenswrapper[4962]: I0220 10:07:04.232030 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/650d9c53-94de-499d-8498-53afa3428c06-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca4n626\" (UID: \"650d9c53-94de-499d-8498-53afa3428c06\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca4n626" Feb 20 10:07:04 crc kubenswrapper[4962]: I0220 10:07:04.333923 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x496s\" (UniqueName: \"kubernetes.io/projected/650d9c53-94de-499d-8498-53afa3428c06-kube-api-access-x496s\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca4n626\" (UID: \"650d9c53-94de-499d-8498-53afa3428c06\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca4n626" Feb 20 10:07:04 crc kubenswrapper[4962]: I0220 10:07:04.334243 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/650d9c53-94de-499d-8498-53afa3428c06-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca4n626\" (UID: \"650d9c53-94de-499d-8498-53afa3428c06\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca4n626" Feb 20 10:07:04 crc kubenswrapper[4962]: I0220 10:07:04.334386 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/650d9c53-94de-499d-8498-53afa3428c06-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca4n626\" (UID: \"650d9c53-94de-499d-8498-53afa3428c06\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca4n626" Feb 20 10:07:04 crc kubenswrapper[4962]: I0220 10:07:04.335253 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/650d9c53-94de-499d-8498-53afa3428c06-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca4n626\" (UID: \"650d9c53-94de-499d-8498-53afa3428c06\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca4n626" Feb 20 10:07:04 crc kubenswrapper[4962]: I0220 10:07:04.336043 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/650d9c53-94de-499d-8498-53afa3428c06-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca4n626\" (UID: \"650d9c53-94de-499d-8498-53afa3428c06\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca4n626" Feb 20 10:07:04 crc kubenswrapper[4962]: I0220 10:07:04.352815 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x496s\" (UniqueName: \"kubernetes.io/projected/650d9c53-94de-499d-8498-53afa3428c06-kube-api-access-x496s\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca4n626\" (UID: \"650d9c53-94de-499d-8498-53afa3428c06\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca4n626" Feb 20 10:07:04 crc kubenswrapper[4962]: I0220 10:07:04.452623 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca4n626" Feb 20 10:07:04 crc kubenswrapper[4962]: I0220 10:07:04.635085 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca4n626"] Feb 20 10:07:04 crc kubenswrapper[4962]: I0220 10:07:04.974059 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca4n626" event={"ID":"650d9c53-94de-499d-8498-53afa3428c06","Type":"ContainerStarted","Data":"d91be3332b27241cee61bcd7398abd7cbbcb1f51878aeb387f65fde8ffd73790"} Feb 20 10:07:04 crc kubenswrapper[4962]: I0220 10:07:04.974176 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca4n626" event={"ID":"650d9c53-94de-499d-8498-53afa3428c06","Type":"ContainerStarted","Data":"72a64b6a19e16dda103631e55a0bb6921c7420e56b8354feeb8259198c1f1b3c"} Feb 20 10:07:05 crc kubenswrapper[4962]: I0220 10:07:05.986018 4962 generic.go:334] "Generic (PLEG): container finished" podID="650d9c53-94de-499d-8498-53afa3428c06" containerID="d91be3332b27241cee61bcd7398abd7cbbcb1f51878aeb387f65fde8ffd73790" exitCode=0 Feb 20 10:07:05 crc kubenswrapper[4962]: I0220 10:07:05.986166 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca4n626" event={"ID":"650d9c53-94de-499d-8498-53afa3428c06","Type":"ContainerDied","Data":"d91be3332b27241cee61bcd7398abd7cbbcb1f51878aeb387f65fde8ffd73790"} Feb 20 10:07:07 crc kubenswrapper[4962]: I0220 10:07:07.998949 4962 generic.go:334] "Generic (PLEG): container finished" podID="650d9c53-94de-499d-8498-53afa3428c06" containerID="4f992657730342a0f2eba9a2f8eaffbf3b181d1276791b34eb4ef99967e83016" exitCode=0 Feb 20 10:07:07 crc kubenswrapper[4962]: I0220 10:07:07.999002 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca4n626" 
event={"ID":"650d9c53-94de-499d-8498-53afa3428c06","Type":"ContainerDied","Data":"4f992657730342a0f2eba9a2f8eaffbf3b181d1276791b34eb4ef99967e83016"} Feb 20 10:07:09 crc kubenswrapper[4962]: I0220 10:07:09.011901 4962 generic.go:334] "Generic (PLEG): container finished" podID="650d9c53-94de-499d-8498-53afa3428c06" containerID="acb768d317e1555893a5b7aedc9f487d7cb71f8dabc978f5cafa020c7b1863ba" exitCode=0 Feb 20 10:07:09 crc kubenswrapper[4962]: I0220 10:07:09.011982 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca4n626" event={"ID":"650d9c53-94de-499d-8498-53afa3428c06","Type":"ContainerDied","Data":"acb768d317e1555893a5b7aedc9f487d7cb71f8dabc978f5cafa020c7b1863ba"} Feb 20 10:07:10 crc kubenswrapper[4962]: I0220 10:07:10.315084 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca4n626" Feb 20 10:07:10 crc kubenswrapper[4962]: I0220 10:07:10.445353 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x496s\" (UniqueName: \"kubernetes.io/projected/650d9c53-94de-499d-8498-53afa3428c06-kube-api-access-x496s\") pod \"650d9c53-94de-499d-8498-53afa3428c06\" (UID: \"650d9c53-94de-499d-8498-53afa3428c06\") " Feb 20 10:07:10 crc kubenswrapper[4962]: I0220 10:07:10.445439 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/650d9c53-94de-499d-8498-53afa3428c06-util\") pod \"650d9c53-94de-499d-8498-53afa3428c06\" (UID: \"650d9c53-94de-499d-8498-53afa3428c06\") " Feb 20 10:07:10 crc kubenswrapper[4962]: I0220 10:07:10.445587 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/650d9c53-94de-499d-8498-53afa3428c06-bundle\") pod \"650d9c53-94de-499d-8498-53afa3428c06\" (UID: \"650d9c53-94de-499d-8498-53afa3428c06\") " Feb 20 10:07:10 crc kubenswrapper[4962]: I0220 10:07:10.446121 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/650d9c53-94de-499d-8498-53afa3428c06-bundle" (OuterVolumeSpecName: "bundle") pod "650d9c53-94de-499d-8498-53afa3428c06" (UID: "650d9c53-94de-499d-8498-53afa3428c06"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:07:10 crc kubenswrapper[4962]: I0220 10:07:10.450951 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/650d9c53-94de-499d-8498-53afa3428c06-kube-api-access-x496s" (OuterVolumeSpecName: "kube-api-access-x496s") pod "650d9c53-94de-499d-8498-53afa3428c06" (UID: "650d9c53-94de-499d-8498-53afa3428c06"). InnerVolumeSpecName "kube-api-access-x496s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:07:10 crc kubenswrapper[4962]: I0220 10:07:10.548022 4962 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/650d9c53-94de-499d-8498-53afa3428c06-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:07:10 crc kubenswrapper[4962]: I0220 10:07:10.548072 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x496s\" (UniqueName: \"kubernetes.io/projected/650d9c53-94de-499d-8498-53afa3428c06-kube-api-access-x496s\") on node \"crc\" DevicePath \"\"" Feb 20 10:07:10 crc kubenswrapper[4962]: I0220 10:07:10.553311 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/650d9c53-94de-499d-8498-53afa3428c06-util" (OuterVolumeSpecName: "util") pod "650d9c53-94de-499d-8498-53afa3428c06" (UID: "650d9c53-94de-499d-8498-53afa3428c06"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:07:10 crc kubenswrapper[4962]: I0220 10:07:10.649008 4962 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/650d9c53-94de-499d-8498-53afa3428c06-util\") on node \"crc\" DevicePath \"\"" Feb 20 10:07:11 crc kubenswrapper[4962]: I0220 10:07:11.030784 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca4n626" event={"ID":"650d9c53-94de-499d-8498-53afa3428c06","Type":"ContainerDied","Data":"72a64b6a19e16dda103631e55a0bb6921c7420e56b8354feeb8259198c1f1b3c"} Feb 20 10:07:11 crc kubenswrapper[4962]: I0220 10:07:11.030834 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="72a64b6a19e16dda103631e55a0bb6921c7420e56b8354feeb8259198c1f1b3c" Feb 20 10:07:11 crc kubenswrapper[4962]: I0220 10:07:11.030888 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca4n626" Feb 20 10:07:11 crc kubenswrapper[4962]: I0220 10:07:11.508425 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 10:07:11 crc kubenswrapper[4962]: I0220 10:07:11.508948 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 10:07:13 crc kubenswrapper[4962]: I0220 10:07:13.162345 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-nkzm2"] Feb 20 10:07:13 crc kubenswrapper[4962]: E0220 10:07:13.162708 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="650d9c53-94de-499d-8498-53afa3428c06" containerName="extract" Feb 20 10:07:13 crc kubenswrapper[4962]: I0220 10:07:13.162726 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="650d9c53-94de-499d-8498-53afa3428c06" containerName="extract" Feb 20 10:07:13 crc kubenswrapper[4962]: E0220 10:07:13.162745 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="650d9c53-94de-499d-8498-53afa3428c06" containerName="util" Feb 20 10:07:13 crc kubenswrapper[4962]: I0220 10:07:13.162754 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="650d9c53-94de-499d-8498-53afa3428c06" containerName="util" Feb 20 10:07:13 crc kubenswrapper[4962]: E0220 10:07:13.162766 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="650d9c53-94de-499d-8498-53afa3428c06" containerName="pull" Feb 20 10:07:13 crc kubenswrapper[4962]: I0220 10:07:13.162774 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="650d9c53-94de-499d-8498-53afa3428c06" containerName="pull" Feb 20 10:07:13 crc kubenswrapper[4962]: I0220 10:07:13.162890 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="650d9c53-94de-499d-8498-53afa3428c06" containerName="extract" Feb 20 10:07:13 crc kubenswrapper[4962]: I0220 10:07:13.163425 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-694c9596b7-nkzm2" Feb 20 10:07:13 crc kubenswrapper[4962]: I0220 10:07:13.170404 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-46q2p" Feb 20 10:07:13 crc kubenswrapper[4962]: I0220 10:07:13.171154 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Feb 20 10:07:13 crc kubenswrapper[4962]: I0220 10:07:13.175094 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Feb 20 10:07:13 crc kubenswrapper[4962]: I0220 10:07:13.175289 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-nkzm2"] Feb 20 10:07:13 crc kubenswrapper[4962]: I0220 10:07:13.313407 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xx4jj\" (UniqueName: \"kubernetes.io/projected/cffc71cf-18b7-4733-b863-19b8664b5cf4-kube-api-access-xx4jj\") pod \"nmstate-operator-694c9596b7-nkzm2\" (UID: \"cffc71cf-18b7-4733-b863-19b8664b5cf4\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-nkzm2" Feb 20 10:07:13 crc kubenswrapper[4962]: I0220 10:07:13.415041 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xx4jj\" (UniqueName: \"kubernetes.io/projected/cffc71cf-18b7-4733-b863-19b8664b5cf4-kube-api-access-xx4jj\") pod \"nmstate-operator-694c9596b7-nkzm2\" (UID: \"cffc71cf-18b7-4733-b863-19b8664b5cf4\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-nkzm2" Feb 20 10:07:13 crc kubenswrapper[4962]: I0220 10:07:13.433885 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xx4jj\" (UniqueName: \"kubernetes.io/projected/cffc71cf-18b7-4733-b863-19b8664b5cf4-kube-api-access-xx4jj\") pod \"nmstate-operator-694c9596b7-nkzm2\" (UID: \"cffc71cf-18b7-4733-b863-19b8664b5cf4\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-nkzm2" Feb 20 10:07:13 crc kubenswrapper[4962]: I0220 10:07:13.483576 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-694c9596b7-nkzm2" Feb 20 10:07:13 crc kubenswrapper[4962]: I0220 10:07:13.679565 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-nkzm2"] Feb 20 10:07:13 crc kubenswrapper[4962]: W0220 10:07:13.691727 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcffc71cf_18b7_4733_b863_19b8664b5cf4.slice/crio-c9ec9d6dcf363c113f2d52a43d4b95553ef776e08476fce579f6f6b560d45443 WatchSource:0}: Error finding container c9ec9d6dcf363c113f2d52a43d4b95553ef776e08476fce579f6f6b560d45443: Status 404 returned error can't find the container with id c9ec9d6dcf363c113f2d52a43d4b95553ef776e08476fce579f6f6b560d45443 Feb 20 10:07:14 crc kubenswrapper[4962]: I0220 10:07:14.050860 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-694c9596b7-nkzm2" event={"ID":"cffc71cf-18b7-4733-b863-19b8664b5cf4","Type":"ContainerStarted","Data":"c9ec9d6dcf363c113f2d52a43d4b95553ef776e08476fce579f6f6b560d45443"} Feb 20 10:07:16 crc kubenswrapper[4962]: I0220 10:07:16.064916 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-694c9596b7-nkzm2" event={"ID":"cffc71cf-18b7-4733-b863-19b8664b5cf4","Type":"ContainerStarted","Data":"7d1a3616fffbfda6dbc09182574a8da7aa4ffec6021e3f24b359b239cfb0e195"} Feb 20 10:07:16 crc kubenswrapper[4962]: I0220 10:07:16.082716 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-694c9596b7-nkzm2" podStartSLOduration=1.332512538 podStartE2EDuration="3.082694122s" podCreationTimestamp="2026-02-20 10:07:13 +0000 UTC" firstStartedPulling="2026-02-20 10:07:13.694986197 +0000 UTC m=+725.277458043" lastFinishedPulling="2026-02-20 10:07:15.445167781 +0000 UTC m=+727.027639627" observedRunningTime="2026-02-20 10:07:16.079194702 +0000 UTC m=+727.661666558" watchObservedRunningTime="2026-02-20 10:07:16.082694122 +0000 UTC m=+727.665165968" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.270504 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-6x8wh"] Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.272507 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58c85c668d-6x8wh" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.275956 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-tp5zn" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.281788 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-6x8wh"] Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.288228 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-l2lqb"] Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.289147 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-l2lqb" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.296043 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-frtsf"] Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.297076 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-frtsf" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.301700 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.315263 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-l2lqb"] Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.386285 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nqwp\" (UniqueName: \"kubernetes.io/projected/edcc687e-09ef-4048-8db7-d67e6fe23212-kube-api-access-6nqwp\") pod \"nmstate-metrics-58c85c668d-6x8wh\" (UID: \"edcc687e-09ef-4048-8db7-d67e6fe23212\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-6x8wh" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.386340 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/5056ae4f-c2f7-41f5-8e12-b7b5d8996852-ovs-socket\") pod \"nmstate-handler-frtsf\" (UID: \"5056ae4f-c2f7-41f5-8e12-b7b5d8996852\") " pod="openshift-nmstate/nmstate-handler-frtsf" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.386369 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/5056ae4f-c2f7-41f5-8e12-b7b5d8996852-nmstate-lock\") pod \"nmstate-handler-frtsf\" (UID: \"5056ae4f-c2f7-41f5-8e12-b7b5d8996852\") " pod="openshift-nmstate/nmstate-handler-frtsf" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.386445 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkj9j\" (UniqueName: \"kubernetes.io/projected/5056ae4f-c2f7-41f5-8e12-b7b5d8996852-kube-api-access-zkj9j\") pod \"nmstate-handler-frtsf\" (UID: \"5056ae4f-c2f7-41f5-8e12-b7b5d8996852\") " pod="openshift-nmstate/nmstate-handler-frtsf" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.386471 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/5056ae4f-c2f7-41f5-8e12-b7b5d8996852-dbus-socket\") pod \"nmstate-handler-frtsf\" (UID: \"5056ae4f-c2f7-41f5-8e12-b7b5d8996852\") " pod="openshift-nmstate/nmstate-handler-frtsf" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.386566 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/a453e12b-e95c-4c04-b67b-b5bc6527a3ab-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-l2lqb\" (UID: \"a453e12b-e95c-4c04-b67b-b5bc6527a3ab\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-l2lqb" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.386608 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rb7dj\" (UniqueName: \"kubernetes.io/projected/a453e12b-e95c-4c04-b67b-b5bc6527a3ab-kube-api-access-rb7dj\") pod \"nmstate-webhook-866bcb46dc-l2lqb\" (UID: \"a453e12b-e95c-4c04-b67b-b5bc6527a3ab\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-l2lqb" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.409356 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-2hqd7"] Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 
10:07:24.410118 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-2hqd7" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.421135 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.421176 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.426930 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-27lrn" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.428023 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-2hqd7"] Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.488247 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/5056ae4f-c2f7-41f5-8e12-b7b5d8996852-nmstate-lock\") pod \"nmstate-handler-frtsf\" (UID: \"5056ae4f-c2f7-41f5-8e12-b7b5d8996852\") " pod="openshift-nmstate/nmstate-handler-frtsf" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.488332 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ll4cj\" (UniqueName: \"kubernetes.io/projected/e17e90c9-fe19-4544-9a79-bffc8072a763-kube-api-access-ll4cj\") pod \"nmstate-console-plugin-5c78fc5d65-2hqd7\" (UID: \"e17e90c9-fe19-4544-9a79-bffc8072a763\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-2hqd7" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.488361 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkj9j\" (UniqueName: \"kubernetes.io/projected/5056ae4f-c2f7-41f5-8e12-b7b5d8996852-kube-api-access-zkj9j\") pod \"nmstate-handler-frtsf\" (UID: \"5056ae4f-c2f7-41f5-8e12-b7b5d8996852\") " pod="openshift-nmstate/nmstate-handler-frtsf" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.488383 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/5056ae4f-c2f7-41f5-8e12-b7b5d8996852-dbus-socket\") pod \"nmstate-handler-frtsf\" (UID: \"5056ae4f-c2f7-41f5-8e12-b7b5d8996852\") " pod="openshift-nmstate/nmstate-handler-frtsf" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.488446 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/e17e90c9-fe19-4544-9a79-bffc8072a763-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-2hqd7\" (UID: \"e17e90c9-fe19-4544-9a79-bffc8072a763\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-2hqd7" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.488468 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/a453e12b-e95c-4c04-b67b-b5bc6527a3ab-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-l2lqb\" (UID: \"a453e12b-e95c-4c04-b67b-b5bc6527a3ab\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-l2lqb" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.488516 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rb7dj\" (UniqueName: 
\"kubernetes.io/projected/a453e12b-e95c-4c04-b67b-b5bc6527a3ab-kube-api-access-rb7dj\") pod \"nmstate-webhook-866bcb46dc-l2lqb\" (UID: \"a453e12b-e95c-4c04-b67b-b5bc6527a3ab\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-l2lqb" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.488581 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nqwp\" (UniqueName: \"kubernetes.io/projected/edcc687e-09ef-4048-8db7-d67e6fe23212-kube-api-access-6nqwp\") pod \"nmstate-metrics-58c85c668d-6x8wh\" (UID: \"edcc687e-09ef-4048-8db7-d67e6fe23212\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-6x8wh" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.488659 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/5056ae4f-c2f7-41f5-8e12-b7b5d8996852-ovs-socket\") pod \"nmstate-handler-frtsf\" (UID: \"5056ae4f-c2f7-41f5-8e12-b7b5d8996852\") " pod="openshift-nmstate/nmstate-handler-frtsf" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.488679 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/e17e90c9-fe19-4544-9a79-bffc8072a763-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-2hqd7\" (UID: \"e17e90c9-fe19-4544-9a79-bffc8072a763\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-2hqd7" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.488751 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/5056ae4f-c2f7-41f5-8e12-b7b5d8996852-nmstate-lock\") pod \"nmstate-handler-frtsf\" (UID: \"5056ae4f-c2f7-41f5-8e12-b7b5d8996852\") " pod="openshift-nmstate/nmstate-handler-frtsf" Feb 20 10:07:24 crc kubenswrapper[4962]: E0220 10:07:24.489292 4962 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Feb 20 10:07:24 crc kubenswrapper[4962]: E0220 10:07:24.489353 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a453e12b-e95c-4c04-b67b-b5bc6527a3ab-tls-key-pair podName:a453e12b-e95c-4c04-b67b-b5bc6527a3ab nodeName:}" failed. No retries permitted until 2026-02-20 10:07:24.989332613 +0000 UTC m=+736.571804459 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/a453e12b-e95c-4c04-b67b-b5bc6527a3ab-tls-key-pair") pod "nmstate-webhook-866bcb46dc-l2lqb" (UID: "a453e12b-e95c-4c04-b67b-b5bc6527a3ab") : secret "openshift-nmstate-webhook" not found Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.489516 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/5056ae4f-c2f7-41f5-8e12-b7b5d8996852-ovs-socket\") pod \"nmstate-handler-frtsf\" (UID: \"5056ae4f-c2f7-41f5-8e12-b7b5d8996852\") " pod="openshift-nmstate/nmstate-handler-frtsf" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.489297 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/5056ae4f-c2f7-41f5-8e12-b7b5d8996852-dbus-socket\") pod \"nmstate-handler-frtsf\" (UID: \"5056ae4f-c2f7-41f5-8e12-b7b5d8996852\") " pod="openshift-nmstate/nmstate-handler-frtsf" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.511262 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nqwp\" (UniqueName: \"kubernetes.io/projected/edcc687e-09ef-4048-8db7-d67e6fe23212-kube-api-access-6nqwp\") pod \"nmstate-metrics-58c85c668d-6x8wh\" (UID: \"edcc687e-09ef-4048-8db7-d67e6fe23212\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-6x8wh" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.524524 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkj9j\" (UniqueName: \"kubernetes.io/projected/5056ae4f-c2f7-41f5-8e12-b7b5d8996852-kube-api-access-zkj9j\") pod \"nmstate-handler-frtsf\" (UID: \"5056ae4f-c2f7-41f5-8e12-b7b5d8996852\") " pod="openshift-nmstate/nmstate-handler-frtsf" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.524620 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rb7dj\" (UniqueName: \"kubernetes.io/projected/a453e12b-e95c-4c04-b67b-b5bc6527a3ab-kube-api-access-rb7dj\") pod \"nmstate-webhook-866bcb46dc-l2lqb\" (UID: \"a453e12b-e95c-4c04-b67b-b5bc6527a3ab\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-l2lqb" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.590340 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/e17e90c9-fe19-4544-9a79-bffc8072a763-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-2hqd7\" (UID: \"e17e90c9-fe19-4544-9a79-bffc8072a763\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-2hqd7" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.590410 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ll4cj\" (UniqueName: \"kubernetes.io/projected/e17e90c9-fe19-4544-9a79-bffc8072a763-kube-api-access-ll4cj\") pod \"nmstate-console-plugin-5c78fc5d65-2hqd7\" (UID: \"e17e90c9-fe19-4544-9a79-bffc8072a763\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-2hqd7" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.590452 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/e17e90c9-fe19-4544-9a79-bffc8072a763-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-2hqd7\" (UID: \"e17e90c9-fe19-4544-9a79-bffc8072a763\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-2hqd7" Feb 20 10:07:24 crc kubenswrapper[4962]: E0220 
10:07:24.590659 4962 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Feb 20 10:07:24 crc kubenswrapper[4962]: E0220 10:07:24.590723 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e17e90c9-fe19-4544-9a79-bffc8072a763-plugin-serving-cert podName:e17e90c9-fe19-4544-9a79-bffc8072a763 nodeName:}" failed. No retries permitted until 2026-02-20 10:07:25.090701561 +0000 UTC m=+736.673173397 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/e17e90c9-fe19-4544-9a79-bffc8072a763-plugin-serving-cert") pod "nmstate-console-plugin-5c78fc5d65-2hqd7" (UID: "e17e90c9-fe19-4544-9a79-bffc8072a763") : secret "plugin-serving-cert" not found Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.591961 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/e17e90c9-fe19-4544-9a79-bffc8072a763-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-2hqd7\" (UID: \"e17e90c9-fe19-4544-9a79-bffc8072a763\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-2hqd7" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.601627 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58c85c668d-6x8wh" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.608956 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-7f9d58689-28fst"] Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.610409 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7f9d58689-28fst" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.619329 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ll4cj\" (UniqueName: \"kubernetes.io/projected/e17e90c9-fe19-4544-9a79-bffc8072a763-kube-api-access-ll4cj\") pod \"nmstate-console-plugin-5c78fc5d65-2hqd7\" (UID: \"e17e90c9-fe19-4544-9a79-bffc8072a763\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-2hqd7" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.627089 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-frtsf" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.689739 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7f9d58689-28fst"] Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.691931 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0de52220-59c4-423b-80c3-b737466ac45f-console-serving-cert\") pod \"console-7f9d58689-28fst\" (UID: \"0de52220-59c4-423b-80c3-b737466ac45f\") " pod="openshift-console/console-7f9d58689-28fst" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.691983 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qhq2\" (UniqueName: \"kubernetes.io/projected/0de52220-59c4-423b-80c3-b737466ac45f-kube-api-access-6qhq2\") pod \"console-7f9d58689-28fst\" (UID: \"0de52220-59c4-423b-80c3-b737466ac45f\") " pod="openshift-console/console-7f9d58689-28fst" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.692025 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0de52220-59c4-423b-80c3-b737466ac45f-trusted-ca-bundle\") pod \"console-7f9d58689-28fst\" (UID: \"0de52220-59c4-423b-80c3-b737466ac45f\") " pod="openshift-console/console-7f9d58689-28fst" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.692097 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0de52220-59c4-423b-80c3-b737466ac45f-service-ca\") pod \"console-7f9d58689-28fst\" (UID: \"0de52220-59c4-423b-80c3-b737466ac45f\") " pod="openshift-console/console-7f9d58689-28fst" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.692119 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0de52220-59c4-423b-80c3-b737466ac45f-console-config\") pod \"console-7f9d58689-28fst\" (UID: \"0de52220-59c4-423b-80c3-b737466ac45f\") " pod="openshift-console/console-7f9d58689-28fst" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.692163 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0de52220-59c4-423b-80c3-b737466ac45f-console-oauth-config\") pod \"console-7f9d58689-28fst\" (UID: \"0de52220-59c4-423b-80c3-b737466ac45f\") " pod="openshift-console/console-7f9d58689-28fst" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.692180 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0de52220-59c4-423b-80c3-b737466ac45f-oauth-serving-cert\") pod \"console-7f9d58689-28fst\" (UID: \"0de52220-59c4-423b-80c3-b737466ac45f\") " pod="openshift-console/console-7f9d58689-28fst" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.793884 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0de52220-59c4-423b-80c3-b737466ac45f-console-serving-cert\") pod \"console-7f9d58689-28fst\" (UID: \"0de52220-59c4-423b-80c3-b737466ac45f\") " pod="openshift-console/console-7f9d58689-28fst" Feb 20 10:07:24 crc kubenswrapper[4962]: 
I0220 10:07:24.793955 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qhq2\" (UniqueName: \"kubernetes.io/projected/0de52220-59c4-423b-80c3-b737466ac45f-kube-api-access-6qhq2\") pod \"console-7f9d58689-28fst\" (UID: \"0de52220-59c4-423b-80c3-b737466ac45f\") " pod="openshift-console/console-7f9d58689-28fst" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.793989 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0de52220-59c4-423b-80c3-b737466ac45f-trusted-ca-bundle\") pod \"console-7f9d58689-28fst\" (UID: \"0de52220-59c4-423b-80c3-b737466ac45f\") " pod="openshift-console/console-7f9d58689-28fst" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.794058 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0de52220-59c4-423b-80c3-b737466ac45f-service-ca\") pod \"console-7f9d58689-28fst\" (UID: \"0de52220-59c4-423b-80c3-b737466ac45f\") " pod="openshift-console/console-7f9d58689-28fst" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.794212 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0de52220-59c4-423b-80c3-b737466ac45f-console-config\") pod \"console-7f9d58689-28fst\" (UID: \"0de52220-59c4-423b-80c3-b737466ac45f\") " pod="openshift-console/console-7f9d58689-28fst" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.794420 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0de52220-59c4-423b-80c3-b737466ac45f-console-oauth-config\") pod \"console-7f9d58689-28fst\" (UID: \"0de52220-59c4-423b-80c3-b737466ac45f\") " pod="openshift-console/console-7f9d58689-28fst" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.794458 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0de52220-59c4-423b-80c3-b737466ac45f-oauth-serving-cert\") pod \"console-7f9d58689-28fst\" (UID: \"0de52220-59c4-423b-80c3-b737466ac45f\") " pod="openshift-console/console-7f9d58689-28fst" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.795087 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0de52220-59c4-423b-80c3-b737466ac45f-service-ca\") pod \"console-7f9d58689-28fst\" (UID: \"0de52220-59c4-423b-80c3-b737466ac45f\") " pod="openshift-console/console-7f9d58689-28fst" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.795449 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0de52220-59c4-423b-80c3-b737466ac45f-oauth-serving-cert\") pod \"console-7f9d58689-28fst\" (UID: \"0de52220-59c4-423b-80c3-b737466ac45f\") " pod="openshift-console/console-7f9d58689-28fst" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.796549 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0de52220-59c4-423b-80c3-b737466ac45f-console-config\") pod \"console-7f9d58689-28fst\" (UID: \"0de52220-59c4-423b-80c3-b737466ac45f\") " pod="openshift-console/console-7f9d58689-28fst" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.799755 4962 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0de52220-59c4-423b-80c3-b737466ac45f-trusted-ca-bundle\") pod \"console-7f9d58689-28fst\" (UID: \"0de52220-59c4-423b-80c3-b737466ac45f\") " pod="openshift-console/console-7f9d58689-28fst" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.802467 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0de52220-59c4-423b-80c3-b737466ac45f-console-oauth-config\") pod \"console-7f9d58689-28fst\" (UID: \"0de52220-59c4-423b-80c3-b737466ac45f\") " pod="openshift-console/console-7f9d58689-28fst" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.806244 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0de52220-59c4-423b-80c3-b737466ac45f-console-serving-cert\") pod \"console-7f9d58689-28fst\" (UID: \"0de52220-59c4-423b-80c3-b737466ac45f\") " pod="openshift-console/console-7f9d58689-28fst" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.812170 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qhq2\" (UniqueName: \"kubernetes.io/projected/0de52220-59c4-423b-80c3-b737466ac45f-kube-api-access-6qhq2\") pod \"console-7f9d58689-28fst\" (UID: \"0de52220-59c4-423b-80c3-b737466ac45f\") " pod="openshift-console/console-7f9d58689-28fst" Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.850154 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-6x8wh"] Feb 20 10:07:24 crc kubenswrapper[4962]: I0220 10:07:24.997719 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/a453e12b-e95c-4c04-b67b-b5bc6527a3ab-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-l2lqb\" (UID: \"a453e12b-e95c-4c04-b67b-b5bc6527a3ab\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-l2lqb" Feb 20 10:07:25 crc kubenswrapper[4962]: I0220 10:07:25.000781 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7f9d58689-28fst" Feb 20 10:07:25 crc kubenswrapper[4962]: I0220 10:07:25.001523 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/a453e12b-e95c-4c04-b67b-b5bc6527a3ab-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-l2lqb\" (UID: \"a453e12b-e95c-4c04-b67b-b5bc6527a3ab\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-l2lqb" Feb 20 10:07:25 crc kubenswrapper[4962]: I0220 10:07:25.099785 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/e17e90c9-fe19-4544-9a79-bffc8072a763-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-2hqd7\" (UID: \"e17e90c9-fe19-4544-9a79-bffc8072a763\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-2hqd7" Feb 20 10:07:25 crc kubenswrapper[4962]: I0220 10:07:25.103895 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/e17e90c9-fe19-4544-9a79-bffc8072a763-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-2hqd7\" (UID: \"e17e90c9-fe19-4544-9a79-bffc8072a763\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-2hqd7" Feb 20 10:07:25 crc kubenswrapper[4962]: I0220 10:07:25.121967 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-frtsf" event={"ID":"5056ae4f-c2f7-41f5-8e12-b7b5d8996852","Type":"ContainerStarted","Data":"72b6617c0e471f826f5fc5f1b60506b802b42453bae6a1cefce486c81091337f"} Feb 20 10:07:25 crc kubenswrapper[4962]: I0220 10:07:25.123120 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-6x8wh" event={"ID":"edcc687e-09ef-4048-8db7-d67e6fe23212","Type":"ContainerStarted","Data":"8b9b247a0dd6e5e8b90ee0af81501f91d9e8a8835eb8907bb9a741ad95002d55"} Feb 20 10:07:25 crc kubenswrapper[4962]: I0220 10:07:25.215859 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-l2lqb" Feb 20 10:07:25 crc kubenswrapper[4962]: I0220 10:07:25.244450 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7f9d58689-28fst"] Feb 20 10:07:25 crc kubenswrapper[4962]: W0220 10:07:25.253741 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0de52220_59c4_423b_80c3_b737466ac45f.slice/crio-99fba780e600e3cc03255a1a0738f81cbe52918b0b1a0f213573e1346807ac25 WatchSource:0}: Error finding container 99fba780e600e3cc03255a1a0738f81cbe52918b0b1a0f213573e1346807ac25: Status 404 returned error can't find the container with id 99fba780e600e3cc03255a1a0738f81cbe52918b0b1a0f213573e1346807ac25 Feb 20 10:07:25 crc kubenswrapper[4962]: I0220 10:07:25.335881 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-2hqd7" Feb 20 10:07:25 crc kubenswrapper[4962]: I0220 10:07:25.480786 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-l2lqb"] Feb 20 10:07:25 crc kubenswrapper[4962]: I0220 10:07:25.575280 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-2hqd7"] Feb 20 10:07:26 crc kubenswrapper[4962]: I0220 10:07:26.130752 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-l2lqb" event={"ID":"a453e12b-e95c-4c04-b67b-b5bc6527a3ab","Type":"ContainerStarted","Data":"f5aae3b0bdc26260dee4febc6406bb1315eff3d733d47005692e3dbe68f85c77"} Feb 20 10:07:26 crc kubenswrapper[4962]: I0220 10:07:26.132883 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7f9d58689-28fst" event={"ID":"0de52220-59c4-423b-80c3-b737466ac45f","Type":"ContainerStarted","Data":"9670c5a06f5878d5fd68f1475e9173d76b30ab24f827a7ae446f35372fa9c420"} Feb 20 10:07:26 crc kubenswrapper[4962]: I0220 10:07:26.132936 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7f9d58689-28fst" event={"ID":"0de52220-59c4-423b-80c3-b737466ac45f","Type":"ContainerStarted","Data":"99fba780e600e3cc03255a1a0738f81cbe52918b0b1a0f213573e1346807ac25"} Feb 20 10:07:26 crc kubenswrapper[4962]: I0220 10:07:26.134449 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-2hqd7" event={"ID":"e17e90c9-fe19-4544-9a79-bffc8072a763","Type":"ContainerStarted","Data":"cf7d427dabfe35c4069684117a2c755c5d41c3ce4825a1681c1d2fd34a2cbefa"} Feb 20 10:07:26 crc kubenswrapper[4962]: I0220 10:07:26.153709 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7f9d58689-28fst" podStartSLOduration=2.153686568 podStartE2EDuration="2.153686568s" podCreationTimestamp="2026-02-20 10:07:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:07:26.148773834 +0000 UTC m=+737.731245700" watchObservedRunningTime="2026-02-20 10:07:26.153686568 +0000 UTC m=+737.736158434" Feb 20 10:07:28 crc kubenswrapper[4962]: I0220 10:07:28.162349 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-frtsf" event={"ID":"5056ae4f-c2f7-41f5-8e12-b7b5d8996852","Type":"ContainerStarted","Data":"6d065815732fe6d25416667707bbe455c920b024a686ef500398a7c7231959af"} Feb 20 10:07:28 crc kubenswrapper[4962]: I0220 10:07:28.164411 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-frtsf" Feb 20 10:07:28 crc kubenswrapper[4962]: I0220 10:07:28.166414 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-6x8wh" event={"ID":"edcc687e-09ef-4048-8db7-d67e6fe23212","Type":"ContainerStarted","Data":"5a2415cc832d88aebc7f468292ad0edc892c0d406540efcd8044035e762e0d50"} Feb 20 10:07:28 crc kubenswrapper[4962]: I0220 10:07:28.169451 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-l2lqb" event={"ID":"a453e12b-e95c-4c04-b67b-b5bc6527a3ab","Type":"ContainerStarted","Data":"f15f78fd82d3623376ada9062cfec324b7a9ff4daeacc55c90514a085bb07a45"} Feb 20 10:07:28 crc kubenswrapper[4962]: I0220 10:07:28.171289 4962 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-l2lqb" Feb 20 10:07:28 crc kubenswrapper[4962]: I0220 10:07:28.181576 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-frtsf" podStartSLOduration=1.8896062150000001 podStartE2EDuration="4.181542706s" podCreationTimestamp="2026-02-20 10:07:24 +0000 UTC" firstStartedPulling="2026-02-20 10:07:24.669395367 +0000 UTC m=+736.251867213" lastFinishedPulling="2026-02-20 10:07:26.961331858 +0000 UTC m=+738.543803704" observedRunningTime="2026-02-20 10:07:28.176436044 +0000 UTC m=+739.758907920" watchObservedRunningTime="2026-02-20 10:07:28.181542706 +0000 UTC m=+739.764014562" Feb 20 10:07:28 crc kubenswrapper[4962]: I0220 10:07:28.201169 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-l2lqb" podStartSLOduration=2.739946267 podStartE2EDuration="4.201137092s" podCreationTimestamp="2026-02-20 10:07:24 +0000 UTC" firstStartedPulling="2026-02-20 10:07:25.501063043 +0000 UTC m=+737.083534889" lastFinishedPulling="2026-02-20 10:07:26.962253858 +0000 UTC m=+738.544725714" observedRunningTime="2026-02-20 10:07:28.199311354 +0000 UTC m=+739.781783220" watchObservedRunningTime="2026-02-20 10:07:28.201137092 +0000 UTC m=+739.783608948" Feb 20 10:07:29 crc kubenswrapper[4962]: I0220 10:07:29.177633 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-2hqd7" event={"ID":"e17e90c9-fe19-4544-9a79-bffc8072a763","Type":"ContainerStarted","Data":"194a6a737cbb07a12489a7fc808522ffde9a3ac0211bd4734fbcb7f5624518e2"} Feb 20 10:07:29 crc kubenswrapper[4962]: I0220 10:07:29.197903 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-2hqd7" podStartSLOduration=2.046729685 podStartE2EDuration="5.19787573s" podCreationTimestamp="2026-02-20 10:07:24 +0000 UTC" firstStartedPulling="2026-02-20 10:07:25.586056016 +0000 UTC m=+737.168527862" lastFinishedPulling="2026-02-20 10:07:28.737202061 +0000 UTC m=+740.319673907" observedRunningTime="2026-02-20 10:07:29.191044494 +0000 UTC m=+740.773516350" watchObservedRunningTime="2026-02-20 10:07:29.19787573 +0000 UTC m=+740.780347576" Feb 20 10:07:31 crc kubenswrapper[4962]: I0220 10:07:31.193555 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-6x8wh" event={"ID":"edcc687e-09ef-4048-8db7-d67e6fe23212","Type":"ContainerStarted","Data":"693b17bbe55b417d4db6934ad6b457e3bf3190e3faf7fd1afcfa29de1bbd2439"} Feb 20 10:07:31 crc kubenswrapper[4962]: I0220 10:07:31.222352 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-58c85c668d-6x8wh" podStartSLOduration=1.9733537559999998 podStartE2EDuration="7.222328729s" podCreationTimestamp="2026-02-20 10:07:24 +0000 UTC" firstStartedPulling="2026-02-20 10:07:24.858529625 +0000 UTC m=+736.441001471" lastFinishedPulling="2026-02-20 10:07:30.107504598 +0000 UTC m=+741.689976444" observedRunningTime="2026-02-20 10:07:31.219246163 +0000 UTC m=+742.801718049" watchObservedRunningTime="2026-02-20 10:07:31.222328729 +0000 UTC m=+742.804800595" Feb 20 10:07:34 crc kubenswrapper[4962]: I0220 10:07:34.670451 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-frtsf" Feb 20 10:07:35 crc kubenswrapper[4962]: I0220 10:07:35.000963 
4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7f9d58689-28fst" Feb 20 10:07:35 crc kubenswrapper[4962]: I0220 10:07:35.001038 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-7f9d58689-28fst" Feb 20 10:07:35 crc kubenswrapper[4962]: I0220 10:07:35.008555 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7f9d58689-28fst" Feb 20 10:07:35 crc kubenswrapper[4962]: I0220 10:07:35.232687 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7f9d58689-28fst" Feb 20 10:07:35 crc kubenswrapper[4962]: I0220 10:07:35.320113 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-nwfk6"] Feb 20 10:07:40 crc kubenswrapper[4962]: I0220 10:07:40.229505 4962 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 20 10:07:41 crc kubenswrapper[4962]: I0220 10:07:41.508006 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 10:07:41 crc kubenswrapper[4962]: I0220 10:07:41.508654 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 10:07:41 crc kubenswrapper[4962]: I0220 10:07:41.508774 4962 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" Feb 20 10:07:41 crc kubenswrapper[4962]: I0220 10:07:41.510347 4962 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f2df44fd178e1ec428f4f1c5bbae3c8b24f98950b6fec19e9719325e0843ea14"} pod="openshift-machine-config-operator/machine-config-daemon-m9d46" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 10:07:41 crc kubenswrapper[4962]: I0220 10:07:41.510552 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" containerID="cri-o://f2df44fd178e1ec428f4f1c5bbae3c8b24f98950b6fec19e9719325e0843ea14" gracePeriod=600 Feb 20 10:07:42 crc kubenswrapper[4962]: I0220 10:07:42.284806 4962 generic.go:334] "Generic (PLEG): container finished" podID="751d5e0b-919c-4777-8475-ed7214f7647f" containerID="f2df44fd178e1ec428f4f1c5bbae3c8b24f98950b6fec19e9719325e0843ea14" exitCode=0 Feb 20 10:07:42 crc kubenswrapper[4962]: I0220 10:07:42.284872 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" event={"ID":"751d5e0b-919c-4777-8475-ed7214f7647f","Type":"ContainerDied","Data":"f2df44fd178e1ec428f4f1c5bbae3c8b24f98950b6fec19e9719325e0843ea14"} Feb 20 10:07:42 crc kubenswrapper[4962]: I0220 10:07:42.285341 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-m9d46" event={"ID":"751d5e0b-919c-4777-8475-ed7214f7647f","Type":"ContainerStarted","Data":"00c783abd2aaed9d0c1eb9c41c798ffe19fb999487c2907db1de61e5a49afcce"} Feb 20 10:07:42 crc kubenswrapper[4962]: I0220 10:07:42.285360 4962 scope.go:117] "RemoveContainer" containerID="1b8acb71d346ae4db2a885f82208122a00bcf52e171aaee5f30a374f13e64838" Feb 20 10:07:45 crc kubenswrapper[4962]: I0220 10:07:45.225226 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-l2lqb" Feb 20 10:07:59 crc kubenswrapper[4962]: I0220 10:07:59.781048 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21389xtp"] Feb 20 10:07:59 crc kubenswrapper[4962]: I0220 10:07:59.784126 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21389xtp" Feb 20 10:07:59 crc kubenswrapper[4962]: I0220 10:07:59.786916 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 20 10:07:59 crc kubenswrapper[4962]: I0220 10:07:59.800981 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21389xtp"] Feb 20 10:07:59 crc kubenswrapper[4962]: I0220 10:07:59.863953 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/15223064-e16f-4407-a15a-2105151aa73f-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21389xtp\" (UID: \"15223064-e16f-4407-a15a-2105151aa73f\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21389xtp" Feb 20 10:07:59 crc kubenswrapper[4962]: I0220 10:07:59.863990 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/15223064-e16f-4407-a15a-2105151aa73f-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21389xtp\" (UID: \"15223064-e16f-4407-a15a-2105151aa73f\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21389xtp" Feb 20 10:07:59 crc kubenswrapper[4962]: I0220 10:07:59.864085 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqvpj\" (UniqueName: \"kubernetes.io/projected/15223064-e16f-4407-a15a-2105151aa73f-kube-api-access-qqvpj\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21389xtp\" (UID: \"15223064-e16f-4407-a15a-2105151aa73f\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21389xtp" Feb 20 10:07:59 crc kubenswrapper[4962]: I0220 10:07:59.965237 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/15223064-e16f-4407-a15a-2105151aa73f-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21389xtp\" (UID: \"15223064-e16f-4407-a15a-2105151aa73f\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21389xtp" Feb 20 10:07:59 crc kubenswrapper[4962]: I0220 10:07:59.965285 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/15223064-e16f-4407-a15a-2105151aa73f-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21389xtp\" (UID: \"15223064-e16f-4407-a15a-2105151aa73f\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21389xtp" Feb 20 10:07:59 crc kubenswrapper[4962]: I0220 10:07:59.965331 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqvpj\" (UniqueName: \"kubernetes.io/projected/15223064-e16f-4407-a15a-2105151aa73f-kube-api-access-qqvpj\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21389xtp\" (UID: \"15223064-e16f-4407-a15a-2105151aa73f\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21389xtp" Feb 20 10:07:59 crc kubenswrapper[4962]: I0220 10:07:59.965941 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/15223064-e16f-4407-a15a-2105151aa73f-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21389xtp\" (UID: \"15223064-e16f-4407-a15a-2105151aa73f\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21389xtp" Feb 20 10:07:59 crc kubenswrapper[4962]: I0220 10:07:59.966068 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/15223064-e16f-4407-a15a-2105151aa73f-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21389xtp\" (UID: \"15223064-e16f-4407-a15a-2105151aa73f\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21389xtp" Feb 20 10:07:59 crc kubenswrapper[4962]: I0220 10:07:59.992836 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqvpj\" (UniqueName: \"kubernetes.io/projected/15223064-e16f-4407-a15a-2105151aa73f-kube-api-access-qqvpj\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21389xtp\" (UID: \"15223064-e16f-4407-a15a-2105151aa73f\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21389xtp" Feb 20 10:08:00 crc kubenswrapper[4962]: I0220 10:08:00.104571 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21389xtp" Feb 20 10:08:00 crc kubenswrapper[4962]: I0220 10:08:00.341347 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21389xtp"] Feb 20 10:08:00 crc kubenswrapper[4962]: W0220 10:08:00.347660 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15223064_e16f_4407_a15a_2105151aa73f.slice/crio-c998d8095c545e5edf199bdb4e4c23930fc190a758f899cc20eef2278b6cc40e WatchSource:0}: Error finding container c998d8095c545e5edf199bdb4e4c23930fc190a758f899cc20eef2278b6cc40e: Status 404 returned error can't find the container with id c998d8095c545e5edf199bdb4e4c23930fc190a758f899cc20eef2278b6cc40e Feb 20 10:08:00 crc kubenswrapper[4962]: I0220 10:08:00.372914 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-nwfk6" podUID="09cfdba9-bfda-455d-b13e-58a6ea5a7d5a" containerName="console" containerID="cri-o://8efe068ae3db37985f3f075f7b5d35cc12007aa816e53b833f3b7fb4a6ba9127" gracePeriod=15 Feb 20 10:08:00 crc kubenswrapper[4962]: I0220 10:08:00.416075 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21389xtp" event={"ID":"15223064-e16f-4407-a15a-2105151aa73f","Type":"ContainerStarted","Data":"c998d8095c545e5edf199bdb4e4c23930fc190a758f899cc20eef2278b6cc40e"} Feb 20 10:08:00 crc kubenswrapper[4962]: I0220 10:08:00.723379 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-nwfk6_09cfdba9-bfda-455d-b13e-58a6ea5a7d5a/console/0.log" Feb 20 10:08:00 crc kubenswrapper[4962]: I0220 10:08:00.723479 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-nwfk6" Feb 20 10:08:00 crc kubenswrapper[4962]: I0220 10:08:00.777238 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/09cfdba9-bfda-455d-b13e-58a6ea5a7d5a-console-serving-cert\") pod \"09cfdba9-bfda-455d-b13e-58a6ea5a7d5a\" (UID: \"09cfdba9-bfda-455d-b13e-58a6ea5a7d5a\") " Feb 20 10:08:00 crc kubenswrapper[4962]: I0220 10:08:00.777335 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/09cfdba9-bfda-455d-b13e-58a6ea5a7d5a-oauth-serving-cert\") pod \"09cfdba9-bfda-455d-b13e-58a6ea5a7d5a\" (UID: \"09cfdba9-bfda-455d-b13e-58a6ea5a7d5a\") " Feb 20 10:08:00 crc kubenswrapper[4962]: I0220 10:08:00.777384 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h5fb2\" (UniqueName: \"kubernetes.io/projected/09cfdba9-bfda-455d-b13e-58a6ea5a7d5a-kube-api-access-h5fb2\") pod \"09cfdba9-bfda-455d-b13e-58a6ea5a7d5a\" (UID: \"09cfdba9-bfda-455d-b13e-58a6ea5a7d5a\") " Feb 20 10:08:00 crc kubenswrapper[4962]: I0220 10:08:00.777461 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09cfdba9-bfda-455d-b13e-58a6ea5a7d5a-trusted-ca-bundle\") pod \"09cfdba9-bfda-455d-b13e-58a6ea5a7d5a\" (UID: \"09cfdba9-bfda-455d-b13e-58a6ea5a7d5a\") " Feb 20 10:08:00 crc kubenswrapper[4962]: I0220 10:08:00.777526 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/09cfdba9-bfda-455d-b13e-58a6ea5a7d5a-console-config\") pod \"09cfdba9-bfda-455d-b13e-58a6ea5a7d5a\" (UID: \"09cfdba9-bfda-455d-b13e-58a6ea5a7d5a\") " Feb 20 10:08:00 crc kubenswrapper[4962]: I0220 10:08:00.777666 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/09cfdba9-bfda-455d-b13e-58a6ea5a7d5a-console-oauth-config\") pod \"09cfdba9-bfda-455d-b13e-58a6ea5a7d5a\" (UID: \"09cfdba9-bfda-455d-b13e-58a6ea5a7d5a\") " Feb 20 10:08:00 crc kubenswrapper[4962]: I0220 10:08:00.777829 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/09cfdba9-bfda-455d-b13e-58a6ea5a7d5a-service-ca\") pod \"09cfdba9-bfda-455d-b13e-58a6ea5a7d5a\" (UID: \"09cfdba9-bfda-455d-b13e-58a6ea5a7d5a\") " Feb 20 10:08:00 crc kubenswrapper[4962]: I0220 10:08:00.778406 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09cfdba9-bfda-455d-b13e-58a6ea5a7d5a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09cfdba9-bfda-455d-b13e-58a6ea5a7d5a" (UID: "09cfdba9-bfda-455d-b13e-58a6ea5a7d5a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:08:00 crc kubenswrapper[4962]: I0220 10:08:00.778451 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09cfdba9-bfda-455d-b13e-58a6ea5a7d5a-service-ca" (OuterVolumeSpecName: "service-ca") pod "09cfdba9-bfda-455d-b13e-58a6ea5a7d5a" (UID: "09cfdba9-bfda-455d-b13e-58a6ea5a7d5a"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:08:00 crc kubenswrapper[4962]: I0220 10:08:00.778499 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09cfdba9-bfda-455d-b13e-58a6ea5a7d5a-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "09cfdba9-bfda-455d-b13e-58a6ea5a7d5a" (UID: "09cfdba9-bfda-455d-b13e-58a6ea5a7d5a"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:08:00 crc kubenswrapper[4962]: I0220 10:08:00.778557 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09cfdba9-bfda-455d-b13e-58a6ea5a7d5a-console-config" (OuterVolumeSpecName: "console-config") pod "09cfdba9-bfda-455d-b13e-58a6ea5a7d5a" (UID: "09cfdba9-bfda-455d-b13e-58a6ea5a7d5a"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:08:00 crc kubenswrapper[4962]: I0220 10:08:00.782838 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09cfdba9-bfda-455d-b13e-58a6ea5a7d5a-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "09cfdba9-bfda-455d-b13e-58a6ea5a7d5a" (UID: "09cfdba9-bfda-455d-b13e-58a6ea5a7d5a"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:08:00 crc kubenswrapper[4962]: I0220 10:08:00.782901 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09cfdba9-bfda-455d-b13e-58a6ea5a7d5a-kube-api-access-h5fb2" (OuterVolumeSpecName: "kube-api-access-h5fb2") pod "09cfdba9-bfda-455d-b13e-58a6ea5a7d5a" (UID: "09cfdba9-bfda-455d-b13e-58a6ea5a7d5a"). InnerVolumeSpecName "kube-api-access-h5fb2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:08:00 crc kubenswrapper[4962]: I0220 10:08:00.783080 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09cfdba9-bfda-455d-b13e-58a6ea5a7d5a-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "09cfdba9-bfda-455d-b13e-58a6ea5a7d5a" (UID: "09cfdba9-bfda-455d-b13e-58a6ea5a7d5a"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:08:00 crc kubenswrapper[4962]: I0220 10:08:00.879831 4962 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/09cfdba9-bfda-455d-b13e-58a6ea5a7d5a-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 20 10:08:00 crc kubenswrapper[4962]: I0220 10:08:00.879870 4962 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/09cfdba9-bfda-455d-b13e-58a6ea5a7d5a-service-ca\") on node \"crc\" DevicePath \"\"" Feb 20 10:08:00 crc kubenswrapper[4962]: I0220 10:08:00.879884 4962 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/09cfdba9-bfda-455d-b13e-58a6ea5a7d5a-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 10:08:00 crc kubenswrapper[4962]: I0220 10:08:00.879896 4962 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/09cfdba9-bfda-455d-b13e-58a6ea5a7d5a-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 10:08:00 crc kubenswrapper[4962]: I0220 10:08:00.879910 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h5fb2\" (UniqueName: \"kubernetes.io/projected/09cfdba9-bfda-455d-b13e-58a6ea5a7d5a-kube-api-access-h5fb2\") on node \"crc\" DevicePath \"\"" Feb 20 10:08:00 crc kubenswrapper[4962]: I0220 10:08:00.879923 4962 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09cfdba9-bfda-455d-b13e-58a6ea5a7d5a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:08:00 crc kubenswrapper[4962]: I0220 10:08:00.879932 4962 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/09cfdba9-bfda-455d-b13e-58a6ea5a7d5a-console-config\") on node \"crc\" DevicePath \"\"" Feb 20 10:08:01 crc kubenswrapper[4962]: I0220 10:08:01.107928 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gq5bb"] Feb 20 10:08:01 crc kubenswrapper[4962]: E0220 10:08:01.108351 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09cfdba9-bfda-455d-b13e-58a6ea5a7d5a" containerName="console" Feb 20 10:08:01 crc kubenswrapper[4962]: I0220 10:08:01.108385 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="09cfdba9-bfda-455d-b13e-58a6ea5a7d5a" containerName="console" Feb 20 10:08:01 crc kubenswrapper[4962]: I0220 10:08:01.108585 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="09cfdba9-bfda-455d-b13e-58a6ea5a7d5a" containerName="console" Feb 20 10:08:01 crc kubenswrapper[4962]: I0220 10:08:01.110138 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gq5bb" Feb 20 10:08:01 crc kubenswrapper[4962]: I0220 10:08:01.117650 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gq5bb"] Feb 20 10:08:01 crc kubenswrapper[4962]: I0220 10:08:01.183813 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8ttl\" (UniqueName: \"kubernetes.io/projected/2998604a-1adc-4333-9c8a-a4128085b7ce-kube-api-access-r8ttl\") pod \"redhat-operators-gq5bb\" (UID: \"2998604a-1adc-4333-9c8a-a4128085b7ce\") " pod="openshift-marketplace/redhat-operators-gq5bb" Feb 20 10:08:01 crc kubenswrapper[4962]: I0220 10:08:01.184243 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2998604a-1adc-4333-9c8a-a4128085b7ce-utilities\") pod \"redhat-operators-gq5bb\" (UID: \"2998604a-1adc-4333-9c8a-a4128085b7ce\") " pod="openshift-marketplace/redhat-operators-gq5bb" Feb 20 10:08:01 crc kubenswrapper[4962]: I0220 10:08:01.184302 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2998604a-1adc-4333-9c8a-a4128085b7ce-catalog-content\") pod \"redhat-operators-gq5bb\" (UID: \"2998604a-1adc-4333-9c8a-a4128085b7ce\") " pod="openshift-marketplace/redhat-operators-gq5bb" Feb 20 10:08:01 crc kubenswrapper[4962]: I0220 10:08:01.285414 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8ttl\" (UniqueName: \"kubernetes.io/projected/2998604a-1adc-4333-9c8a-a4128085b7ce-kube-api-access-r8ttl\") pod \"redhat-operators-gq5bb\" (UID: \"2998604a-1adc-4333-9c8a-a4128085b7ce\") " pod="openshift-marketplace/redhat-operators-gq5bb" Feb 20 10:08:01 crc kubenswrapper[4962]: I0220 10:08:01.285505 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2998604a-1adc-4333-9c8a-a4128085b7ce-utilities\") pod \"redhat-operators-gq5bb\" (UID: \"2998604a-1adc-4333-9c8a-a4128085b7ce\") " pod="openshift-marketplace/redhat-operators-gq5bb" Feb 20 10:08:01 crc kubenswrapper[4962]: I0220 10:08:01.285566 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2998604a-1adc-4333-9c8a-a4128085b7ce-catalog-content\") pod \"redhat-operators-gq5bb\" (UID: \"2998604a-1adc-4333-9c8a-a4128085b7ce\") " pod="openshift-marketplace/redhat-operators-gq5bb" Feb 20 10:08:01 crc kubenswrapper[4962]: I0220 10:08:01.286348 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2998604a-1adc-4333-9c8a-a4128085b7ce-utilities\") pod \"redhat-operators-gq5bb\" (UID: \"2998604a-1adc-4333-9c8a-a4128085b7ce\") " pod="openshift-marketplace/redhat-operators-gq5bb" Feb 20 10:08:01 crc kubenswrapper[4962]: I0220 10:08:01.286442 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2998604a-1adc-4333-9c8a-a4128085b7ce-catalog-content\") pod \"redhat-operators-gq5bb\" (UID: \"2998604a-1adc-4333-9c8a-a4128085b7ce\") " pod="openshift-marketplace/redhat-operators-gq5bb" Feb 20 10:08:01 crc kubenswrapper[4962]: I0220 10:08:01.312158 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-r8ttl\" (UniqueName: \"kubernetes.io/projected/2998604a-1adc-4333-9c8a-a4128085b7ce-kube-api-access-r8ttl\") pod \"redhat-operators-gq5bb\" (UID: \"2998604a-1adc-4333-9c8a-a4128085b7ce\") " pod="openshift-marketplace/redhat-operators-gq5bb" Feb 20 10:08:01 crc kubenswrapper[4962]: I0220 10:08:01.423457 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-nwfk6_09cfdba9-bfda-455d-b13e-58a6ea5a7d5a/console/0.log" Feb 20 10:08:01 crc kubenswrapper[4962]: I0220 10:08:01.423504 4962 generic.go:334] "Generic (PLEG): container finished" podID="09cfdba9-bfda-455d-b13e-58a6ea5a7d5a" containerID="8efe068ae3db37985f3f075f7b5d35cc12007aa816e53b833f3b7fb4a6ba9127" exitCode=2 Feb 20 10:08:01 crc kubenswrapper[4962]: I0220 10:08:01.423558 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-nwfk6" event={"ID":"09cfdba9-bfda-455d-b13e-58a6ea5a7d5a","Type":"ContainerDied","Data":"8efe068ae3db37985f3f075f7b5d35cc12007aa816e53b833f3b7fb4a6ba9127"} Feb 20 10:08:01 crc kubenswrapper[4962]: I0220 10:08:01.423624 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-nwfk6" event={"ID":"09cfdba9-bfda-455d-b13e-58a6ea5a7d5a","Type":"ContainerDied","Data":"7a175f5752b9da8dd07abe01e0077ca08911cfa3fb3fa2f627ad42bdc14904eb"} Feb 20 10:08:01 crc kubenswrapper[4962]: I0220 10:08:01.423645 4962 scope.go:117] "RemoveContainer" containerID="8efe068ae3db37985f3f075f7b5d35cc12007aa816e53b833f3b7fb4a6ba9127" Feb 20 10:08:01 crc kubenswrapper[4962]: I0220 10:08:01.423771 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-nwfk6" Feb 20 10:08:01 crc kubenswrapper[4962]: I0220 10:08:01.427225 4962 generic.go:334] "Generic (PLEG): container finished" podID="15223064-e16f-4407-a15a-2105151aa73f" containerID="52e884a20e90cce670d676b67988c352a46ad24b3200f1d7087c910ee7e23935" exitCode=0 Feb 20 10:08:01 crc kubenswrapper[4962]: I0220 10:08:01.427278 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21389xtp" event={"ID":"15223064-e16f-4407-a15a-2105151aa73f","Type":"ContainerDied","Data":"52e884a20e90cce670d676b67988c352a46ad24b3200f1d7087c910ee7e23935"} Feb 20 10:08:01 crc kubenswrapper[4962]: I0220 10:08:01.452184 4962 scope.go:117] "RemoveContainer" containerID="8efe068ae3db37985f3f075f7b5d35cc12007aa816e53b833f3b7fb4a6ba9127" Feb 20 10:08:01 crc kubenswrapper[4962]: E0220 10:08:01.452675 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8efe068ae3db37985f3f075f7b5d35cc12007aa816e53b833f3b7fb4a6ba9127\": container with ID starting with 8efe068ae3db37985f3f075f7b5d35cc12007aa816e53b833f3b7fb4a6ba9127 not found: ID does not exist" containerID="8efe068ae3db37985f3f075f7b5d35cc12007aa816e53b833f3b7fb4a6ba9127" Feb 20 10:08:01 crc kubenswrapper[4962]: I0220 10:08:01.452735 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8efe068ae3db37985f3f075f7b5d35cc12007aa816e53b833f3b7fb4a6ba9127"} err="failed to get container status \"8efe068ae3db37985f3f075f7b5d35cc12007aa816e53b833f3b7fb4a6ba9127\": rpc error: code = NotFound desc = could not find container \"8efe068ae3db37985f3f075f7b5d35cc12007aa816e53b833f3b7fb4a6ba9127\": container with ID starting with 
8efe068ae3db37985f3f075f7b5d35cc12007aa816e53b833f3b7fb4a6ba9127 not found: ID does not exist" Feb 20 10:08:01 crc kubenswrapper[4962]: I0220 10:08:01.470473 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-nwfk6"] Feb 20 10:08:01 crc kubenswrapper[4962]: I0220 10:08:01.478383 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-nwfk6"] Feb 20 10:08:01 crc kubenswrapper[4962]: I0220 10:08:01.484093 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gq5bb" Feb 20 10:08:01 crc kubenswrapper[4962]: I0220 10:08:01.584344 4962 patch_prober.go:28] interesting pod/console-f9d7485db-nwfk6 container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.217.0.13:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 20 10:08:01 crc kubenswrapper[4962]: I0220 10:08:01.584433 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/console-f9d7485db-nwfk6" podUID="09cfdba9-bfda-455d-b13e-58a6ea5a7d5a" containerName="console" probeResult="failure" output="Get \"https://10.217.0.13:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 20 10:08:01 crc kubenswrapper[4962]: I0220 10:08:01.698902 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gq5bb"] Feb 20 10:08:01 crc kubenswrapper[4962]: W0220 10:08:01.706908 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2998604a_1adc_4333_9c8a_a4128085b7ce.slice/crio-2ceeda16e7e563747039db6289ac67533cb60df02419917452a4675b9ac9c448 WatchSource:0}: Error finding container 2ceeda16e7e563747039db6289ac67533cb60df02419917452a4675b9ac9c448: Status 404 returned error can't find the container with id 2ceeda16e7e563747039db6289ac67533cb60df02419917452a4675b9ac9c448 Feb 20 10:08:02 crc kubenswrapper[4962]: I0220 10:08:02.433831 4962 generic.go:334] "Generic (PLEG): container finished" podID="2998604a-1adc-4333-9c8a-a4128085b7ce" containerID="c5bb43db8b6114f689975391455ee8bcc4084bd7ca471b2d2e1e7aa29cfbc301" exitCode=0 Feb 20 10:08:02 crc kubenswrapper[4962]: I0220 10:08:02.433908 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gq5bb" event={"ID":"2998604a-1adc-4333-9c8a-a4128085b7ce","Type":"ContainerDied","Data":"c5bb43db8b6114f689975391455ee8bcc4084bd7ca471b2d2e1e7aa29cfbc301"} Feb 20 10:08:02 crc kubenswrapper[4962]: I0220 10:08:02.434305 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gq5bb" event={"ID":"2998604a-1adc-4333-9c8a-a4128085b7ce","Type":"ContainerStarted","Data":"2ceeda16e7e563747039db6289ac67533cb60df02419917452a4675b9ac9c448"} Feb 20 10:08:03 crc kubenswrapper[4962]: I0220 10:08:03.151091 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09cfdba9-bfda-455d-b13e-58a6ea5a7d5a" path="/var/lib/kubelet/pods/09cfdba9-bfda-455d-b13e-58a6ea5a7d5a/volumes" Feb 20 10:08:03 crc kubenswrapper[4962]: I0220 10:08:03.443847 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gq5bb" 
event={"ID":"2998604a-1adc-4333-9c8a-a4128085b7ce","Type":"ContainerStarted","Data":"daa1d98a829416c40ab5bfdf8d68b1d7940426bd0365fcc1227a7a69bb565cef"} Feb 20 10:08:03 crc kubenswrapper[4962]: I0220 10:08:03.447045 4962 generic.go:334] "Generic (PLEG): container finished" podID="15223064-e16f-4407-a15a-2105151aa73f" containerID="153bb86b7a4f78ff9046ba5d361d4e56f087ef2bc141daa0bea58590d78beda6" exitCode=0 Feb 20 10:08:03 crc kubenswrapper[4962]: I0220 10:08:03.447090 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21389xtp" event={"ID":"15223064-e16f-4407-a15a-2105151aa73f","Type":"ContainerDied","Data":"153bb86b7a4f78ff9046ba5d361d4e56f087ef2bc141daa0bea58590d78beda6"} Feb 20 10:08:04 crc kubenswrapper[4962]: I0220 10:08:04.459679 4962 generic.go:334] "Generic (PLEG): container finished" podID="15223064-e16f-4407-a15a-2105151aa73f" containerID="a0978366b2c5ace9202c0106b4e1591f07091e0ebd4532dff4e4a9227ba670b8" exitCode=0 Feb 20 10:08:04 crc kubenswrapper[4962]: I0220 10:08:04.459786 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21389xtp" event={"ID":"15223064-e16f-4407-a15a-2105151aa73f","Type":"ContainerDied","Data":"a0978366b2c5ace9202c0106b4e1591f07091e0ebd4532dff4e4a9227ba670b8"} Feb 20 10:08:04 crc kubenswrapper[4962]: I0220 10:08:04.463566 4962 generic.go:334] "Generic (PLEG): container finished" podID="2998604a-1adc-4333-9c8a-a4128085b7ce" containerID="daa1d98a829416c40ab5bfdf8d68b1d7940426bd0365fcc1227a7a69bb565cef" exitCode=0 Feb 20 10:08:04 crc kubenswrapper[4962]: I0220 10:08:04.463646 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gq5bb" event={"ID":"2998604a-1adc-4333-9c8a-a4128085b7ce","Type":"ContainerDied","Data":"daa1d98a829416c40ab5bfdf8d68b1d7940426bd0365fcc1227a7a69bb565cef"} Feb 20 10:08:05 crc kubenswrapper[4962]: I0220 10:08:05.475857 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gq5bb" event={"ID":"2998604a-1adc-4333-9c8a-a4128085b7ce","Type":"ContainerStarted","Data":"8cde6cce15995168cb930061906719930b3385da0bad5dadc4d1c784769729d8"} Feb 20 10:08:05 crc kubenswrapper[4962]: I0220 10:08:05.513012 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gq5bb" podStartSLOduration=2.107882801 podStartE2EDuration="4.512950522s" podCreationTimestamp="2026-02-20 10:08:01 +0000 UTC" firstStartedPulling="2026-02-20 10:08:02.454288705 +0000 UTC m=+774.036760551" lastFinishedPulling="2026-02-20 10:08:04.859356426 +0000 UTC m=+776.441828272" observedRunningTime="2026-02-20 10:08:05.508968177 +0000 UTC m=+777.091440073" watchObservedRunningTime="2026-02-20 10:08:05.512950522 +0000 UTC m=+777.095422398" Feb 20 10:08:05 crc kubenswrapper[4962]: I0220 10:08:05.740776 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21389xtp" Feb 20 10:08:05 crc kubenswrapper[4962]: I0220 10:08:05.764496 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qqvpj\" (UniqueName: \"kubernetes.io/projected/15223064-e16f-4407-a15a-2105151aa73f-kube-api-access-qqvpj\") pod \"15223064-e16f-4407-a15a-2105151aa73f\" (UID: \"15223064-e16f-4407-a15a-2105151aa73f\") " Feb 20 10:08:05 crc kubenswrapper[4962]: I0220 10:08:05.764676 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/15223064-e16f-4407-a15a-2105151aa73f-util\") pod \"15223064-e16f-4407-a15a-2105151aa73f\" (UID: \"15223064-e16f-4407-a15a-2105151aa73f\") " Feb 20 10:08:05 crc kubenswrapper[4962]: I0220 10:08:05.764710 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/15223064-e16f-4407-a15a-2105151aa73f-bundle\") pod \"15223064-e16f-4407-a15a-2105151aa73f\" (UID: \"15223064-e16f-4407-a15a-2105151aa73f\") " Feb 20 10:08:05 crc kubenswrapper[4962]: I0220 10:08:05.765874 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15223064-e16f-4407-a15a-2105151aa73f-bundle" (OuterVolumeSpecName: "bundle") pod "15223064-e16f-4407-a15a-2105151aa73f" (UID: "15223064-e16f-4407-a15a-2105151aa73f"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:08:05 crc kubenswrapper[4962]: I0220 10:08:05.766533 4962 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/15223064-e16f-4407-a15a-2105151aa73f-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:08:05 crc kubenswrapper[4962]: I0220 10:08:05.775954 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15223064-e16f-4407-a15a-2105151aa73f-kube-api-access-qqvpj" (OuterVolumeSpecName: "kube-api-access-qqvpj") pod "15223064-e16f-4407-a15a-2105151aa73f" (UID: "15223064-e16f-4407-a15a-2105151aa73f"). InnerVolumeSpecName "kube-api-access-qqvpj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:08:05 crc kubenswrapper[4962]: I0220 10:08:05.778683 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15223064-e16f-4407-a15a-2105151aa73f-util" (OuterVolumeSpecName: "util") pod "15223064-e16f-4407-a15a-2105151aa73f" (UID: "15223064-e16f-4407-a15a-2105151aa73f"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:08:05 crc kubenswrapper[4962]: I0220 10:08:05.868492 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qqvpj\" (UniqueName: \"kubernetes.io/projected/15223064-e16f-4407-a15a-2105151aa73f-kube-api-access-qqvpj\") on node \"crc\" DevicePath \"\"" Feb 20 10:08:05 crc kubenswrapper[4962]: I0220 10:08:05.868543 4962 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/15223064-e16f-4407-a15a-2105151aa73f-util\") on node \"crc\" DevicePath \"\"" Feb 20 10:08:06 crc kubenswrapper[4962]: I0220 10:08:06.487750 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21389xtp" event={"ID":"15223064-e16f-4407-a15a-2105151aa73f","Type":"ContainerDied","Data":"c998d8095c545e5edf199bdb4e4c23930fc190a758f899cc20eef2278b6cc40e"} Feb 20 10:08:06 crc kubenswrapper[4962]: I0220 10:08:06.487778 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21389xtp" Feb 20 10:08:06 crc kubenswrapper[4962]: I0220 10:08:06.487809 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c998d8095c545e5edf199bdb4e4c23930fc190a758f899cc20eef2278b6cc40e" Feb 20 10:08:11 crc kubenswrapper[4962]: I0220 10:08:11.484521 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gq5bb" Feb 20 10:08:11 crc kubenswrapper[4962]: I0220 10:08:11.485699 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gq5bb" Feb 20 10:08:12 crc kubenswrapper[4962]: I0220 10:08:12.545572 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gq5bb" podUID="2998604a-1adc-4333-9c8a-a4128085b7ce" containerName="registry-server" probeResult="failure" output=< Feb 20 10:08:12 crc kubenswrapper[4962]: timeout: failed to connect service ":50051" within 1s Feb 20 10:08:12 crc kubenswrapper[4962]: > Feb 20 10:08:17 crc kubenswrapper[4962]: I0220 10:08:17.720796 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-7964458f8b-6fxbj"] Feb 20 10:08:17 crc kubenswrapper[4962]: E0220 10:08:17.721311 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15223064-e16f-4407-a15a-2105151aa73f" containerName="extract" Feb 20 10:08:17 crc kubenswrapper[4962]: I0220 10:08:17.721326 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="15223064-e16f-4407-a15a-2105151aa73f" containerName="extract" Feb 20 10:08:17 crc kubenswrapper[4962]: E0220 10:08:17.721344 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15223064-e16f-4407-a15a-2105151aa73f" containerName="util" Feb 20 10:08:17 crc kubenswrapper[4962]: I0220 10:08:17.721352 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="15223064-e16f-4407-a15a-2105151aa73f" containerName="util" Feb 20 10:08:17 crc kubenswrapper[4962]: E0220 10:08:17.721362 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15223064-e16f-4407-a15a-2105151aa73f" containerName="pull" Feb 20 10:08:17 crc kubenswrapper[4962]: I0220 10:08:17.721369 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="15223064-e16f-4407-a15a-2105151aa73f" containerName="pull" Feb 20 10:08:17 crc kubenswrapper[4962]: 
I0220 10:08:17.721459 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="15223064-e16f-4407-a15a-2105151aa73f" containerName="extract" Feb 20 10:08:17 crc kubenswrapper[4962]: I0220 10:08:17.721863 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7964458f8b-6fxbj" Feb 20 10:08:17 crc kubenswrapper[4962]: I0220 10:08:17.738739 4962 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Feb 20 10:08:17 crc kubenswrapper[4962]: I0220 10:08:17.739043 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Feb 20 10:08:17 crc kubenswrapper[4962]: I0220 10:08:17.739205 4962 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-cswgf" Feb 20 10:08:17 crc kubenswrapper[4962]: I0220 10:08:17.747119 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Feb 20 10:08:17 crc kubenswrapper[4962]: I0220 10:08:17.747740 4962 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Feb 20 10:08:17 crc kubenswrapper[4962]: I0220 10:08:17.754748 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/403ba47d-bbe1-48f6-9382-47f12bbb75ae-webhook-cert\") pod \"metallb-operator-controller-manager-7964458f8b-6fxbj\" (UID: \"403ba47d-bbe1-48f6-9382-47f12bbb75ae\") " pod="metallb-system/metallb-operator-controller-manager-7964458f8b-6fxbj" Feb 20 10:08:17 crc kubenswrapper[4962]: I0220 10:08:17.754873 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ph9l7\" (UniqueName: \"kubernetes.io/projected/403ba47d-bbe1-48f6-9382-47f12bbb75ae-kube-api-access-ph9l7\") pod \"metallb-operator-controller-manager-7964458f8b-6fxbj\" (UID: \"403ba47d-bbe1-48f6-9382-47f12bbb75ae\") " pod="metallb-system/metallb-operator-controller-manager-7964458f8b-6fxbj" Feb 20 10:08:17 crc kubenswrapper[4962]: I0220 10:08:17.755023 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/403ba47d-bbe1-48f6-9382-47f12bbb75ae-apiservice-cert\") pod \"metallb-operator-controller-manager-7964458f8b-6fxbj\" (UID: \"403ba47d-bbe1-48f6-9382-47f12bbb75ae\") " pod="metallb-system/metallb-operator-controller-manager-7964458f8b-6fxbj" Feb 20 10:08:17 crc kubenswrapper[4962]: I0220 10:08:17.763913 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7964458f8b-6fxbj"] Feb 20 10:08:17 crc kubenswrapper[4962]: I0220 10:08:17.856498 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/403ba47d-bbe1-48f6-9382-47f12bbb75ae-apiservice-cert\") pod \"metallb-operator-controller-manager-7964458f8b-6fxbj\" (UID: \"403ba47d-bbe1-48f6-9382-47f12bbb75ae\") " pod="metallb-system/metallb-operator-controller-manager-7964458f8b-6fxbj" Feb 20 10:08:17 crc kubenswrapper[4962]: I0220 10:08:17.856617 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/403ba47d-bbe1-48f6-9382-47f12bbb75ae-webhook-cert\") pod 
\"metallb-operator-controller-manager-7964458f8b-6fxbj\" (UID: \"403ba47d-bbe1-48f6-9382-47f12bbb75ae\") " pod="metallb-system/metallb-operator-controller-manager-7964458f8b-6fxbj" Feb 20 10:08:17 crc kubenswrapper[4962]: I0220 10:08:17.856641 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ph9l7\" (UniqueName: \"kubernetes.io/projected/403ba47d-bbe1-48f6-9382-47f12bbb75ae-kube-api-access-ph9l7\") pod \"metallb-operator-controller-manager-7964458f8b-6fxbj\" (UID: \"403ba47d-bbe1-48f6-9382-47f12bbb75ae\") " pod="metallb-system/metallb-operator-controller-manager-7964458f8b-6fxbj" Feb 20 10:08:17 crc kubenswrapper[4962]: I0220 10:08:17.869565 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/403ba47d-bbe1-48f6-9382-47f12bbb75ae-apiservice-cert\") pod \"metallb-operator-controller-manager-7964458f8b-6fxbj\" (UID: \"403ba47d-bbe1-48f6-9382-47f12bbb75ae\") " pod="metallb-system/metallb-operator-controller-manager-7964458f8b-6fxbj" Feb 20 10:08:17 crc kubenswrapper[4962]: I0220 10:08:17.869566 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/403ba47d-bbe1-48f6-9382-47f12bbb75ae-webhook-cert\") pod \"metallb-operator-controller-manager-7964458f8b-6fxbj\" (UID: \"403ba47d-bbe1-48f6-9382-47f12bbb75ae\") " pod="metallb-system/metallb-operator-controller-manager-7964458f8b-6fxbj" Feb 20 10:08:17 crc kubenswrapper[4962]: I0220 10:08:17.874456 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ph9l7\" (UniqueName: \"kubernetes.io/projected/403ba47d-bbe1-48f6-9382-47f12bbb75ae-kube-api-access-ph9l7\") pod \"metallb-operator-controller-manager-7964458f8b-6fxbj\" (UID: \"403ba47d-bbe1-48f6-9382-47f12bbb75ae\") " pod="metallb-system/metallb-operator-controller-manager-7964458f8b-6fxbj" Feb 20 10:08:18 crc kubenswrapper[4962]: I0220 10:08:18.054997 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7964458f8b-6fxbj" Feb 20 10:08:18 crc kubenswrapper[4962]: I0220 10:08:18.061064 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-79fb478cb4-wmzpd"] Feb 20 10:08:18 crc kubenswrapper[4962]: I0220 10:08:18.061961 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-79fb478cb4-wmzpd" Feb 20 10:08:18 crc kubenswrapper[4962]: I0220 10:08:18.064181 4962 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Feb 20 10:08:18 crc kubenswrapper[4962]: I0220 10:08:18.064705 4962 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Feb 20 10:08:18 crc kubenswrapper[4962]: I0220 10:08:18.064752 4962 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-tjmf9" Feb 20 10:08:18 crc kubenswrapper[4962]: I0220 10:08:18.133222 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-79fb478cb4-wmzpd"] Feb 20 10:08:18 crc kubenswrapper[4962]: I0220 10:08:18.160548 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2ae49f4e-271b-40e8-9cfc-9857fc2de6f3-webhook-cert\") pod \"metallb-operator-webhook-server-79fb478cb4-wmzpd\" (UID: \"2ae49f4e-271b-40e8-9cfc-9857fc2de6f3\") " pod="metallb-system/metallb-operator-webhook-server-79fb478cb4-wmzpd" Feb 20 10:08:18 crc kubenswrapper[4962]: I0220 10:08:18.160630 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hr6m5\" (UniqueName: \"kubernetes.io/projected/2ae49f4e-271b-40e8-9cfc-9857fc2de6f3-kube-api-access-hr6m5\") pod \"metallb-operator-webhook-server-79fb478cb4-wmzpd\" (UID: \"2ae49f4e-271b-40e8-9cfc-9857fc2de6f3\") " pod="metallb-system/metallb-operator-webhook-server-79fb478cb4-wmzpd" Feb 20 10:08:18 crc kubenswrapper[4962]: I0220 10:08:18.160666 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2ae49f4e-271b-40e8-9cfc-9857fc2de6f3-apiservice-cert\") pod \"metallb-operator-webhook-server-79fb478cb4-wmzpd\" (UID: \"2ae49f4e-271b-40e8-9cfc-9857fc2de6f3\") " pod="metallb-system/metallb-operator-webhook-server-79fb478cb4-wmzpd" Feb 20 10:08:18 crc kubenswrapper[4962]: I0220 10:08:18.261437 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2ae49f4e-271b-40e8-9cfc-9857fc2de6f3-apiservice-cert\") pod \"metallb-operator-webhook-server-79fb478cb4-wmzpd\" (UID: \"2ae49f4e-271b-40e8-9cfc-9857fc2de6f3\") " pod="metallb-system/metallb-operator-webhook-server-79fb478cb4-wmzpd" Feb 20 10:08:18 crc kubenswrapper[4962]: I0220 10:08:18.262402 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2ae49f4e-271b-40e8-9cfc-9857fc2de6f3-webhook-cert\") pod \"metallb-operator-webhook-server-79fb478cb4-wmzpd\" (UID: \"2ae49f4e-271b-40e8-9cfc-9857fc2de6f3\") " pod="metallb-system/metallb-operator-webhook-server-79fb478cb4-wmzpd" Feb 20 10:08:18 crc kubenswrapper[4962]: I0220 10:08:18.262433 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hr6m5\" (UniqueName: \"kubernetes.io/projected/2ae49f4e-271b-40e8-9cfc-9857fc2de6f3-kube-api-access-hr6m5\") pod \"metallb-operator-webhook-server-79fb478cb4-wmzpd\" (UID: \"2ae49f4e-271b-40e8-9cfc-9857fc2de6f3\") " pod="metallb-system/metallb-operator-webhook-server-79fb478cb4-wmzpd" Feb 20 10:08:18 crc kubenswrapper[4962]: I0220 
10:08:18.267339 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2ae49f4e-271b-40e8-9cfc-9857fc2de6f3-apiservice-cert\") pod \"metallb-operator-webhook-server-79fb478cb4-wmzpd\" (UID: \"2ae49f4e-271b-40e8-9cfc-9857fc2de6f3\") " pod="metallb-system/metallb-operator-webhook-server-79fb478cb4-wmzpd" Feb 20 10:08:18 crc kubenswrapper[4962]: I0220 10:08:18.283734 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hr6m5\" (UniqueName: \"kubernetes.io/projected/2ae49f4e-271b-40e8-9cfc-9857fc2de6f3-kube-api-access-hr6m5\") pod \"metallb-operator-webhook-server-79fb478cb4-wmzpd\" (UID: \"2ae49f4e-271b-40e8-9cfc-9857fc2de6f3\") " pod="metallb-system/metallb-operator-webhook-server-79fb478cb4-wmzpd" Feb 20 10:08:18 crc kubenswrapper[4962]: I0220 10:08:18.284392 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2ae49f4e-271b-40e8-9cfc-9857fc2de6f3-webhook-cert\") pod \"metallb-operator-webhook-server-79fb478cb4-wmzpd\" (UID: \"2ae49f4e-271b-40e8-9cfc-9857fc2de6f3\") " pod="metallb-system/metallb-operator-webhook-server-79fb478cb4-wmzpd" Feb 20 10:08:18 crc kubenswrapper[4962]: I0220 10:08:18.425135 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-79fb478cb4-wmzpd" Feb 20 10:08:18 crc kubenswrapper[4962]: I0220 10:08:18.513581 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7964458f8b-6fxbj"] Feb 20 10:08:18 crc kubenswrapper[4962]: W0220 10:08:18.535741 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod403ba47d_bbe1_48f6_9382_47f12bbb75ae.slice/crio-15c73fa11bd6306362ac9fd9f5f448136c1532c1a28bc23d994a0d3bf4e43f03 WatchSource:0}: Error finding container 15c73fa11bd6306362ac9fd9f5f448136c1532c1a28bc23d994a0d3bf4e43f03: Status 404 returned error can't find the container with id 15c73fa11bd6306362ac9fd9f5f448136c1532c1a28bc23d994a0d3bf4e43f03 Feb 20 10:08:18 crc kubenswrapper[4962]: I0220 10:08:18.579916 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7964458f8b-6fxbj" event={"ID":"403ba47d-bbe1-48f6-9382-47f12bbb75ae","Type":"ContainerStarted","Data":"15c73fa11bd6306362ac9fd9f5f448136c1532c1a28bc23d994a0d3bf4e43f03"} Feb 20 10:08:18 crc kubenswrapper[4962]: I0220 10:08:18.903919 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-79fb478cb4-wmzpd"] Feb 20 10:08:18 crc kubenswrapper[4962]: W0220 10:08:18.914323 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2ae49f4e_271b_40e8_9cfc_9857fc2de6f3.slice/crio-85eee88bbc5acd22fd7795c374de672756961e04fa4302da66697e96ff02cbe1 WatchSource:0}: Error finding container 85eee88bbc5acd22fd7795c374de672756961e04fa4302da66697e96ff02cbe1: Status 404 returned error can't find the container with id 85eee88bbc5acd22fd7795c374de672756961e04fa4302da66697e96ff02cbe1 Feb 20 10:08:19 crc kubenswrapper[4962]: I0220 10:08:19.595229 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-79fb478cb4-wmzpd" 
event={"ID":"2ae49f4e-271b-40e8-9cfc-9857fc2de6f3","Type":"ContainerStarted","Data":"85eee88bbc5acd22fd7795c374de672756961e04fa4302da66697e96ff02cbe1"} Feb 20 10:08:21 crc kubenswrapper[4962]: I0220 10:08:21.527162 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gq5bb" Feb 20 10:08:21 crc kubenswrapper[4962]: I0220 10:08:21.588523 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gq5bb" Feb 20 10:08:21 crc kubenswrapper[4962]: I0220 10:08:21.610064 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7964458f8b-6fxbj" event={"ID":"403ba47d-bbe1-48f6-9382-47f12bbb75ae","Type":"ContainerStarted","Data":"ca63e54f1e75d95b67ff13a5e5f2b314fd34f194df66e73c37fcb9e2815a3ea4"} Feb 20 10:08:21 crc kubenswrapper[4962]: I0220 10:08:21.635034 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-7964458f8b-6fxbj" podStartSLOduration=1.8339391539999998 podStartE2EDuration="4.635008708s" podCreationTimestamp="2026-02-20 10:08:17 +0000 UTC" firstStartedPulling="2026-02-20 10:08:18.541645141 +0000 UTC m=+790.124116977" lastFinishedPulling="2026-02-20 10:08:21.342714685 +0000 UTC m=+792.925186531" observedRunningTime="2026-02-20 10:08:21.630750369 +0000 UTC m=+793.213222225" watchObservedRunningTime="2026-02-20 10:08:21.635008708 +0000 UTC m=+793.217480554" Feb 20 10:08:22 crc kubenswrapper[4962]: I0220 10:08:22.618051 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-7964458f8b-6fxbj" Feb 20 10:08:23 crc kubenswrapper[4962]: I0220 10:08:23.628088 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-79fb478cb4-wmzpd" event={"ID":"2ae49f4e-271b-40e8-9cfc-9857fc2de6f3","Type":"ContainerStarted","Data":"82f3a2ebf6a19c278b24356ba6079d4de9838d6b6b1315a85e395071abeb8d5c"} Feb 20 10:08:23 crc kubenswrapper[4962]: I0220 10:08:23.653042 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-79fb478cb4-wmzpd" podStartSLOduration=1.568976641 podStartE2EDuration="5.653020397s" podCreationTimestamp="2026-02-20 10:08:18 +0000 UTC" firstStartedPulling="2026-02-20 10:08:18.918308782 +0000 UTC m=+790.500780638" lastFinishedPulling="2026-02-20 10:08:23.002352528 +0000 UTC m=+794.584824394" observedRunningTime="2026-02-20 10:08:23.649775119 +0000 UTC m=+795.232246975" watchObservedRunningTime="2026-02-20 10:08:23.653020397 +0000 UTC m=+795.235492253" Feb 20 10:08:23 crc kubenswrapper[4962]: I0220 10:08:23.892504 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gq5bb"] Feb 20 10:08:23 crc kubenswrapper[4962]: I0220 10:08:23.892748 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gq5bb" podUID="2998604a-1adc-4333-9c8a-a4128085b7ce" containerName="registry-server" containerID="cri-o://8cde6cce15995168cb930061906719930b3385da0bad5dadc4d1c784769729d8" gracePeriod=2 Feb 20 10:08:24 crc kubenswrapper[4962]: I0220 10:08:24.338980 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gq5bb" Feb 20 10:08:24 crc kubenswrapper[4962]: I0220 10:08:24.379791 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2998604a-1adc-4333-9c8a-a4128085b7ce-catalog-content\") pod \"2998604a-1adc-4333-9c8a-a4128085b7ce\" (UID: \"2998604a-1adc-4333-9c8a-a4128085b7ce\") " Feb 20 10:08:24 crc kubenswrapper[4962]: I0220 10:08:24.379982 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2998604a-1adc-4333-9c8a-a4128085b7ce-utilities\") pod \"2998604a-1adc-4333-9c8a-a4128085b7ce\" (UID: \"2998604a-1adc-4333-9c8a-a4128085b7ce\") " Feb 20 10:08:24 crc kubenswrapper[4962]: I0220 10:08:24.380040 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8ttl\" (UniqueName: \"kubernetes.io/projected/2998604a-1adc-4333-9c8a-a4128085b7ce-kube-api-access-r8ttl\") pod \"2998604a-1adc-4333-9c8a-a4128085b7ce\" (UID: \"2998604a-1adc-4333-9c8a-a4128085b7ce\") " Feb 20 10:08:24 crc kubenswrapper[4962]: I0220 10:08:24.382136 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2998604a-1adc-4333-9c8a-a4128085b7ce-utilities" (OuterVolumeSpecName: "utilities") pod "2998604a-1adc-4333-9c8a-a4128085b7ce" (UID: "2998604a-1adc-4333-9c8a-a4128085b7ce"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:08:24 crc kubenswrapper[4962]: I0220 10:08:24.386763 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2998604a-1adc-4333-9c8a-a4128085b7ce-kube-api-access-r8ttl" (OuterVolumeSpecName: "kube-api-access-r8ttl") pod "2998604a-1adc-4333-9c8a-a4128085b7ce" (UID: "2998604a-1adc-4333-9c8a-a4128085b7ce"). InnerVolumeSpecName "kube-api-access-r8ttl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:08:24 crc kubenswrapper[4962]: I0220 10:08:24.483984 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r8ttl\" (UniqueName: \"kubernetes.io/projected/2998604a-1adc-4333-9c8a-a4128085b7ce-kube-api-access-r8ttl\") on node \"crc\" DevicePath \"\"" Feb 20 10:08:24 crc kubenswrapper[4962]: I0220 10:08:24.484229 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2998604a-1adc-4333-9c8a-a4128085b7ce-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 10:08:24 crc kubenswrapper[4962]: I0220 10:08:24.525537 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2998604a-1adc-4333-9c8a-a4128085b7ce-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2998604a-1adc-4333-9c8a-a4128085b7ce" (UID: "2998604a-1adc-4333-9c8a-a4128085b7ce"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:08:24 crc kubenswrapper[4962]: I0220 10:08:24.585929 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2998604a-1adc-4333-9c8a-a4128085b7ce-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 10:08:24 crc kubenswrapper[4962]: I0220 10:08:24.637864 4962 generic.go:334] "Generic (PLEG): container finished" podID="2998604a-1adc-4333-9c8a-a4128085b7ce" containerID="8cde6cce15995168cb930061906719930b3385da0bad5dadc4d1c784769729d8" exitCode=0 Feb 20 10:08:24 crc kubenswrapper[4962]: I0220 10:08:24.637933 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gq5bb" Feb 20 10:08:24 crc kubenswrapper[4962]: I0220 10:08:24.637948 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gq5bb" event={"ID":"2998604a-1adc-4333-9c8a-a4128085b7ce","Type":"ContainerDied","Data":"8cde6cce15995168cb930061906719930b3385da0bad5dadc4d1c784769729d8"} Feb 20 10:08:24 crc kubenswrapper[4962]: I0220 10:08:24.637999 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gq5bb" event={"ID":"2998604a-1adc-4333-9c8a-a4128085b7ce","Type":"ContainerDied","Data":"2ceeda16e7e563747039db6289ac67533cb60df02419917452a4675b9ac9c448"} Feb 20 10:08:24 crc kubenswrapper[4962]: I0220 10:08:24.638043 4962 scope.go:117] "RemoveContainer" containerID="8cde6cce15995168cb930061906719930b3385da0bad5dadc4d1c784769729d8" Feb 20 10:08:24 crc kubenswrapper[4962]: I0220 10:08:24.638669 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-79fb478cb4-wmzpd" Feb 20 10:08:24 crc kubenswrapper[4962]: I0220 10:08:24.681799 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gq5bb"] Feb 20 10:08:24 crc kubenswrapper[4962]: I0220 10:08:24.688485 4962 scope.go:117] "RemoveContainer" containerID="daa1d98a829416c40ab5bfdf8d68b1d7940426bd0365fcc1227a7a69bb565cef" Feb 20 10:08:24 crc kubenswrapper[4962]: I0220 10:08:24.700423 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gq5bb"] Feb 20 10:08:24 crc kubenswrapper[4962]: I0220 10:08:24.714715 4962 scope.go:117] "RemoveContainer" containerID="c5bb43db8b6114f689975391455ee8bcc4084bd7ca471b2d2e1e7aa29cfbc301" Feb 20 10:08:24 crc kubenswrapper[4962]: I0220 10:08:24.742400 4962 scope.go:117] "RemoveContainer" containerID="8cde6cce15995168cb930061906719930b3385da0bad5dadc4d1c784769729d8" Feb 20 10:08:24 crc kubenswrapper[4962]: E0220 10:08:24.743053 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8cde6cce15995168cb930061906719930b3385da0bad5dadc4d1c784769729d8\": container with ID starting with 8cde6cce15995168cb930061906719930b3385da0bad5dadc4d1c784769729d8 not found: ID does not exist" containerID="8cde6cce15995168cb930061906719930b3385da0bad5dadc4d1c784769729d8" Feb 20 10:08:24 crc kubenswrapper[4962]: I0220 10:08:24.743100 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cde6cce15995168cb930061906719930b3385da0bad5dadc4d1c784769729d8"} err="failed to get container status \"8cde6cce15995168cb930061906719930b3385da0bad5dadc4d1c784769729d8\": rpc error: code = NotFound desc = could not find container 
\"8cde6cce15995168cb930061906719930b3385da0bad5dadc4d1c784769729d8\": container with ID starting with 8cde6cce15995168cb930061906719930b3385da0bad5dadc4d1c784769729d8 not found: ID does not exist" Feb 20 10:08:24 crc kubenswrapper[4962]: I0220 10:08:24.743135 4962 scope.go:117] "RemoveContainer" containerID="daa1d98a829416c40ab5bfdf8d68b1d7940426bd0365fcc1227a7a69bb565cef" Feb 20 10:08:24 crc kubenswrapper[4962]: E0220 10:08:24.743468 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"daa1d98a829416c40ab5bfdf8d68b1d7940426bd0365fcc1227a7a69bb565cef\": container with ID starting with daa1d98a829416c40ab5bfdf8d68b1d7940426bd0365fcc1227a7a69bb565cef not found: ID does not exist" containerID="daa1d98a829416c40ab5bfdf8d68b1d7940426bd0365fcc1227a7a69bb565cef" Feb 20 10:08:24 crc kubenswrapper[4962]: I0220 10:08:24.743490 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"daa1d98a829416c40ab5bfdf8d68b1d7940426bd0365fcc1227a7a69bb565cef"} err="failed to get container status \"daa1d98a829416c40ab5bfdf8d68b1d7940426bd0365fcc1227a7a69bb565cef\": rpc error: code = NotFound desc = could not find container \"daa1d98a829416c40ab5bfdf8d68b1d7940426bd0365fcc1227a7a69bb565cef\": container with ID starting with daa1d98a829416c40ab5bfdf8d68b1d7940426bd0365fcc1227a7a69bb565cef not found: ID does not exist" Feb 20 10:08:24 crc kubenswrapper[4962]: I0220 10:08:24.743502 4962 scope.go:117] "RemoveContainer" containerID="c5bb43db8b6114f689975391455ee8bcc4084bd7ca471b2d2e1e7aa29cfbc301" Feb 20 10:08:24 crc kubenswrapper[4962]: E0220 10:08:24.743752 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5bb43db8b6114f689975391455ee8bcc4084bd7ca471b2d2e1e7aa29cfbc301\": container with ID starting with c5bb43db8b6114f689975391455ee8bcc4084bd7ca471b2d2e1e7aa29cfbc301 not found: ID does not exist" containerID="c5bb43db8b6114f689975391455ee8bcc4084bd7ca471b2d2e1e7aa29cfbc301" Feb 20 10:08:24 crc kubenswrapper[4962]: I0220 10:08:24.743774 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5bb43db8b6114f689975391455ee8bcc4084bd7ca471b2d2e1e7aa29cfbc301"} err="failed to get container status \"c5bb43db8b6114f689975391455ee8bcc4084bd7ca471b2d2e1e7aa29cfbc301\": rpc error: code = NotFound desc = could not find container \"c5bb43db8b6114f689975391455ee8bcc4084bd7ca471b2d2e1e7aa29cfbc301\": container with ID starting with c5bb43db8b6114f689975391455ee8bcc4084bd7ca471b2d2e1e7aa29cfbc301 not found: ID does not exist" Feb 20 10:08:25 crc kubenswrapper[4962]: I0220 10:08:25.150414 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2998604a-1adc-4333-9c8a-a4128085b7ce" path="/var/lib/kubelet/pods/2998604a-1adc-4333-9c8a-a4128085b7ce/volumes" Feb 20 10:08:38 crc kubenswrapper[4962]: I0220 10:08:38.457712 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-79fb478cb4-wmzpd" Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.059106 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-7964458f8b-6fxbj" Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.827019 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-zf82t"] Feb 20 10:08:58 crc kubenswrapper[4962]: E0220 10:08:58.827782 
4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2998604a-1adc-4333-9c8a-a4128085b7ce" containerName="extract-utilities" Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.827815 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2998604a-1adc-4333-9c8a-a4128085b7ce" containerName="extract-utilities" Feb 20 10:08:58 crc kubenswrapper[4962]: E0220 10:08:58.827837 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2998604a-1adc-4333-9c8a-a4128085b7ce" containerName="extract-content" Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.827848 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2998604a-1adc-4333-9c8a-a4128085b7ce" containerName="extract-content" Feb 20 10:08:58 crc kubenswrapper[4962]: E0220 10:08:58.827865 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2998604a-1adc-4333-9c8a-a4128085b7ce" containerName="registry-server" Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.827876 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2998604a-1adc-4333-9c8a-a4128085b7ce" containerName="registry-server" Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.828066 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="2998604a-1adc-4333-9c8a-a4128085b7ce" containerName="registry-server" Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.831082 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-zf82t" Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.835074 4962 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-mrvb4" Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.835247 4962 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.837200 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.848398 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-hb87m"] Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.849569 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-hb87m" Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.852738 4962 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.864342 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-hb87m"] Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.890032 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3eb8e16a-ffc3-4756-a3ee-96473eecf85d-metrics-certs\") pod \"frr-k8s-zf82t\" (UID: \"3eb8e16a-ffc3-4756-a3ee-96473eecf85d\") " pod="metallb-system/frr-k8s-zf82t" Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.890089 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/3eb8e16a-ffc3-4756-a3ee-96473eecf85d-reloader\") pod \"frr-k8s-zf82t\" (UID: \"3eb8e16a-ffc3-4756-a3ee-96473eecf85d\") " pod="metallb-system/frr-k8s-zf82t" Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.890121 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66zlq\" (UniqueName: \"kubernetes.io/projected/3eb8e16a-ffc3-4756-a3ee-96473eecf85d-kube-api-access-66zlq\") pod \"frr-k8s-zf82t\" (UID: \"3eb8e16a-ffc3-4756-a3ee-96473eecf85d\") " pod="metallb-system/frr-k8s-zf82t" Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.890141 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzt9w\" (UniqueName: \"kubernetes.io/projected/7135845d-f595-42df-9773-7701c9a0b2e2-kube-api-access-xzt9w\") pod \"frr-k8s-webhook-server-78b44bf5bb-hb87m\" (UID: \"7135845d-f595-42df-9773-7701c9a0b2e2\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-hb87m" Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.890216 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7135845d-f595-42df-9773-7701c9a0b2e2-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-hb87m\" (UID: \"7135845d-f595-42df-9773-7701c9a0b2e2\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-hb87m" Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.890234 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/3eb8e16a-ffc3-4756-a3ee-96473eecf85d-frr-startup\") pod \"frr-k8s-zf82t\" (UID: \"3eb8e16a-ffc3-4756-a3ee-96473eecf85d\") " pod="metallb-system/frr-k8s-zf82t" Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.890277 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/3eb8e16a-ffc3-4756-a3ee-96473eecf85d-frr-sockets\") pod \"frr-k8s-zf82t\" (UID: \"3eb8e16a-ffc3-4756-a3ee-96473eecf85d\") " pod="metallb-system/frr-k8s-zf82t" Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.890432 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/3eb8e16a-ffc3-4756-a3ee-96473eecf85d-frr-conf\") pod \"frr-k8s-zf82t\" (UID: \"3eb8e16a-ffc3-4756-a3ee-96473eecf85d\") " 
pod="metallb-system/frr-k8s-zf82t" Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.890490 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/3eb8e16a-ffc3-4756-a3ee-96473eecf85d-metrics\") pod \"frr-k8s-zf82t\" (UID: \"3eb8e16a-ffc3-4756-a3ee-96473eecf85d\") " pod="metallb-system/frr-k8s-zf82t" Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.924233 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-rx2lw"] Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.925391 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-rx2lw" Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.930482 4962 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.930700 4962 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-g9t8n" Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.931004 4962 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.932006 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.942382 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-69bbfbf88f-29wdn"] Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.948646 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-69bbfbf88f-29wdn" Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.954424 4962 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.992733 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/3eb8e16a-ffc3-4756-a3ee-96473eecf85d-frr-startup\") pod \"frr-k8s-zf82t\" (UID: \"3eb8e16a-ffc3-4756-a3ee-96473eecf85d\") " pod="metallb-system/frr-k8s-zf82t" Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.992800 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/c8b5efc7-c8c4-4492-a8a9-31eaecfa8374-metallb-excludel2\") pod \"speaker-rx2lw\" (UID: \"c8b5efc7-c8c4-4492-a8a9-31eaecfa8374\") " pod="metallb-system/speaker-rx2lw" Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.992829 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/3eb8e16a-ffc3-4756-a3ee-96473eecf85d-frr-sockets\") pod \"frr-k8s-zf82t\" (UID: \"3eb8e16a-ffc3-4756-a3ee-96473eecf85d\") " pod="metallb-system/frr-k8s-zf82t" Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.992850 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/3eb8e16a-ffc3-4756-a3ee-96473eecf85d-frr-conf\") pod \"frr-k8s-zf82t\" (UID: \"3eb8e16a-ffc3-4756-a3ee-96473eecf85d\") " pod="metallb-system/frr-k8s-zf82t" Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.992875 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c8b5efc7-c8c4-4492-a8a9-31eaecfa8374-metrics-certs\") pod \"speaker-rx2lw\" (UID: \"c8b5efc7-c8c4-4492-a8a9-31eaecfa8374\") " pod="metallb-system/speaker-rx2lw" Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.992899 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/3eb8e16a-ffc3-4756-a3ee-96473eecf85d-metrics\") pod \"frr-k8s-zf82t\" (UID: \"3eb8e16a-ffc3-4756-a3ee-96473eecf85d\") " pod="metallb-system/frr-k8s-zf82t" Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.992959 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7h7q4\" (UniqueName: \"kubernetes.io/projected/7af7ee52-8865-48ce-85e5-7b62fb0d67d3-kube-api-access-7h7q4\") pod \"controller-69bbfbf88f-29wdn\" (UID: \"7af7ee52-8865-48ce-85e5-7b62fb0d67d3\") " pod="metallb-system/controller-69bbfbf88f-29wdn" Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.992978 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/c8b5efc7-c8c4-4492-a8a9-31eaecfa8374-memberlist\") pod \"speaker-rx2lw\" (UID: \"c8b5efc7-c8c4-4492-a8a9-31eaecfa8374\") " pod="metallb-system/speaker-rx2lw" Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.993000 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7af7ee52-8865-48ce-85e5-7b62fb0d67d3-metrics-certs\") pod \"controller-69bbfbf88f-29wdn\" (UID: \"7af7ee52-8865-48ce-85e5-7b62fb0d67d3\") " pod="metallb-system/controller-69bbfbf88f-29wdn" Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.993029 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7af7ee52-8865-48ce-85e5-7b62fb0d67d3-cert\") pod \"controller-69bbfbf88f-29wdn\" (UID: \"7af7ee52-8865-48ce-85e5-7b62fb0d67d3\") " pod="metallb-system/controller-69bbfbf88f-29wdn" Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.993050 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3eb8e16a-ffc3-4756-a3ee-96473eecf85d-metrics-certs\") pod \"frr-k8s-zf82t\" (UID: \"3eb8e16a-ffc3-4756-a3ee-96473eecf85d\") " pod="metallb-system/frr-k8s-zf82t" Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.993085 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/3eb8e16a-ffc3-4756-a3ee-96473eecf85d-reloader\") pod \"frr-k8s-zf82t\" (UID: \"3eb8e16a-ffc3-4756-a3ee-96473eecf85d\") " pod="metallb-system/frr-k8s-zf82t" Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.993116 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66zlq\" (UniqueName: \"kubernetes.io/projected/3eb8e16a-ffc3-4756-a3ee-96473eecf85d-kube-api-access-66zlq\") pod \"frr-k8s-zf82t\" (UID: \"3eb8e16a-ffc3-4756-a3ee-96473eecf85d\") " pod="metallb-system/frr-k8s-zf82t" Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.993134 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzt9w\" (UniqueName: \"kubernetes.io/projected/7135845d-f595-42df-9773-7701c9a0b2e2-kube-api-access-xzt9w\") pod 
\"frr-k8s-webhook-server-78b44bf5bb-hb87m\" (UID: \"7135845d-f595-42df-9773-7701c9a0b2e2\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-hb87m" Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.993154 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmhx2\" (UniqueName: \"kubernetes.io/projected/c8b5efc7-c8c4-4492-a8a9-31eaecfa8374-kube-api-access-tmhx2\") pod \"speaker-rx2lw\" (UID: \"c8b5efc7-c8c4-4492-a8a9-31eaecfa8374\") " pod="metallb-system/speaker-rx2lw" Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.993174 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7135845d-f595-42df-9773-7701c9a0b2e2-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-hb87m\" (UID: \"7135845d-f595-42df-9773-7701c9a0b2e2\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-hb87m" Feb 20 10:08:58 crc kubenswrapper[4962]: E0220 10:08:58.993289 4962 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Feb 20 10:08:58 crc kubenswrapper[4962]: E0220 10:08:58.993347 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7135845d-f595-42df-9773-7701c9a0b2e2-cert podName:7135845d-f595-42df-9773-7701c9a0b2e2 nodeName:}" failed. No retries permitted until 2026-02-20 10:08:59.493329323 +0000 UTC m=+831.075801169 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7135845d-f595-42df-9773-7701c9a0b2e2-cert") pod "frr-k8s-webhook-server-78b44bf5bb-hb87m" (UID: "7135845d-f595-42df-9773-7701c9a0b2e2") : secret "frr-k8s-webhook-server-cert" not found Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.993708 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/3eb8e16a-ffc3-4756-a3ee-96473eecf85d-frr-sockets\") pod \"frr-k8s-zf82t\" (UID: \"3eb8e16a-ffc3-4756-a3ee-96473eecf85d\") " pod="metallb-system/frr-k8s-zf82t" Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.993901 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/3eb8e16a-ffc3-4756-a3ee-96473eecf85d-reloader\") pod \"frr-k8s-zf82t\" (UID: \"3eb8e16a-ffc3-4756-a3ee-96473eecf85d\") " pod="metallb-system/frr-k8s-zf82t" Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.993960 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-69bbfbf88f-29wdn"] Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.994005 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/3eb8e16a-ffc3-4756-a3ee-96473eecf85d-frr-conf\") pod \"frr-k8s-zf82t\" (UID: \"3eb8e16a-ffc3-4756-a3ee-96473eecf85d\") " pod="metallb-system/frr-k8s-zf82t" Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.994329 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/3eb8e16a-ffc3-4756-a3ee-96473eecf85d-frr-startup\") pod \"frr-k8s-zf82t\" (UID: \"3eb8e16a-ffc3-4756-a3ee-96473eecf85d\") " pod="metallb-system/frr-k8s-zf82t" Feb 20 10:08:58 crc kubenswrapper[4962]: I0220 10:08:58.996337 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: 
\"kubernetes.io/empty-dir/3eb8e16a-ffc3-4756-a3ee-96473eecf85d-metrics\") pod \"frr-k8s-zf82t\" (UID: \"3eb8e16a-ffc3-4756-a3ee-96473eecf85d\") " pod="metallb-system/frr-k8s-zf82t" Feb 20 10:08:59 crc kubenswrapper[4962]: I0220 10:08:59.009090 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3eb8e16a-ffc3-4756-a3ee-96473eecf85d-metrics-certs\") pod \"frr-k8s-zf82t\" (UID: \"3eb8e16a-ffc3-4756-a3ee-96473eecf85d\") " pod="metallb-system/frr-k8s-zf82t" Feb 20 10:08:59 crc kubenswrapper[4962]: I0220 10:08:59.009380 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzt9w\" (UniqueName: \"kubernetes.io/projected/7135845d-f595-42df-9773-7701c9a0b2e2-kube-api-access-xzt9w\") pod \"frr-k8s-webhook-server-78b44bf5bb-hb87m\" (UID: \"7135845d-f595-42df-9773-7701c9a0b2e2\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-hb87m" Feb 20 10:08:59 crc kubenswrapper[4962]: I0220 10:08:59.011363 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66zlq\" (UniqueName: \"kubernetes.io/projected/3eb8e16a-ffc3-4756-a3ee-96473eecf85d-kube-api-access-66zlq\") pod \"frr-k8s-zf82t\" (UID: \"3eb8e16a-ffc3-4756-a3ee-96473eecf85d\") " pod="metallb-system/frr-k8s-zf82t" Feb 20 10:08:59 crc kubenswrapper[4962]: I0220 10:08:59.094442 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmhx2\" (UniqueName: \"kubernetes.io/projected/c8b5efc7-c8c4-4492-a8a9-31eaecfa8374-kube-api-access-tmhx2\") pod \"speaker-rx2lw\" (UID: \"c8b5efc7-c8c4-4492-a8a9-31eaecfa8374\") " pod="metallb-system/speaker-rx2lw" Feb 20 10:08:59 crc kubenswrapper[4962]: I0220 10:08:59.094529 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/c8b5efc7-c8c4-4492-a8a9-31eaecfa8374-metallb-excludel2\") pod \"speaker-rx2lw\" (UID: \"c8b5efc7-c8c4-4492-a8a9-31eaecfa8374\") " pod="metallb-system/speaker-rx2lw" Feb 20 10:08:59 crc kubenswrapper[4962]: I0220 10:08:59.094568 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c8b5efc7-c8c4-4492-a8a9-31eaecfa8374-metrics-certs\") pod \"speaker-rx2lw\" (UID: \"c8b5efc7-c8c4-4492-a8a9-31eaecfa8374\") " pod="metallb-system/speaker-rx2lw" Feb 20 10:08:59 crc kubenswrapper[4962]: I0220 10:08:59.094637 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7h7q4\" (UniqueName: \"kubernetes.io/projected/7af7ee52-8865-48ce-85e5-7b62fb0d67d3-kube-api-access-7h7q4\") pod \"controller-69bbfbf88f-29wdn\" (UID: \"7af7ee52-8865-48ce-85e5-7b62fb0d67d3\") " pod="metallb-system/controller-69bbfbf88f-29wdn" Feb 20 10:08:59 crc kubenswrapper[4962]: I0220 10:08:59.094655 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/c8b5efc7-c8c4-4492-a8a9-31eaecfa8374-memberlist\") pod \"speaker-rx2lw\" (UID: \"c8b5efc7-c8c4-4492-a8a9-31eaecfa8374\") " pod="metallb-system/speaker-rx2lw" Feb 20 10:08:59 crc kubenswrapper[4962]: I0220 10:08:59.094677 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7af7ee52-8865-48ce-85e5-7b62fb0d67d3-metrics-certs\") pod \"controller-69bbfbf88f-29wdn\" (UID: \"7af7ee52-8865-48ce-85e5-7b62fb0d67d3\") " 
pod="metallb-system/controller-69bbfbf88f-29wdn" Feb 20 10:08:59 crc kubenswrapper[4962]: I0220 10:08:59.094701 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7af7ee52-8865-48ce-85e5-7b62fb0d67d3-cert\") pod \"controller-69bbfbf88f-29wdn\" (UID: \"7af7ee52-8865-48ce-85e5-7b62fb0d67d3\") " pod="metallb-system/controller-69bbfbf88f-29wdn" Feb 20 10:08:59 crc kubenswrapper[4962]: E0220 10:08:59.094820 4962 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Feb 20 10:08:59 crc kubenswrapper[4962]: E0220 10:08:59.094922 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7af7ee52-8865-48ce-85e5-7b62fb0d67d3-metrics-certs podName:7af7ee52-8865-48ce-85e5-7b62fb0d67d3 nodeName:}" failed. No retries permitted until 2026-02-20 10:08:59.594892813 +0000 UTC m=+831.177364669 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7af7ee52-8865-48ce-85e5-7b62fb0d67d3-metrics-certs") pod "controller-69bbfbf88f-29wdn" (UID: "7af7ee52-8865-48ce-85e5-7b62fb0d67d3") : secret "controller-certs-secret" not found Feb 20 10:08:59 crc kubenswrapper[4962]: E0220 10:08:59.094962 4962 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 20 10:08:59 crc kubenswrapper[4962]: E0220 10:08:59.095112 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c8b5efc7-c8c4-4492-a8a9-31eaecfa8374-memberlist podName:c8b5efc7-c8c4-4492-a8a9-31eaecfa8374 nodeName:}" failed. No retries permitted until 2026-02-20 10:08:59.595080808 +0000 UTC m=+831.177552764 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/c8b5efc7-c8c4-4492-a8a9-31eaecfa8374-memberlist") pod "speaker-rx2lw" (UID: "c8b5efc7-c8c4-4492-a8a9-31eaecfa8374") : secret "metallb-memberlist" not found Feb 20 10:08:59 crc kubenswrapper[4962]: I0220 10:08:59.095279 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/c8b5efc7-c8c4-4492-a8a9-31eaecfa8374-metallb-excludel2\") pod \"speaker-rx2lw\" (UID: \"c8b5efc7-c8c4-4492-a8a9-31eaecfa8374\") " pod="metallb-system/speaker-rx2lw" Feb 20 10:08:59 crc kubenswrapper[4962]: I0220 10:08:59.101117 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7af7ee52-8865-48ce-85e5-7b62fb0d67d3-cert\") pod \"controller-69bbfbf88f-29wdn\" (UID: \"7af7ee52-8865-48ce-85e5-7b62fb0d67d3\") " pod="metallb-system/controller-69bbfbf88f-29wdn" Feb 20 10:08:59 crc kubenswrapper[4962]: I0220 10:08:59.109981 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c8b5efc7-c8c4-4492-a8a9-31eaecfa8374-metrics-certs\") pod \"speaker-rx2lw\" (UID: \"c8b5efc7-c8c4-4492-a8a9-31eaecfa8374\") " pod="metallb-system/speaker-rx2lw" Feb 20 10:08:59 crc kubenswrapper[4962]: I0220 10:08:59.114174 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmhx2\" (UniqueName: \"kubernetes.io/projected/c8b5efc7-c8c4-4492-a8a9-31eaecfa8374-kube-api-access-tmhx2\") pod \"speaker-rx2lw\" (UID: \"c8b5efc7-c8c4-4492-a8a9-31eaecfa8374\") " pod="metallb-system/speaker-rx2lw" Feb 20 10:08:59 crc kubenswrapper[4962]: I0220 10:08:59.117323 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7h7q4\" (UniqueName: \"kubernetes.io/projected/7af7ee52-8865-48ce-85e5-7b62fb0d67d3-kube-api-access-7h7q4\") pod \"controller-69bbfbf88f-29wdn\" (UID: \"7af7ee52-8865-48ce-85e5-7b62fb0d67d3\") " pod="metallb-system/controller-69bbfbf88f-29wdn" Feb 20 10:08:59 crc kubenswrapper[4962]: I0220 10:08:59.157534 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-zf82t" Feb 20 10:08:59 crc kubenswrapper[4962]: I0220 10:08:59.502935 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7135845d-f595-42df-9773-7701c9a0b2e2-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-hb87m\" (UID: \"7135845d-f595-42df-9773-7701c9a0b2e2\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-hb87m" Feb 20 10:08:59 crc kubenswrapper[4962]: I0220 10:08:59.510997 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7135845d-f595-42df-9773-7701c9a0b2e2-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-hb87m\" (UID: \"7135845d-f595-42df-9773-7701c9a0b2e2\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-hb87m" Feb 20 10:08:59 crc kubenswrapper[4962]: I0220 10:08:59.606806 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/c8b5efc7-c8c4-4492-a8a9-31eaecfa8374-memberlist\") pod \"speaker-rx2lw\" (UID: \"c8b5efc7-c8c4-4492-a8a9-31eaecfa8374\") " pod="metallb-system/speaker-rx2lw" Feb 20 10:08:59 crc kubenswrapper[4962]: I0220 10:08:59.606884 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7af7ee52-8865-48ce-85e5-7b62fb0d67d3-metrics-certs\") pod \"controller-69bbfbf88f-29wdn\" (UID: \"7af7ee52-8865-48ce-85e5-7b62fb0d67d3\") " pod="metallb-system/controller-69bbfbf88f-29wdn" Feb 20 10:08:59 crc kubenswrapper[4962]: E0220 10:08:59.607054 4962 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 20 10:08:59 crc kubenswrapper[4962]: E0220 10:08:59.607164 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c8b5efc7-c8c4-4492-a8a9-31eaecfa8374-memberlist podName:c8b5efc7-c8c4-4492-a8a9-31eaecfa8374 nodeName:}" failed. No retries permitted until 2026-02-20 10:09:00.607136265 +0000 UTC m=+832.189608311 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/c8b5efc7-c8c4-4492-a8a9-31eaecfa8374-memberlist") pod "speaker-rx2lw" (UID: "c8b5efc7-c8c4-4492-a8a9-31eaecfa8374") : secret "metallb-memberlist" not found Feb 20 10:08:59 crc kubenswrapper[4962]: I0220 10:08:59.612310 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7af7ee52-8865-48ce-85e5-7b62fb0d67d3-metrics-certs\") pod \"controller-69bbfbf88f-29wdn\" (UID: \"7af7ee52-8865-48ce-85e5-7b62fb0d67d3\") " pod="metallb-system/controller-69bbfbf88f-29wdn" Feb 20 10:08:59 crc kubenswrapper[4962]: I0220 10:08:59.765535 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-hb87m" Feb 20 10:08:59 crc kubenswrapper[4962]: I0220 10:08:59.857634 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zf82t" event={"ID":"3eb8e16a-ffc3-4756-a3ee-96473eecf85d","Type":"ContainerStarted","Data":"88001c889bfe063aa4b9580aa7618db487b62b3d5d5c8321559e84b61c43f59a"} Feb 20 10:08:59 crc kubenswrapper[4962]: I0220 10:08:59.864296 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-69bbfbf88f-29wdn" Feb 20 10:09:00 crc kubenswrapper[4962]: I0220 10:09:00.211042 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-hb87m"] Feb 20 10:09:00 crc kubenswrapper[4962]: W0220 10:09:00.211899 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7135845d_f595_42df_9773_7701c9a0b2e2.slice/crio-8d2d88e7d76f61bc68fdda8fe64da2313a3bf100985cef149b302f8765d17754 WatchSource:0}: Error finding container 8d2d88e7d76f61bc68fdda8fe64da2313a3bf100985cef149b302f8765d17754: Status 404 returned error can't find the container with id 8d2d88e7d76f61bc68fdda8fe64da2313a3bf100985cef149b302f8765d17754 Feb 20 10:09:00 crc kubenswrapper[4962]: I0220 10:09:00.322486 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-69bbfbf88f-29wdn"] Feb 20 10:09:00 crc kubenswrapper[4962]: W0220 10:09:00.332758 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7af7ee52_8865_48ce_85e5_7b62fb0d67d3.slice/crio-7f06a781a6c2dc25020e232dd422a30c8758e0e89951b5de20b6fbcd9aad2c39 WatchSource:0}: Error finding container 7f06a781a6c2dc25020e232dd422a30c8758e0e89951b5de20b6fbcd9aad2c39: Status 404 returned error can't find the container with id 7f06a781a6c2dc25020e232dd422a30c8758e0e89951b5de20b6fbcd9aad2c39 Feb 20 10:09:00 crc kubenswrapper[4962]: I0220 10:09:00.619831 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/c8b5efc7-c8c4-4492-a8a9-31eaecfa8374-memberlist\") pod \"speaker-rx2lw\" (UID: \"c8b5efc7-c8c4-4492-a8a9-31eaecfa8374\") " pod="metallb-system/speaker-rx2lw" Feb 20 10:09:00 crc kubenswrapper[4962]: I0220 10:09:00.625675 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/c8b5efc7-c8c4-4492-a8a9-31eaecfa8374-memberlist\") pod \"speaker-rx2lw\" (UID: \"c8b5efc7-c8c4-4492-a8a9-31eaecfa8374\") " pod="metallb-system/speaker-rx2lw" Feb 20 10:09:00 crc kubenswrapper[4962]: I0220 10:09:00.743361 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-rx2lw" Feb 20 10:09:00 crc kubenswrapper[4962]: W0220 10:09:00.769444 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8b5efc7_c8c4_4492_a8a9_31eaecfa8374.slice/crio-53b5611abf50fb4282b4b407ac315d043c6e0565d56b8fe7d179e0cac58ccbae WatchSource:0}: Error finding container 53b5611abf50fb4282b4b407ac315d043c6e0565d56b8fe7d179e0cac58ccbae: Status 404 returned error can't find the container with id 53b5611abf50fb4282b4b407ac315d043c6e0565d56b8fe7d179e0cac58ccbae Feb 20 10:09:00 crc kubenswrapper[4962]: I0220 10:09:00.866849 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-29wdn" event={"ID":"7af7ee52-8865-48ce-85e5-7b62fb0d67d3","Type":"ContainerStarted","Data":"573220f51e873ebd9a38f2bb1b436efcadf5368d005fc13f5d3fc5b28e0c1024"} Feb 20 10:09:00 crc kubenswrapper[4962]: I0220 10:09:00.866919 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-29wdn" event={"ID":"7af7ee52-8865-48ce-85e5-7b62fb0d67d3","Type":"ContainerStarted","Data":"0e4f4f5b504f209130a89ed33701043c3e959c6d3f5bd517720c6f71e47d8e68"} Feb 20 10:09:00 crc kubenswrapper[4962]: I0220 10:09:00.866937 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-29wdn" event={"ID":"7af7ee52-8865-48ce-85e5-7b62fb0d67d3","Type":"ContainerStarted","Data":"7f06a781a6c2dc25020e232dd422a30c8758e0e89951b5de20b6fbcd9aad2c39"} Feb 20 10:09:00 crc kubenswrapper[4962]: I0220 10:09:00.866990 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-69bbfbf88f-29wdn" Feb 20 10:09:00 crc kubenswrapper[4962]: I0220 10:09:00.869461 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-rx2lw" event={"ID":"c8b5efc7-c8c4-4492-a8a9-31eaecfa8374","Type":"ContainerStarted","Data":"53b5611abf50fb4282b4b407ac315d043c6e0565d56b8fe7d179e0cac58ccbae"} Feb 20 10:09:00 crc kubenswrapper[4962]: I0220 10:09:00.871260 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-hb87m" event={"ID":"7135845d-f595-42df-9773-7701c9a0b2e2","Type":"ContainerStarted","Data":"8d2d88e7d76f61bc68fdda8fe64da2313a3bf100985cef149b302f8765d17754"} Feb 20 10:09:00 crc kubenswrapper[4962]: I0220 10:09:00.888760 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-69bbfbf88f-29wdn" podStartSLOduration=2.888732746 podStartE2EDuration="2.888732746s" podCreationTimestamp="2026-02-20 10:08:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:09:00.881938889 +0000 UTC m=+832.464410755" watchObservedRunningTime="2026-02-20 10:09:00.888732746 +0000 UTC m=+832.471204592" Feb 20 10:09:01 crc kubenswrapper[4962]: I0220 10:09:01.903669 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-rx2lw" event={"ID":"c8b5efc7-c8c4-4492-a8a9-31eaecfa8374","Type":"ContainerStarted","Data":"37cb2dbb2252525197eeb148128697f9a359ceef5ca33e7792293725986d53b3"} Feb 20 10:09:01 crc kubenswrapper[4962]: I0220 10:09:01.903940 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-rx2lw" event={"ID":"c8b5efc7-c8c4-4492-a8a9-31eaecfa8374","Type":"ContainerStarted","Data":"d28244abb1b7d99b98d17d0d2301f797137939e540cead347e8472899ceb4720"} 
Feb 20 10:09:01 crc kubenswrapper[4962]: I0220 10:09:01.904846 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-rx2lw" Feb 20 10:09:01 crc kubenswrapper[4962]: I0220 10:09:01.936056 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-rx2lw" podStartSLOduration=3.936038302 podStartE2EDuration="3.936038302s" podCreationTimestamp="2026-02-20 10:08:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:09:01.935544157 +0000 UTC m=+833.518016003" watchObservedRunningTime="2026-02-20 10:09:01.936038302 +0000 UTC m=+833.518510148" Feb 20 10:09:06 crc kubenswrapper[4962]: I0220 10:09:06.961031 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-hb87m" event={"ID":"7135845d-f595-42df-9773-7701c9a0b2e2","Type":"ContainerStarted","Data":"81cd0f3e683d45f7fa282b87fd7e0f42002369ac374a976748add17cc626018f"} Feb 20 10:09:06 crc kubenswrapper[4962]: I0220 10:09:06.961763 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-hb87m" Feb 20 10:09:06 crc kubenswrapper[4962]: I0220 10:09:06.963618 4962 generic.go:334] "Generic (PLEG): container finished" podID="3eb8e16a-ffc3-4756-a3ee-96473eecf85d" containerID="6cea3ca8b602ce4569b14e9517c3e36ed0bc21260e7a380fb219062b228d7f8c" exitCode=0 Feb 20 10:09:06 crc kubenswrapper[4962]: I0220 10:09:06.963682 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zf82t" event={"ID":"3eb8e16a-ffc3-4756-a3ee-96473eecf85d","Type":"ContainerDied","Data":"6cea3ca8b602ce4569b14e9517c3e36ed0bc21260e7a380fb219062b228d7f8c"} Feb 20 10:09:06 crc kubenswrapper[4962]: I0220 10:09:06.989432 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-hb87m" podStartSLOduration=2.730846729 podStartE2EDuration="8.98941483s" podCreationTimestamp="2026-02-20 10:08:58 +0000 UTC" firstStartedPulling="2026-02-20 10:09:00.214386559 +0000 UTC m=+831.796858405" lastFinishedPulling="2026-02-20 10:09:06.47295466 +0000 UTC m=+838.055426506" observedRunningTime="2026-02-20 10:09:06.984973375 +0000 UTC m=+838.567445221" watchObservedRunningTime="2026-02-20 10:09:06.98941483 +0000 UTC m=+838.571886666" Feb 20 10:09:07 crc kubenswrapper[4962]: I0220 10:09:07.972895 4962 generic.go:334] "Generic (PLEG): container finished" podID="3eb8e16a-ffc3-4756-a3ee-96473eecf85d" containerID="886c4bacfd41c907d3736bcbd045ec59d695c6b426c64992c62e0b0c83dc625b" exitCode=0 Feb 20 10:09:07 crc kubenswrapper[4962]: I0220 10:09:07.972985 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zf82t" event={"ID":"3eb8e16a-ffc3-4756-a3ee-96473eecf85d","Type":"ContainerDied","Data":"886c4bacfd41c907d3736bcbd045ec59d695c6b426c64992c62e0b0c83dc625b"} Feb 20 10:09:08 crc kubenswrapper[4962]: I0220 10:09:08.988992 4962 generic.go:334] "Generic (PLEG): container finished" podID="3eb8e16a-ffc3-4756-a3ee-96473eecf85d" containerID="8d103a0e98d7a3ece8599bebac16aff98d0de848875efe9471b4c419d46dfeaa" exitCode=0 Feb 20 10:09:08 crc kubenswrapper[4962]: I0220 10:09:08.989068 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zf82t" 
event={"ID":"3eb8e16a-ffc3-4756-a3ee-96473eecf85d","Type":"ContainerDied","Data":"8d103a0e98d7a3ece8599bebac16aff98d0de848875efe9471b4c419d46dfeaa"} Feb 20 10:09:10 crc kubenswrapper[4962]: I0220 10:09:10.010143 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zf82t" event={"ID":"3eb8e16a-ffc3-4756-a3ee-96473eecf85d","Type":"ContainerStarted","Data":"40616a045725a9afa05b8488394bad8c6195fdf8f1f6ee164c766cd56717cee4"} Feb 20 10:09:10 crc kubenswrapper[4962]: I0220 10:09:10.010685 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zf82t" event={"ID":"3eb8e16a-ffc3-4756-a3ee-96473eecf85d","Type":"ContainerStarted","Data":"c48504fd838e904c7b1daad5c78619d61ec2136d7312c27042b01a98bef0fb8c"} Feb 20 10:09:10 crc kubenswrapper[4962]: I0220 10:09:10.010696 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zf82t" event={"ID":"3eb8e16a-ffc3-4756-a3ee-96473eecf85d","Type":"ContainerStarted","Data":"2f9b7b3f82f874d5e373b8865caf55f62599fe461f368ef20a048b541beab728"} Feb 20 10:09:10 crc kubenswrapper[4962]: I0220 10:09:10.010704 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zf82t" event={"ID":"3eb8e16a-ffc3-4756-a3ee-96473eecf85d","Type":"ContainerStarted","Data":"d9b5b9275fc2bd36861964938112a3904a31bd2b563c1c792949a36580d805d0"} Feb 20 10:09:10 crc kubenswrapper[4962]: I0220 10:09:10.010713 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zf82t" event={"ID":"3eb8e16a-ffc3-4756-a3ee-96473eecf85d","Type":"ContainerStarted","Data":"e0fa5df594f5b3065d16fca7a01ffee712861bd476ce54376311ec19c8621228"} Feb 20 10:09:10 crc kubenswrapper[4962]: I0220 10:09:10.748423 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-rx2lw" Feb 20 10:09:11 crc kubenswrapper[4962]: I0220 10:09:11.024764 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zf82t" event={"ID":"3eb8e16a-ffc3-4756-a3ee-96473eecf85d","Type":"ContainerStarted","Data":"49ae5aadd012b574e8b8bdfd11b85cf59cc7613ee4962ea13e534d14fe466c03"} Feb 20 10:09:11 crc kubenswrapper[4962]: I0220 10:09:11.025090 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-zf82t" Feb 20 10:09:11 crc kubenswrapper[4962]: I0220 10:09:11.063808 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-zf82t" podStartSLOduration=5.884607316 podStartE2EDuration="13.063776032s" podCreationTimestamp="2026-02-20 10:08:58 +0000 UTC" firstStartedPulling="2026-02-20 10:08:59.285343058 +0000 UTC m=+830.867814904" lastFinishedPulling="2026-02-20 10:09:06.464511774 +0000 UTC m=+838.046983620" observedRunningTime="2026-02-20 10:09:11.060215205 +0000 UTC m=+842.642687091" watchObservedRunningTime="2026-02-20 10:09:11.063776032 +0000 UTC m=+842.646247908" Feb 20 10:09:12 crc kubenswrapper[4962]: I0220 10:09:12.622614 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qft8z"] Feb 20 10:09:12 crc kubenswrapper[4962]: I0220 10:09:12.623837 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qft8z" Feb 20 10:09:12 crc kubenswrapper[4962]: I0220 10:09:12.625951 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 20 10:09:12 crc kubenswrapper[4962]: I0220 10:09:12.673541 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qft8z"] Feb 20 10:09:12 crc kubenswrapper[4962]: I0220 10:09:12.718224 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9536c987-ff07-45d5-b8c8-12cfe3019427-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qft8z\" (UID: \"9536c987-ff07-45d5-b8c8-12cfe3019427\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qft8z" Feb 20 10:09:12 crc kubenswrapper[4962]: I0220 10:09:12.718306 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5vl9\" (UniqueName: \"kubernetes.io/projected/9536c987-ff07-45d5-b8c8-12cfe3019427-kube-api-access-l5vl9\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qft8z\" (UID: \"9536c987-ff07-45d5-b8c8-12cfe3019427\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qft8z" Feb 20 10:09:12 crc kubenswrapper[4962]: I0220 10:09:12.718395 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9536c987-ff07-45d5-b8c8-12cfe3019427-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qft8z\" (UID: \"9536c987-ff07-45d5-b8c8-12cfe3019427\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qft8z" Feb 20 10:09:12 crc kubenswrapper[4962]: I0220 10:09:12.820055 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9536c987-ff07-45d5-b8c8-12cfe3019427-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qft8z\" (UID: \"9536c987-ff07-45d5-b8c8-12cfe3019427\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qft8z" Feb 20 10:09:12 crc kubenswrapper[4962]: I0220 10:09:12.820131 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9536c987-ff07-45d5-b8c8-12cfe3019427-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qft8z\" (UID: \"9536c987-ff07-45d5-b8c8-12cfe3019427\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qft8z" Feb 20 10:09:12 crc kubenswrapper[4962]: I0220 10:09:12.820162 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5vl9\" (UniqueName: \"kubernetes.io/projected/9536c987-ff07-45d5-b8c8-12cfe3019427-kube-api-access-l5vl9\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qft8z\" (UID: \"9536c987-ff07-45d5-b8c8-12cfe3019427\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qft8z" Feb 20 10:09:12 crc kubenswrapper[4962]: I0220 10:09:12.820583 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/9536c987-ff07-45d5-b8c8-12cfe3019427-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qft8z\" (UID: \"9536c987-ff07-45d5-b8c8-12cfe3019427\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qft8z" Feb 20 10:09:12 crc kubenswrapper[4962]: I0220 10:09:12.820824 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9536c987-ff07-45d5-b8c8-12cfe3019427-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qft8z\" (UID: \"9536c987-ff07-45d5-b8c8-12cfe3019427\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qft8z" Feb 20 10:09:12 crc kubenswrapper[4962]: I0220 10:09:12.847702 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5vl9\" (UniqueName: \"kubernetes.io/projected/9536c987-ff07-45d5-b8c8-12cfe3019427-kube-api-access-l5vl9\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qft8z\" (UID: \"9536c987-ff07-45d5-b8c8-12cfe3019427\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qft8z" Feb 20 10:09:12 crc kubenswrapper[4962]: I0220 10:09:12.938473 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qft8z" Feb 20 10:09:13 crc kubenswrapper[4962]: I0220 10:09:13.148954 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qft8z"] Feb 20 10:09:14 crc kubenswrapper[4962]: I0220 10:09:14.050980 4962 generic.go:334] "Generic (PLEG): container finished" podID="9536c987-ff07-45d5-b8c8-12cfe3019427" containerID="4aeee85e2c14d3a7fe2f8c49af3f661e1408188a571d437d5da3e2e875197af8" exitCode=0 Feb 20 10:09:14 crc kubenswrapper[4962]: I0220 10:09:14.051125 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qft8z" event={"ID":"9536c987-ff07-45d5-b8c8-12cfe3019427","Type":"ContainerDied","Data":"4aeee85e2c14d3a7fe2f8c49af3f661e1408188a571d437d5da3e2e875197af8"} Feb 20 10:09:14 crc kubenswrapper[4962]: I0220 10:09:14.052100 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qft8z" event={"ID":"9536c987-ff07-45d5-b8c8-12cfe3019427","Type":"ContainerStarted","Data":"a83ddc1d1dbabac527837db39de85778c0930ece92079843f18edda6431e7e29"} Feb 20 10:09:14 crc kubenswrapper[4962]: I0220 10:09:14.158222 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-zf82t" Feb 20 10:09:14 crc kubenswrapper[4962]: I0220 10:09:14.216376 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-zf82t" Feb 20 10:09:19 crc kubenswrapper[4962]: I0220 10:09:19.090489 4962 generic.go:334] "Generic (PLEG): container finished" podID="9536c987-ff07-45d5-b8c8-12cfe3019427" containerID="6b15885b4927d9d19a457c546d9edc1cfe227be824780684ea21ca71fd55db66" exitCode=0 Feb 20 10:09:19 crc kubenswrapper[4962]: I0220 10:09:19.090705 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qft8z" 
event={"ID":"9536c987-ff07-45d5-b8c8-12cfe3019427","Type":"ContainerDied","Data":"6b15885b4927d9d19a457c546d9edc1cfe227be824780684ea21ca71fd55db66"} Feb 20 10:09:19 crc kubenswrapper[4962]: I0220 10:09:19.165312 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-zf82t" Feb 20 10:09:19 crc kubenswrapper[4962]: I0220 10:09:19.774663 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-hb87m" Feb 20 10:09:19 crc kubenswrapper[4962]: I0220 10:09:19.870923 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-69bbfbf88f-29wdn" Feb 20 10:09:20 crc kubenswrapper[4962]: I0220 10:09:20.105668 4962 generic.go:334] "Generic (PLEG): container finished" podID="9536c987-ff07-45d5-b8c8-12cfe3019427" containerID="1d9fa17133e4014eadec9971d9d94641bdabf9b142de77f71866cc7b4c033342" exitCode=0 Feb 20 10:09:20 crc kubenswrapper[4962]: I0220 10:09:20.105787 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qft8z" event={"ID":"9536c987-ff07-45d5-b8c8-12cfe3019427","Type":"ContainerDied","Data":"1d9fa17133e4014eadec9971d9d94641bdabf9b142de77f71866cc7b4c033342"} Feb 20 10:09:21 crc kubenswrapper[4962]: I0220 10:09:21.442761 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qft8z" Feb 20 10:09:21 crc kubenswrapper[4962]: I0220 10:09:21.564941 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9536c987-ff07-45d5-b8c8-12cfe3019427-bundle\") pod \"9536c987-ff07-45d5-b8c8-12cfe3019427\" (UID: \"9536c987-ff07-45d5-b8c8-12cfe3019427\") " Feb 20 10:09:21 crc kubenswrapper[4962]: I0220 10:09:21.565154 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5vl9\" (UniqueName: \"kubernetes.io/projected/9536c987-ff07-45d5-b8c8-12cfe3019427-kube-api-access-l5vl9\") pod \"9536c987-ff07-45d5-b8c8-12cfe3019427\" (UID: \"9536c987-ff07-45d5-b8c8-12cfe3019427\") " Feb 20 10:09:21 crc kubenswrapper[4962]: I0220 10:09:21.565184 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9536c987-ff07-45d5-b8c8-12cfe3019427-util\") pod \"9536c987-ff07-45d5-b8c8-12cfe3019427\" (UID: \"9536c987-ff07-45d5-b8c8-12cfe3019427\") " Feb 20 10:09:21 crc kubenswrapper[4962]: I0220 10:09:21.566233 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9536c987-ff07-45d5-b8c8-12cfe3019427-bundle" (OuterVolumeSpecName: "bundle") pod "9536c987-ff07-45d5-b8c8-12cfe3019427" (UID: "9536c987-ff07-45d5-b8c8-12cfe3019427"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:09:21 crc kubenswrapper[4962]: I0220 10:09:21.573757 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9536c987-ff07-45d5-b8c8-12cfe3019427-kube-api-access-l5vl9" (OuterVolumeSpecName: "kube-api-access-l5vl9") pod "9536c987-ff07-45d5-b8c8-12cfe3019427" (UID: "9536c987-ff07-45d5-b8c8-12cfe3019427"). InnerVolumeSpecName "kube-api-access-l5vl9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:09:21 crc kubenswrapper[4962]: I0220 10:09:21.586776 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9536c987-ff07-45d5-b8c8-12cfe3019427-util" (OuterVolumeSpecName: "util") pod "9536c987-ff07-45d5-b8c8-12cfe3019427" (UID: "9536c987-ff07-45d5-b8c8-12cfe3019427"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:09:21 crc kubenswrapper[4962]: I0220 10:09:21.668266 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5vl9\" (UniqueName: \"kubernetes.io/projected/9536c987-ff07-45d5-b8c8-12cfe3019427-kube-api-access-l5vl9\") on node \"crc\" DevicePath \"\"" Feb 20 10:09:21 crc kubenswrapper[4962]: I0220 10:09:21.668638 4962 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9536c987-ff07-45d5-b8c8-12cfe3019427-util\") on node \"crc\" DevicePath \"\"" Feb 20 10:09:21 crc kubenswrapper[4962]: I0220 10:09:21.668649 4962 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9536c987-ff07-45d5-b8c8-12cfe3019427-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:09:22 crc kubenswrapper[4962]: I0220 10:09:22.127284 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qft8z" event={"ID":"9536c987-ff07-45d5-b8c8-12cfe3019427","Type":"ContainerDied","Data":"a83ddc1d1dbabac527837db39de85778c0930ece92079843f18edda6431e7e29"} Feb 20 10:09:22 crc kubenswrapper[4962]: I0220 10:09:22.127356 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a83ddc1d1dbabac527837db39de85778c0930ece92079843f18edda6431e7e29" Feb 20 10:09:22 crc kubenswrapper[4962]: I0220 10:09:22.127410 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qft8z" Feb 20 10:09:26 crc kubenswrapper[4962]: I0220 10:09:26.780131 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-vfv7w"] Feb 20 10:09:26 crc kubenswrapper[4962]: E0220 10:09:26.780827 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9536c987-ff07-45d5-b8c8-12cfe3019427" containerName="extract" Feb 20 10:09:26 crc kubenswrapper[4962]: I0220 10:09:26.780847 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="9536c987-ff07-45d5-b8c8-12cfe3019427" containerName="extract" Feb 20 10:09:26 crc kubenswrapper[4962]: E0220 10:09:26.780879 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9536c987-ff07-45d5-b8c8-12cfe3019427" containerName="pull" Feb 20 10:09:26 crc kubenswrapper[4962]: I0220 10:09:26.780887 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="9536c987-ff07-45d5-b8c8-12cfe3019427" containerName="pull" Feb 20 10:09:26 crc kubenswrapper[4962]: E0220 10:09:26.780899 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9536c987-ff07-45d5-b8c8-12cfe3019427" containerName="util" Feb 20 10:09:26 crc kubenswrapper[4962]: I0220 10:09:26.780906 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="9536c987-ff07-45d5-b8c8-12cfe3019427" containerName="util" Feb 20 10:09:26 crc kubenswrapper[4962]: I0220 10:09:26.781027 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="9536c987-ff07-45d5-b8c8-12cfe3019427" containerName="extract" Feb 20 10:09:26 crc kubenswrapper[4962]: I0220 10:09:26.781555 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-vfv7w" Feb 20 10:09:26 crc kubenswrapper[4962]: I0220 10:09:26.784923 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Feb 20 10:09:26 crc kubenswrapper[4962]: I0220 10:09:26.786719 4962 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-fsqpv" Feb 20 10:09:26 crc kubenswrapper[4962]: I0220 10:09:26.789252 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Feb 20 10:09:26 crc kubenswrapper[4962]: I0220 10:09:26.845126 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-vfv7w"] Feb 20 10:09:26 crc kubenswrapper[4962]: I0220 10:09:26.947673 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsxmk\" (UniqueName: \"kubernetes.io/projected/74abbb4a-2e6c-459a-8646-28b2519ca98a-kube-api-access-vsxmk\") pod \"cert-manager-operator-controller-manager-66c8bdd694-vfv7w\" (UID: \"74abbb4a-2e6c-459a-8646-28b2519ca98a\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-vfv7w" Feb 20 10:09:26 crc kubenswrapper[4962]: I0220 10:09:26.947742 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/74abbb4a-2e6c-459a-8646-28b2519ca98a-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-vfv7w\" (UID: \"74abbb4a-2e6c-459a-8646-28b2519ca98a\") " 
pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-vfv7w" Feb 20 10:09:27 crc kubenswrapper[4962]: I0220 10:09:27.049888 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/74abbb4a-2e6c-459a-8646-28b2519ca98a-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-vfv7w\" (UID: \"74abbb4a-2e6c-459a-8646-28b2519ca98a\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-vfv7w" Feb 20 10:09:27 crc kubenswrapper[4962]: I0220 10:09:27.050193 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsxmk\" (UniqueName: \"kubernetes.io/projected/74abbb4a-2e6c-459a-8646-28b2519ca98a-kube-api-access-vsxmk\") pod \"cert-manager-operator-controller-manager-66c8bdd694-vfv7w\" (UID: \"74abbb4a-2e6c-459a-8646-28b2519ca98a\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-vfv7w" Feb 20 10:09:27 crc kubenswrapper[4962]: I0220 10:09:27.050714 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/74abbb4a-2e6c-459a-8646-28b2519ca98a-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-vfv7w\" (UID: \"74abbb4a-2e6c-459a-8646-28b2519ca98a\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-vfv7w" Feb 20 10:09:27 crc kubenswrapper[4962]: I0220 10:09:27.077511 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsxmk\" (UniqueName: \"kubernetes.io/projected/74abbb4a-2e6c-459a-8646-28b2519ca98a-kube-api-access-vsxmk\") pod \"cert-manager-operator-controller-manager-66c8bdd694-vfv7w\" (UID: \"74abbb4a-2e6c-459a-8646-28b2519ca98a\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-vfv7w" Feb 20 10:09:27 crc kubenswrapper[4962]: I0220 10:09:27.097888 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-vfv7w" Feb 20 10:09:27 crc kubenswrapper[4962]: I0220 10:09:27.540520 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-vfv7w"] Feb 20 10:09:27 crc kubenswrapper[4962]: W0220 10:09:27.551509 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74abbb4a_2e6c_459a_8646_28b2519ca98a.slice/crio-e0db397a7d6c731596da26dd7fa014ab2c07bae7626c76ecb1e61460dc6efa2e WatchSource:0}: Error finding container e0db397a7d6c731596da26dd7fa014ab2c07bae7626c76ecb1e61460dc6efa2e: Status 404 returned error can't find the container with id e0db397a7d6c731596da26dd7fa014ab2c07bae7626c76ecb1e61460dc6efa2e Feb 20 10:09:28 crc kubenswrapper[4962]: I0220 10:09:28.189575 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-vfv7w" event={"ID":"74abbb4a-2e6c-459a-8646-28b2519ca98a","Type":"ContainerStarted","Data":"e0db397a7d6c731596da26dd7fa014ab2c07bae7626c76ecb1e61460dc6efa2e"} Feb 20 10:09:32 crc kubenswrapper[4962]: I0220 10:09:32.226389 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-vfv7w" event={"ID":"74abbb4a-2e6c-459a-8646-28b2519ca98a","Type":"ContainerStarted","Data":"c6efaed722eff0e6ecc55d72bb48f6b38d1ec5846343e7e3f09fc5b3b3d35a5b"} Feb 20 10:09:32 crc kubenswrapper[4962]: I0220 10:09:32.253047 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-vfv7w" podStartSLOduration=2.715469846 podStartE2EDuration="6.253027125s" podCreationTimestamp="2026-02-20 10:09:26 +0000 UTC" firstStartedPulling="2026-02-20 10:09:27.55482445 +0000 UTC m=+859.137296296" lastFinishedPulling="2026-02-20 10:09:31.092381719 +0000 UTC m=+862.674853575" observedRunningTime="2026-02-20 10:09:32.249311804 +0000 UTC m=+863.831783650" watchObservedRunningTime="2026-02-20 10:09:32.253027125 +0000 UTC m=+863.835498971" Feb 20 10:09:35 crc kubenswrapper[4962]: I0220 10:09:35.072227 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-t5nv4"] Feb 20 10:09:35 crc kubenswrapper[4962]: I0220 10:09:35.073439 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-t5nv4" Feb 20 10:09:35 crc kubenswrapper[4962]: I0220 10:09:35.075935 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Feb 20 10:09:35 crc kubenswrapper[4962]: I0220 10:09:35.077937 4962 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-n7fx8" Feb 20 10:09:35 crc kubenswrapper[4962]: I0220 10:09:35.083377 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Feb 20 10:09:35 crc kubenswrapper[4962]: I0220 10:09:35.083963 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-t5nv4"] Feb 20 10:09:35 crc kubenswrapper[4962]: I0220 10:09:35.181534 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqf7j\" (UniqueName: \"kubernetes.io/projected/0d86f751-d081-47b7-a623-a9cc14ab43f7-kube-api-access-qqf7j\") pod \"cert-manager-webhook-6888856db4-t5nv4\" (UID: \"0d86f751-d081-47b7-a623-a9cc14ab43f7\") " pod="cert-manager/cert-manager-webhook-6888856db4-t5nv4" Feb 20 10:09:35 crc kubenswrapper[4962]: I0220 10:09:35.181657 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0d86f751-d081-47b7-a623-a9cc14ab43f7-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-t5nv4\" (UID: \"0d86f751-d081-47b7-a623-a9cc14ab43f7\") " pod="cert-manager/cert-manager-webhook-6888856db4-t5nv4" Feb 20 10:09:35 crc kubenswrapper[4962]: I0220 10:09:35.283256 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqf7j\" (UniqueName: \"kubernetes.io/projected/0d86f751-d081-47b7-a623-a9cc14ab43f7-kube-api-access-qqf7j\") pod \"cert-manager-webhook-6888856db4-t5nv4\" (UID: \"0d86f751-d081-47b7-a623-a9cc14ab43f7\") " pod="cert-manager/cert-manager-webhook-6888856db4-t5nv4" Feb 20 10:09:35 crc kubenswrapper[4962]: I0220 10:09:35.283401 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0d86f751-d081-47b7-a623-a9cc14ab43f7-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-t5nv4\" (UID: \"0d86f751-d081-47b7-a623-a9cc14ab43f7\") " pod="cert-manager/cert-manager-webhook-6888856db4-t5nv4" Feb 20 10:09:35 crc kubenswrapper[4962]: I0220 10:09:35.301258 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0d86f751-d081-47b7-a623-a9cc14ab43f7-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-t5nv4\" (UID: \"0d86f751-d081-47b7-a623-a9cc14ab43f7\") " pod="cert-manager/cert-manager-webhook-6888856db4-t5nv4" Feb 20 10:09:35 crc kubenswrapper[4962]: I0220 10:09:35.302192 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqf7j\" (UniqueName: \"kubernetes.io/projected/0d86f751-d081-47b7-a623-a9cc14ab43f7-kube-api-access-qqf7j\") pod \"cert-manager-webhook-6888856db4-t5nv4\" (UID: \"0d86f751-d081-47b7-a623-a9cc14ab43f7\") " pod="cert-manager/cert-manager-webhook-6888856db4-t5nv4" Feb 20 10:09:35 crc kubenswrapper[4962]: I0220 10:09:35.389624 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-t5nv4" Feb 20 10:09:35 crc kubenswrapper[4962]: I0220 10:09:35.813971 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-t5nv4"] Feb 20 10:09:36 crc kubenswrapper[4962]: I0220 10:09:36.251138 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-t5nv4" event={"ID":"0d86f751-d081-47b7-a623-a9cc14ab43f7","Type":"ContainerStarted","Data":"87c431b9e2a9770b37d6b34a97b11944b5042bd6cd73b84af91fd09b7ca9b405"} Feb 20 10:09:37 crc kubenswrapper[4962]: I0220 10:09:37.357675 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-lvkdh"] Feb 20 10:09:37 crc kubenswrapper[4962]: I0220 10:09:37.358917 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-lvkdh" Feb 20 10:09:37 crc kubenswrapper[4962]: I0220 10:09:37.361900 4962 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-wnw7f" Feb 20 10:09:37 crc kubenswrapper[4962]: I0220 10:09:37.373153 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-lvkdh"] Feb 20 10:09:37 crc kubenswrapper[4962]: I0220 10:09:37.517384 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wg8cd\" (UniqueName: \"kubernetes.io/projected/4e81b5fc-4c0d-4065-a88b-9fa40ea1d1b3-kube-api-access-wg8cd\") pod \"cert-manager-cainjector-5545bd876-lvkdh\" (UID: \"4e81b5fc-4c0d-4065-a88b-9fa40ea1d1b3\") " pod="cert-manager/cert-manager-cainjector-5545bd876-lvkdh" Feb 20 10:09:37 crc kubenswrapper[4962]: I0220 10:09:37.517438 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4e81b5fc-4c0d-4065-a88b-9fa40ea1d1b3-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-lvkdh\" (UID: \"4e81b5fc-4c0d-4065-a88b-9fa40ea1d1b3\") " pod="cert-manager/cert-manager-cainjector-5545bd876-lvkdh" Feb 20 10:09:37 crc kubenswrapper[4962]: I0220 10:09:37.619194 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wg8cd\" (UniqueName: \"kubernetes.io/projected/4e81b5fc-4c0d-4065-a88b-9fa40ea1d1b3-kube-api-access-wg8cd\") pod \"cert-manager-cainjector-5545bd876-lvkdh\" (UID: \"4e81b5fc-4c0d-4065-a88b-9fa40ea1d1b3\") " pod="cert-manager/cert-manager-cainjector-5545bd876-lvkdh" Feb 20 10:09:37 crc kubenswrapper[4962]: I0220 10:09:37.619251 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4e81b5fc-4c0d-4065-a88b-9fa40ea1d1b3-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-lvkdh\" (UID: \"4e81b5fc-4c0d-4065-a88b-9fa40ea1d1b3\") " pod="cert-manager/cert-manager-cainjector-5545bd876-lvkdh" Feb 20 10:09:37 crc kubenswrapper[4962]: I0220 10:09:37.685737 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wg8cd\" (UniqueName: \"kubernetes.io/projected/4e81b5fc-4c0d-4065-a88b-9fa40ea1d1b3-kube-api-access-wg8cd\") pod \"cert-manager-cainjector-5545bd876-lvkdh\" (UID: \"4e81b5fc-4c0d-4065-a88b-9fa40ea1d1b3\") " pod="cert-manager/cert-manager-cainjector-5545bd876-lvkdh" Feb 20 10:09:37 crc kubenswrapper[4962]: I0220 10:09:37.686531 4962 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4e81b5fc-4c0d-4065-a88b-9fa40ea1d1b3-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-lvkdh\" (UID: \"4e81b5fc-4c0d-4065-a88b-9fa40ea1d1b3\") " pod="cert-manager/cert-manager-cainjector-5545bd876-lvkdh" Feb 20 10:09:37 crc kubenswrapper[4962]: I0220 10:09:37.986314 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-lvkdh" Feb 20 10:09:38 crc kubenswrapper[4962]: I0220 10:09:38.238111 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-lvkdh"] Feb 20 10:09:38 crc kubenswrapper[4962]: I0220 10:09:38.268262 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-lvkdh" event={"ID":"4e81b5fc-4c0d-4065-a88b-9fa40ea1d1b3","Type":"ContainerStarted","Data":"2b17b3521afe03e996822dd9946a92c6704e68baedc01dcb608100b67f0b1aa1"} Feb 20 10:09:41 crc kubenswrapper[4962]: I0220 10:09:41.288906 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-lvkdh" event={"ID":"4e81b5fc-4c0d-4065-a88b-9fa40ea1d1b3","Type":"ContainerStarted","Data":"06a45cebf86a3e6873275b4afae9441485ff21c7e7aab0713828afea39bb6a78"} Feb 20 10:09:41 crc kubenswrapper[4962]: I0220 10:09:41.291522 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-t5nv4" event={"ID":"0d86f751-d081-47b7-a623-a9cc14ab43f7","Type":"ContainerStarted","Data":"126d53205b09856ec052f966183bef2386f5de5d7433b4c517ad8d0e9e1008b2"} Feb 20 10:09:41 crc kubenswrapper[4962]: I0220 10:09:41.292083 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-6888856db4-t5nv4" Feb 20 10:09:41 crc kubenswrapper[4962]: I0220 10:09:41.314956 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-5545bd876-lvkdh" podStartSLOduration=1.792623007 podStartE2EDuration="4.31493072s" podCreationTimestamp="2026-02-20 10:09:37 +0000 UTC" firstStartedPulling="2026-02-20 10:09:38.249585345 +0000 UTC m=+869.832057191" lastFinishedPulling="2026-02-20 10:09:40.771893068 +0000 UTC m=+872.354364904" observedRunningTime="2026-02-20 10:09:41.308179458 +0000 UTC m=+872.890651314" watchObservedRunningTime="2026-02-20 10:09:41.31493072 +0000 UTC m=+872.897402586" Feb 20 10:09:41 crc kubenswrapper[4962]: I0220 10:09:41.343670 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-6888856db4-t5nv4" podStartSLOduration=1.389267471 podStartE2EDuration="6.343645049s" podCreationTimestamp="2026-02-20 10:09:35 +0000 UTC" firstStartedPulling="2026-02-20 10:09:35.823854189 +0000 UTC m=+867.406326035" lastFinishedPulling="2026-02-20 10:09:40.778231767 +0000 UTC m=+872.360703613" observedRunningTime="2026-02-20 10:09:41.343310519 +0000 UTC m=+872.925782395" watchObservedRunningTime="2026-02-20 10:09:41.343645049 +0000 UTC m=+872.926116905" Feb 20 10:09:41 crc kubenswrapper[4962]: I0220 10:09:41.507816 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 10:09:41 crc kubenswrapper[4962]: 
I0220 10:09:41.507891 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 10:09:45 crc kubenswrapper[4962]: I0220 10:09:45.207227 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-545d4d4674-ctc7p"] Feb 20 10:09:45 crc kubenswrapper[4962]: I0220 10:09:45.209248 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-ctc7p" Feb 20 10:09:45 crc kubenswrapper[4962]: I0220 10:09:45.211960 4962 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-bc9pb" Feb 20 10:09:45 crc kubenswrapper[4962]: I0220 10:09:45.222708 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-ctc7p"] Feb 20 10:09:45 crc kubenswrapper[4962]: I0220 10:09:45.341816 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhbr8\" (UniqueName: \"kubernetes.io/projected/41c6ef1c-4069-44b1-a0ba-de5e820a630c-kube-api-access-jhbr8\") pod \"cert-manager-545d4d4674-ctc7p\" (UID: \"41c6ef1c-4069-44b1-a0ba-de5e820a630c\") " pod="cert-manager/cert-manager-545d4d4674-ctc7p" Feb 20 10:09:45 crc kubenswrapper[4962]: I0220 10:09:45.341874 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/41c6ef1c-4069-44b1-a0ba-de5e820a630c-bound-sa-token\") pod \"cert-manager-545d4d4674-ctc7p\" (UID: \"41c6ef1c-4069-44b1-a0ba-de5e820a630c\") " pod="cert-manager/cert-manager-545d4d4674-ctc7p" Feb 20 10:09:45 crc kubenswrapper[4962]: I0220 10:09:45.395289 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-6888856db4-t5nv4" Feb 20 10:09:45 crc kubenswrapper[4962]: I0220 10:09:45.443051 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhbr8\" (UniqueName: \"kubernetes.io/projected/41c6ef1c-4069-44b1-a0ba-de5e820a630c-kube-api-access-jhbr8\") pod \"cert-manager-545d4d4674-ctc7p\" (UID: \"41c6ef1c-4069-44b1-a0ba-de5e820a630c\") " pod="cert-manager/cert-manager-545d4d4674-ctc7p" Feb 20 10:09:45 crc kubenswrapper[4962]: I0220 10:09:45.443155 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/41c6ef1c-4069-44b1-a0ba-de5e820a630c-bound-sa-token\") pod \"cert-manager-545d4d4674-ctc7p\" (UID: \"41c6ef1c-4069-44b1-a0ba-de5e820a630c\") " pod="cert-manager/cert-manager-545d4d4674-ctc7p" Feb 20 10:09:45 crc kubenswrapper[4962]: I0220 10:09:45.469283 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/41c6ef1c-4069-44b1-a0ba-de5e820a630c-bound-sa-token\") pod \"cert-manager-545d4d4674-ctc7p\" (UID: \"41c6ef1c-4069-44b1-a0ba-de5e820a630c\") " pod="cert-manager/cert-manager-545d4d4674-ctc7p" Feb 20 10:09:45 crc kubenswrapper[4962]: I0220 10:09:45.482740 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhbr8\" (UniqueName: \"kubernetes.io/projected/41c6ef1c-4069-44b1-a0ba-de5e820a630c-kube-api-access-jhbr8\") pod 
\"cert-manager-545d4d4674-ctc7p\" (UID: \"41c6ef1c-4069-44b1-a0ba-de5e820a630c\") " pod="cert-manager/cert-manager-545d4d4674-ctc7p" Feb 20 10:09:45 crc kubenswrapper[4962]: I0220 10:09:45.526156 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-ctc7p" Feb 20 10:09:45 crc kubenswrapper[4962]: I0220 10:09:45.976938 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-ctc7p"] Feb 20 10:09:46 crc kubenswrapper[4962]: I0220 10:09:46.328778 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-ctc7p" event={"ID":"41c6ef1c-4069-44b1-a0ba-de5e820a630c","Type":"ContainerStarted","Data":"3cfa1d1bb027eaebb8a5d6e4408907632e040da623c8f1bf1860cc2ee622c7e2"} Feb 20 10:09:46 crc kubenswrapper[4962]: I0220 10:09:46.329312 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-ctc7p" event={"ID":"41c6ef1c-4069-44b1-a0ba-de5e820a630c","Type":"ContainerStarted","Data":"179085b91e733d55c7c4f421cf080d8c3f1001811031e19a46d6187d0fb17bda"} Feb 20 10:09:49 crc kubenswrapper[4962]: I0220 10:09:49.386773 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-545d4d4674-ctc7p" podStartSLOduration=4.386743671 podStartE2EDuration="4.386743671s" podCreationTimestamp="2026-02-20 10:09:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:09:46.368982589 +0000 UTC m=+877.951454515" watchObservedRunningTime="2026-02-20 10:09:49.386743671 +0000 UTC m=+880.969215527" Feb 20 10:09:49 crc kubenswrapper[4962]: I0220 10:09:49.392360 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-dtbt8"] Feb 20 10:09:49 crc kubenswrapper[4962]: I0220 10:09:49.393422 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-dtbt8" Feb 20 10:09:49 crc kubenswrapper[4962]: I0220 10:09:49.397802 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Feb 20 10:09:49 crc kubenswrapper[4962]: I0220 10:09:49.397821 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-shvls" Feb 20 10:09:49 crc kubenswrapper[4962]: I0220 10:09:49.398076 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Feb 20 10:09:49 crc kubenswrapper[4962]: I0220 10:09:49.409408 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-dtbt8"] Feb 20 10:09:49 crc kubenswrapper[4962]: I0220 10:09:49.514694 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nn9ts\" (UniqueName: \"kubernetes.io/projected/c3bc0b8f-a3b2-4549-aa20-dc609d7965fd-kube-api-access-nn9ts\") pod \"openstack-operator-index-dtbt8\" (UID: \"c3bc0b8f-a3b2-4549-aa20-dc609d7965fd\") " pod="openstack-operators/openstack-operator-index-dtbt8" Feb 20 10:09:49 crc kubenswrapper[4962]: I0220 10:09:49.617171 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nn9ts\" (UniqueName: \"kubernetes.io/projected/c3bc0b8f-a3b2-4549-aa20-dc609d7965fd-kube-api-access-nn9ts\") pod \"openstack-operator-index-dtbt8\" (UID: \"c3bc0b8f-a3b2-4549-aa20-dc609d7965fd\") " pod="openstack-operators/openstack-operator-index-dtbt8" Feb 20 10:09:49 crc kubenswrapper[4962]: I0220 10:09:49.639575 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nn9ts\" (UniqueName: \"kubernetes.io/projected/c3bc0b8f-a3b2-4549-aa20-dc609d7965fd-kube-api-access-nn9ts\") pod \"openstack-operator-index-dtbt8\" (UID: \"c3bc0b8f-a3b2-4549-aa20-dc609d7965fd\") " pod="openstack-operators/openstack-operator-index-dtbt8" Feb 20 10:09:49 crc kubenswrapper[4962]: I0220 10:09:49.758463 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-dtbt8" Feb 20 10:09:50 crc kubenswrapper[4962]: I0220 10:09:50.000310 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-dtbt8"] Feb 20 10:09:50 crc kubenswrapper[4962]: I0220 10:09:50.361917 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-dtbt8" event={"ID":"c3bc0b8f-a3b2-4549-aa20-dc609d7965fd","Type":"ContainerStarted","Data":"3852b8283668d47a9524139d85fe1441b979672d19315ed7bbee812f03c2018e"} Feb 20 10:09:52 crc kubenswrapper[4962]: I0220 10:09:52.382526 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-dtbt8" event={"ID":"c3bc0b8f-a3b2-4549-aa20-dc609d7965fd","Type":"ContainerStarted","Data":"55ff6f06680c1b3e73398f4884762d6ba1d4159d9662c02841d0644f332e41ae"} Feb 20 10:09:52 crc kubenswrapper[4962]: I0220 10:09:52.412403 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-dtbt8" podStartSLOduration=2.537836082 podStartE2EDuration="3.41237337s" podCreationTimestamp="2026-02-20 10:09:49 +0000 UTC" firstStartedPulling="2026-02-20 10:09:50.014850519 +0000 UTC m=+881.597322365" lastFinishedPulling="2026-02-20 10:09:50.889387807 +0000 UTC m=+882.471859653" observedRunningTime="2026-02-20 10:09:52.405342259 +0000 UTC m=+883.987814135" watchObservedRunningTime="2026-02-20 10:09:52.41237337 +0000 UTC m=+883.994845256" Feb 20 10:09:53 crc kubenswrapper[4962]: I0220 10:09:53.949790 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-dtbt8"] Feb 20 10:09:54 crc kubenswrapper[4962]: I0220 10:09:54.408024 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-dtbt8" podUID="c3bc0b8f-a3b2-4549-aa20-dc609d7965fd" containerName="registry-server" containerID="cri-o://55ff6f06680c1b3e73398f4884762d6ba1d4159d9662c02841d0644f332e41ae" gracePeriod=2 Feb 20 10:09:54 crc kubenswrapper[4962]: I0220 10:09:54.788507 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-t9zxk"] Feb 20 10:09:54 crc kubenswrapper[4962]: I0220 10:09:54.790487 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-t9zxk"] Feb 20 10:09:54 crc kubenswrapper[4962]: I0220 10:09:54.790645 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-t9zxk" Feb 20 10:09:54 crc kubenswrapper[4962]: I0220 10:09:54.909223 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tkk4\" (UniqueName: \"kubernetes.io/projected/46f437ac-c97a-4af9-92e7-6bec63b7d8d8-kube-api-access-8tkk4\") pod \"openstack-operator-index-t9zxk\" (UID: \"46f437ac-c97a-4af9-92e7-6bec63b7d8d8\") " pod="openstack-operators/openstack-operator-index-t9zxk" Feb 20 10:09:54 crc kubenswrapper[4962]: I0220 10:09:54.911283 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-dtbt8" Feb 20 10:09:55 crc kubenswrapper[4962]: I0220 10:09:55.011200 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nn9ts\" (UniqueName: \"kubernetes.io/projected/c3bc0b8f-a3b2-4549-aa20-dc609d7965fd-kube-api-access-nn9ts\") pod \"c3bc0b8f-a3b2-4549-aa20-dc609d7965fd\" (UID: \"c3bc0b8f-a3b2-4549-aa20-dc609d7965fd\") " Feb 20 10:09:55 crc kubenswrapper[4962]: I0220 10:09:55.011794 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tkk4\" (UniqueName: \"kubernetes.io/projected/46f437ac-c97a-4af9-92e7-6bec63b7d8d8-kube-api-access-8tkk4\") pod \"openstack-operator-index-t9zxk\" (UID: \"46f437ac-c97a-4af9-92e7-6bec63b7d8d8\") " pod="openstack-operators/openstack-operator-index-t9zxk" Feb 20 10:09:55 crc kubenswrapper[4962]: I0220 10:09:55.023710 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3bc0b8f-a3b2-4549-aa20-dc609d7965fd-kube-api-access-nn9ts" (OuterVolumeSpecName: "kube-api-access-nn9ts") pod "c3bc0b8f-a3b2-4549-aa20-dc609d7965fd" (UID: "c3bc0b8f-a3b2-4549-aa20-dc609d7965fd"). InnerVolumeSpecName "kube-api-access-nn9ts". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:09:55 crc kubenswrapper[4962]: I0220 10:09:55.036124 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tkk4\" (UniqueName: \"kubernetes.io/projected/46f437ac-c97a-4af9-92e7-6bec63b7d8d8-kube-api-access-8tkk4\") pod \"openstack-operator-index-t9zxk\" (UID: \"46f437ac-c97a-4af9-92e7-6bec63b7d8d8\") " pod="openstack-operators/openstack-operator-index-t9zxk" Feb 20 10:09:55 crc kubenswrapper[4962]: I0220 10:09:55.113616 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nn9ts\" (UniqueName: \"kubernetes.io/projected/c3bc0b8f-a3b2-4549-aa20-dc609d7965fd-kube-api-access-nn9ts\") on node \"crc\" DevicePath \"\"" Feb 20 10:09:55 crc kubenswrapper[4962]: I0220 10:09:55.123993 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-t9zxk" Feb 20 10:09:55 crc kubenswrapper[4962]: I0220 10:09:55.374883 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-t9zxk"] Feb 20 10:09:55 crc kubenswrapper[4962]: I0220 10:09:55.423642 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-t9zxk" event={"ID":"46f437ac-c97a-4af9-92e7-6bec63b7d8d8","Type":"ContainerStarted","Data":"818c2adbc401eff459d353ae2c94eb60c1b43885ba912f1729e7826b8c860b79"} Feb 20 10:09:55 crc kubenswrapper[4962]: I0220 10:09:55.428390 4962 generic.go:334] "Generic (PLEG): container finished" podID="c3bc0b8f-a3b2-4549-aa20-dc609d7965fd" containerID="55ff6f06680c1b3e73398f4884762d6ba1d4159d9662c02841d0644f332e41ae" exitCode=0 Feb 20 10:09:55 crc kubenswrapper[4962]: I0220 10:09:55.428429 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-dtbt8" event={"ID":"c3bc0b8f-a3b2-4549-aa20-dc609d7965fd","Type":"ContainerDied","Data":"55ff6f06680c1b3e73398f4884762d6ba1d4159d9662c02841d0644f332e41ae"} Feb 20 10:09:55 crc kubenswrapper[4962]: I0220 10:09:55.428463 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-dtbt8" event={"ID":"c3bc0b8f-a3b2-4549-aa20-dc609d7965fd","Type":"ContainerDied","Data":"3852b8283668d47a9524139d85fe1441b979672d19315ed7bbee812f03c2018e"} Feb 20 10:09:55 crc kubenswrapper[4962]: I0220 10:09:55.428488 4962 scope.go:117] "RemoveContainer" containerID="55ff6f06680c1b3e73398f4884762d6ba1d4159d9662c02841d0644f332e41ae" Feb 20 10:09:55 crc kubenswrapper[4962]: I0220 10:09:55.428521 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-dtbt8" Feb 20 10:09:55 crc kubenswrapper[4962]: I0220 10:09:55.456138 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-dtbt8"] Feb 20 10:09:55 crc kubenswrapper[4962]: I0220 10:09:55.462633 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-dtbt8"] Feb 20 10:09:55 crc kubenswrapper[4962]: I0220 10:09:55.468321 4962 scope.go:117] "RemoveContainer" containerID="55ff6f06680c1b3e73398f4884762d6ba1d4159d9662c02841d0644f332e41ae" Feb 20 10:09:55 crc kubenswrapper[4962]: E0220 10:09:55.468888 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55ff6f06680c1b3e73398f4884762d6ba1d4159d9662c02841d0644f332e41ae\": container with ID starting with 55ff6f06680c1b3e73398f4884762d6ba1d4159d9662c02841d0644f332e41ae not found: ID does not exist" containerID="55ff6f06680c1b3e73398f4884762d6ba1d4159d9662c02841d0644f332e41ae" Feb 20 10:09:55 crc kubenswrapper[4962]: I0220 10:09:55.468970 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55ff6f06680c1b3e73398f4884762d6ba1d4159d9662c02841d0644f332e41ae"} err="failed to get container status \"55ff6f06680c1b3e73398f4884762d6ba1d4159d9662c02841d0644f332e41ae\": rpc error: code = NotFound desc = could not find container \"55ff6f06680c1b3e73398f4884762d6ba1d4159d9662c02841d0644f332e41ae\": container with ID starting with 55ff6f06680c1b3e73398f4884762d6ba1d4159d9662c02841d0644f332e41ae not found: ID does not exist" Feb 20 10:09:56 crc kubenswrapper[4962]: I0220 10:09:56.439905 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-t9zxk" event={"ID":"46f437ac-c97a-4af9-92e7-6bec63b7d8d8","Type":"ContainerStarted","Data":"80d7893ce3e78f46e78f8e3a04c9130047c485f6f805e5e8a4af01ba647e8461"} Feb 20 10:09:56 crc kubenswrapper[4962]: I0220 10:09:56.464716 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-t9zxk" podStartSLOduration=1.819098275 podStartE2EDuration="2.464663715s" podCreationTimestamp="2026-02-20 10:09:54 +0000 UTC" firstStartedPulling="2026-02-20 10:09:55.392445514 +0000 UTC m=+886.974917400" lastFinishedPulling="2026-02-20 10:09:56.038010994 +0000 UTC m=+887.620482840" observedRunningTime="2026-02-20 10:09:56.45779669 +0000 UTC m=+888.040268606" watchObservedRunningTime="2026-02-20 10:09:56.464663715 +0000 UTC m=+888.047135601" Feb 20 10:09:56 crc kubenswrapper[4962]: I0220 10:09:56.565720 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bmc5d"] Feb 20 10:09:56 crc kubenswrapper[4962]: E0220 10:09:56.566178 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3bc0b8f-a3b2-4549-aa20-dc609d7965fd" containerName="registry-server" Feb 20 10:09:56 crc kubenswrapper[4962]: I0220 10:09:56.566202 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3bc0b8f-a3b2-4549-aa20-dc609d7965fd" containerName="registry-server" Feb 20 10:09:56 crc kubenswrapper[4962]: I0220 10:09:56.566439 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3bc0b8f-a3b2-4549-aa20-dc609d7965fd" containerName="registry-server" Feb 20 10:09:56 crc kubenswrapper[4962]: I0220 10:09:56.568582 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bmc5d" Feb 20 10:09:56 crc kubenswrapper[4962]: I0220 10:09:56.589786 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bmc5d"] Feb 20 10:09:56 crc kubenswrapper[4962]: I0220 10:09:56.637328 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f794364-dcf5-4d81-9edd-69f7a415540c-catalog-content\") pod \"redhat-marketplace-bmc5d\" (UID: \"7f794364-dcf5-4d81-9edd-69f7a415540c\") " pod="openshift-marketplace/redhat-marketplace-bmc5d" Feb 20 10:09:56 crc kubenswrapper[4962]: I0220 10:09:56.637478 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f794364-dcf5-4d81-9edd-69f7a415540c-utilities\") pod \"redhat-marketplace-bmc5d\" (UID: \"7f794364-dcf5-4d81-9edd-69f7a415540c\") " pod="openshift-marketplace/redhat-marketplace-bmc5d" Feb 20 10:09:56 crc kubenswrapper[4962]: I0220 10:09:56.637562 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qm84x\" (UniqueName: \"kubernetes.io/projected/7f794364-dcf5-4d81-9edd-69f7a415540c-kube-api-access-qm84x\") pod \"redhat-marketplace-bmc5d\" (UID: \"7f794364-dcf5-4d81-9edd-69f7a415540c\") " pod="openshift-marketplace/redhat-marketplace-bmc5d" Feb 20 10:09:56 crc kubenswrapper[4962]: I0220 10:09:56.739457 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f794364-dcf5-4d81-9edd-69f7a415540c-catalog-content\") pod \"redhat-marketplace-bmc5d\" (UID: \"7f794364-dcf5-4d81-9edd-69f7a415540c\") " pod="openshift-marketplace/redhat-marketplace-bmc5d" Feb 20 10:09:56 crc kubenswrapper[4962]: I0220 10:09:56.739543 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f794364-dcf5-4d81-9edd-69f7a415540c-utilities\") pod \"redhat-marketplace-bmc5d\" (UID: \"7f794364-dcf5-4d81-9edd-69f7a415540c\") " pod="openshift-marketplace/redhat-marketplace-bmc5d" Feb 20 10:09:56 crc kubenswrapper[4962]: I0220 10:09:56.739631 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qm84x\" (UniqueName: \"kubernetes.io/projected/7f794364-dcf5-4d81-9edd-69f7a415540c-kube-api-access-qm84x\") pod \"redhat-marketplace-bmc5d\" (UID: \"7f794364-dcf5-4d81-9edd-69f7a415540c\") " pod="openshift-marketplace/redhat-marketplace-bmc5d" Feb 20 10:09:56 crc kubenswrapper[4962]: I0220 10:09:56.740049 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f794364-dcf5-4d81-9edd-69f7a415540c-utilities\") pod \"redhat-marketplace-bmc5d\" (UID: \"7f794364-dcf5-4d81-9edd-69f7a415540c\") " pod="openshift-marketplace/redhat-marketplace-bmc5d" Feb 20 10:09:56 crc kubenswrapper[4962]: I0220 10:09:56.740209 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f794364-dcf5-4d81-9edd-69f7a415540c-catalog-content\") pod \"redhat-marketplace-bmc5d\" (UID: \"7f794364-dcf5-4d81-9edd-69f7a415540c\") " pod="openshift-marketplace/redhat-marketplace-bmc5d" Feb 20 10:09:56 crc kubenswrapper[4962]: I0220 10:09:56.763912 4962 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-qm84x\" (UniqueName: \"kubernetes.io/projected/7f794364-dcf5-4d81-9edd-69f7a415540c-kube-api-access-qm84x\") pod \"redhat-marketplace-bmc5d\" (UID: \"7f794364-dcf5-4d81-9edd-69f7a415540c\") " pod="openshift-marketplace/redhat-marketplace-bmc5d" Feb 20 10:09:56 crc kubenswrapper[4962]: I0220 10:09:56.892214 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bmc5d" Feb 20 10:09:57 crc kubenswrapper[4962]: I0220 10:09:57.148034 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3bc0b8f-a3b2-4549-aa20-dc609d7965fd" path="/var/lib/kubelet/pods/c3bc0b8f-a3b2-4549-aa20-dc609d7965fd/volumes" Feb 20 10:09:57 crc kubenswrapper[4962]: I0220 10:09:57.162508 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bmc5d"] Feb 20 10:09:57 crc kubenswrapper[4962]: W0220 10:09:57.167247 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f794364_dcf5_4d81_9edd_69f7a415540c.slice/crio-1b7e213f7e180d7ddad591828ff5c286e7b32af8039116fdf69cf90007ff6bff WatchSource:0}: Error finding container 1b7e213f7e180d7ddad591828ff5c286e7b32af8039116fdf69cf90007ff6bff: Status 404 returned error can't find the container with id 1b7e213f7e180d7ddad591828ff5c286e7b32af8039116fdf69cf90007ff6bff Feb 20 10:09:57 crc kubenswrapper[4962]: I0220 10:09:57.451115 4962 generic.go:334] "Generic (PLEG): container finished" podID="7f794364-dcf5-4d81-9edd-69f7a415540c" containerID="6c77f9aad074c7d0979879b9e785f53f979f985cb5106a715d6af1150b39bd70" exitCode=0 Feb 20 10:09:57 crc kubenswrapper[4962]: I0220 10:09:57.451238 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bmc5d" event={"ID":"7f794364-dcf5-4d81-9edd-69f7a415540c","Type":"ContainerDied","Data":"6c77f9aad074c7d0979879b9e785f53f979f985cb5106a715d6af1150b39bd70"} Feb 20 10:09:57 crc kubenswrapper[4962]: I0220 10:09:57.451587 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bmc5d" event={"ID":"7f794364-dcf5-4d81-9edd-69f7a415540c","Type":"ContainerStarted","Data":"1b7e213f7e180d7ddad591828ff5c286e7b32af8039116fdf69cf90007ff6bff"} Feb 20 10:09:58 crc kubenswrapper[4962]: I0220 10:09:58.463017 4962 generic.go:334] "Generic (PLEG): container finished" podID="7f794364-dcf5-4d81-9edd-69f7a415540c" containerID="da7c55ed4910c6b3b219ba59d6689e76154634f3c86bff6e54e098f8efa2d5b8" exitCode=0 Feb 20 10:09:58 crc kubenswrapper[4962]: I0220 10:09:58.463082 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bmc5d" event={"ID":"7f794364-dcf5-4d81-9edd-69f7a415540c","Type":"ContainerDied","Data":"da7c55ed4910c6b3b219ba59d6689e76154634f3c86bff6e54e098f8efa2d5b8"} Feb 20 10:09:59 crc kubenswrapper[4962]: I0220 10:09:59.479020 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bmc5d" event={"ID":"7f794364-dcf5-4d81-9edd-69f7a415540c","Type":"ContainerStarted","Data":"0234d320c25abd6ff2d0801e4880e1a5ae71776ca5bfec75242e1d30a7b437a5"} Feb 20 10:09:59 crc kubenswrapper[4962]: I0220 10:09:59.512413 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bmc5d" podStartSLOduration=2.099817244 podStartE2EDuration="3.512377784s" podCreationTimestamp="2026-02-20 10:09:56 +0000 UTC" 
firstStartedPulling="2026-02-20 10:09:57.45411534 +0000 UTC m=+889.036587186" lastFinishedPulling="2026-02-20 10:09:58.86667585 +0000 UTC m=+890.449147726" observedRunningTime="2026-02-20 10:09:59.505983162 +0000 UTC m=+891.088455048" watchObservedRunningTime="2026-02-20 10:09:59.512377784 +0000 UTC m=+891.094849670" Feb 20 10:10:03 crc kubenswrapper[4962]: I0220 10:10:03.770371 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-q557r"] Feb 20 10:10:03 crc kubenswrapper[4962]: I0220 10:10:03.774231 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-q557r" Feb 20 10:10:03 crc kubenswrapper[4962]: I0220 10:10:03.788471 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-q557r"] Feb 20 10:10:03 crc kubenswrapper[4962]: I0220 10:10:03.888833 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05518aab-48c4-4826-89d9-080858755a80-catalog-content\") pod \"certified-operators-q557r\" (UID: \"05518aab-48c4-4826-89d9-080858755a80\") " pod="openshift-marketplace/certified-operators-q557r" Feb 20 10:10:03 crc kubenswrapper[4962]: I0220 10:10:03.888903 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05518aab-48c4-4826-89d9-080858755a80-utilities\") pod \"certified-operators-q557r\" (UID: \"05518aab-48c4-4826-89d9-080858755a80\") " pod="openshift-marketplace/certified-operators-q557r" Feb 20 10:10:03 crc kubenswrapper[4962]: I0220 10:10:03.888949 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtrc7\" (UniqueName: \"kubernetes.io/projected/05518aab-48c4-4826-89d9-080858755a80-kube-api-access-mtrc7\") pod \"certified-operators-q557r\" (UID: \"05518aab-48c4-4826-89d9-080858755a80\") " pod="openshift-marketplace/certified-operators-q557r" Feb 20 10:10:03 crc kubenswrapper[4962]: I0220 10:10:03.991699 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05518aab-48c4-4826-89d9-080858755a80-catalog-content\") pod \"certified-operators-q557r\" (UID: \"05518aab-48c4-4826-89d9-080858755a80\") " pod="openshift-marketplace/certified-operators-q557r" Feb 20 10:10:03 crc kubenswrapper[4962]: I0220 10:10:03.991784 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05518aab-48c4-4826-89d9-080858755a80-utilities\") pod \"certified-operators-q557r\" (UID: \"05518aab-48c4-4826-89d9-080858755a80\") " pod="openshift-marketplace/certified-operators-q557r" Feb 20 10:10:03 crc kubenswrapper[4962]: I0220 10:10:03.991832 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtrc7\" (UniqueName: \"kubernetes.io/projected/05518aab-48c4-4826-89d9-080858755a80-kube-api-access-mtrc7\") pod \"certified-operators-q557r\" (UID: \"05518aab-48c4-4826-89d9-080858755a80\") " pod="openshift-marketplace/certified-operators-q557r" Feb 20 10:10:03 crc kubenswrapper[4962]: I0220 10:10:03.992611 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05518aab-48c4-4826-89d9-080858755a80-utilities\") pod 
\"certified-operators-q557r\" (UID: \"05518aab-48c4-4826-89d9-080858755a80\") " pod="openshift-marketplace/certified-operators-q557r" Feb 20 10:10:03 crc kubenswrapper[4962]: I0220 10:10:03.992613 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05518aab-48c4-4826-89d9-080858755a80-catalog-content\") pod \"certified-operators-q557r\" (UID: \"05518aab-48c4-4826-89d9-080858755a80\") " pod="openshift-marketplace/certified-operators-q557r" Feb 20 10:10:04 crc kubenswrapper[4962]: I0220 10:10:04.025620 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtrc7\" (UniqueName: \"kubernetes.io/projected/05518aab-48c4-4826-89d9-080858755a80-kube-api-access-mtrc7\") pod \"certified-operators-q557r\" (UID: \"05518aab-48c4-4826-89d9-080858755a80\") " pod="openshift-marketplace/certified-operators-q557r" Feb 20 10:10:04 crc kubenswrapper[4962]: I0220 10:10:04.108755 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-q557r" Feb 20 10:10:04 crc kubenswrapper[4962]: I0220 10:10:04.450640 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-q557r"] Feb 20 10:10:04 crc kubenswrapper[4962]: I0220 10:10:04.527238 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q557r" event={"ID":"05518aab-48c4-4826-89d9-080858755a80","Type":"ContainerStarted","Data":"798745ded523c335a02cd6d817703b0235b24b33fe3407309e3c81f693c97266"} Feb 20 10:10:05 crc kubenswrapper[4962]: I0220 10:10:05.125127 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-t9zxk" Feb 20 10:10:05 crc kubenswrapper[4962]: I0220 10:10:05.125190 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-t9zxk" Feb 20 10:10:05 crc kubenswrapper[4962]: I0220 10:10:05.165888 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-t9zxk" Feb 20 10:10:05 crc kubenswrapper[4962]: I0220 10:10:05.534913 4962 generic.go:334] "Generic (PLEG): container finished" podID="05518aab-48c4-4826-89d9-080858755a80" containerID="3ddabf3a8ceca050c0560ce3a870e21bac05d2d3efefbcdadbc80bdfcaa30e76" exitCode=0 Feb 20 10:10:05 crc kubenswrapper[4962]: I0220 10:10:05.535000 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q557r" event={"ID":"05518aab-48c4-4826-89d9-080858755a80","Type":"ContainerDied","Data":"3ddabf3a8ceca050c0560ce3a870e21bac05d2d3efefbcdadbc80bdfcaa30e76"} Feb 20 10:10:05 crc kubenswrapper[4962]: I0220 10:10:05.570447 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-t9zxk" Feb 20 10:10:06 crc kubenswrapper[4962]: I0220 10:10:06.542640 4962 generic.go:334] "Generic (PLEG): container finished" podID="05518aab-48c4-4826-89d9-080858755a80" containerID="d24dac6dc4bdbb2930fde288ac1d991b38456e83a7f368939dc9d231250338fc" exitCode=0 Feb 20 10:10:06 crc kubenswrapper[4962]: I0220 10:10:06.542686 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q557r" event={"ID":"05518aab-48c4-4826-89d9-080858755a80","Type":"ContainerDied","Data":"d24dac6dc4bdbb2930fde288ac1d991b38456e83a7f368939dc9d231250338fc"} Feb 20 
10:10:06 crc kubenswrapper[4962]: I0220 10:10:06.892537 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bmc5d" Feb 20 10:10:06 crc kubenswrapper[4962]: I0220 10:10:06.892856 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bmc5d" Feb 20 10:10:06 crc kubenswrapper[4962]: I0220 10:10:06.969341 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bmc5d" Feb 20 10:10:07 crc kubenswrapper[4962]: I0220 10:10:07.553821 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q557r" event={"ID":"05518aab-48c4-4826-89d9-080858755a80","Type":"ContainerStarted","Data":"df112fccab1113ad1acffa6a56df4a3481e97c74c1053e58daaeed37e468b13e"} Feb 20 10:10:07 crc kubenswrapper[4962]: I0220 10:10:07.573782 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-q557r" podStartSLOduration=3.11367405 podStartE2EDuration="4.573760513s" podCreationTimestamp="2026-02-20 10:10:03 +0000 UTC" firstStartedPulling="2026-02-20 10:10:05.538165027 +0000 UTC m=+897.120636873" lastFinishedPulling="2026-02-20 10:10:06.99825148 +0000 UTC m=+898.580723336" observedRunningTime="2026-02-20 10:10:07.57368438 +0000 UTC m=+899.156156246" watchObservedRunningTime="2026-02-20 10:10:07.573760513 +0000 UTC m=+899.156232359" Feb 20 10:10:07 crc kubenswrapper[4962]: I0220 10:10:07.627026 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bmc5d" Feb 20 10:10:09 crc kubenswrapper[4962]: I0220 10:10:09.552397 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bmc5d"] Feb 20 10:10:10 crc kubenswrapper[4962]: I0220 10:10:10.573765 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bmc5d" podUID="7f794364-dcf5-4d81-9edd-69f7a415540c" containerName="registry-server" containerID="cri-o://0234d320c25abd6ff2d0801e4880e1a5ae71776ca5bfec75242e1d30a7b437a5" gracePeriod=2 Feb 20 10:10:11 crc kubenswrapper[4962]: I0220 10:10:11.040707 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bmc5d" Feb 20 10:10:11 crc kubenswrapper[4962]: I0220 10:10:11.141483 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f794364-dcf5-4d81-9edd-69f7a415540c-utilities\") pod \"7f794364-dcf5-4d81-9edd-69f7a415540c\" (UID: \"7f794364-dcf5-4d81-9edd-69f7a415540c\") " Feb 20 10:10:11 crc kubenswrapper[4962]: I0220 10:10:11.141619 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f794364-dcf5-4d81-9edd-69f7a415540c-catalog-content\") pod \"7f794364-dcf5-4d81-9edd-69f7a415540c\" (UID: \"7f794364-dcf5-4d81-9edd-69f7a415540c\") " Feb 20 10:10:11 crc kubenswrapper[4962]: I0220 10:10:11.141655 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qm84x\" (UniqueName: \"kubernetes.io/projected/7f794364-dcf5-4d81-9edd-69f7a415540c-kube-api-access-qm84x\") pod \"7f794364-dcf5-4d81-9edd-69f7a415540c\" (UID: \"7f794364-dcf5-4d81-9edd-69f7a415540c\") " Feb 20 10:10:11 crc kubenswrapper[4962]: I0220 10:10:11.142878 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f794364-dcf5-4d81-9edd-69f7a415540c-utilities" (OuterVolumeSpecName: "utilities") pod "7f794364-dcf5-4d81-9edd-69f7a415540c" (UID: "7f794364-dcf5-4d81-9edd-69f7a415540c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:10:11 crc kubenswrapper[4962]: I0220 10:10:11.149802 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f794364-dcf5-4d81-9edd-69f7a415540c-kube-api-access-qm84x" (OuterVolumeSpecName: "kube-api-access-qm84x") pod "7f794364-dcf5-4d81-9edd-69f7a415540c" (UID: "7f794364-dcf5-4d81-9edd-69f7a415540c"). InnerVolumeSpecName "kube-api-access-qm84x". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:10:11 crc kubenswrapper[4962]: I0220 10:10:11.179093 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f794364-dcf5-4d81-9edd-69f7a415540c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7f794364-dcf5-4d81-9edd-69f7a415540c" (UID: "7f794364-dcf5-4d81-9edd-69f7a415540c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:10:11 crc kubenswrapper[4962]: I0220 10:10:11.243001 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f794364-dcf5-4d81-9edd-69f7a415540c-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 10:10:11 crc kubenswrapper[4962]: I0220 10:10:11.243041 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qm84x\" (UniqueName: \"kubernetes.io/projected/7f794364-dcf5-4d81-9edd-69f7a415540c-kube-api-access-qm84x\") on node \"crc\" DevicePath \"\"" Feb 20 10:10:11 crc kubenswrapper[4962]: I0220 10:10:11.243053 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f794364-dcf5-4d81-9edd-69f7a415540c-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 10:10:11 crc kubenswrapper[4962]: I0220 10:10:11.507904 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 10:10:11 crc kubenswrapper[4962]: I0220 10:10:11.507996 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 10:10:11 crc kubenswrapper[4962]: I0220 10:10:11.585564 4962 generic.go:334] "Generic (PLEG): container finished" podID="7f794364-dcf5-4d81-9edd-69f7a415540c" containerID="0234d320c25abd6ff2d0801e4880e1a5ae71776ca5bfec75242e1d30a7b437a5" exitCode=0 Feb 20 10:10:11 crc kubenswrapper[4962]: I0220 10:10:11.585731 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bmc5d" event={"ID":"7f794364-dcf5-4d81-9edd-69f7a415540c","Type":"ContainerDied","Data":"0234d320c25abd6ff2d0801e4880e1a5ae71776ca5bfec75242e1d30a7b437a5"} Feb 20 10:10:11 crc kubenswrapper[4962]: I0220 10:10:11.585809 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bmc5d" event={"ID":"7f794364-dcf5-4d81-9edd-69f7a415540c","Type":"ContainerDied","Data":"1b7e213f7e180d7ddad591828ff5c286e7b32af8039116fdf69cf90007ff6bff"} Feb 20 10:10:11 crc kubenswrapper[4962]: I0220 10:10:11.585849 4962 scope.go:117] "RemoveContainer" containerID="0234d320c25abd6ff2d0801e4880e1a5ae71776ca5bfec75242e1d30a7b437a5" Feb 20 10:10:11 crc kubenswrapper[4962]: I0220 10:10:11.586087 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bmc5d" Feb 20 10:10:11 crc kubenswrapper[4962]: I0220 10:10:11.628808 4962 scope.go:117] "RemoveContainer" containerID="da7c55ed4910c6b3b219ba59d6689e76154634f3c86bff6e54e098f8efa2d5b8" Feb 20 10:10:11 crc kubenswrapper[4962]: I0220 10:10:11.646824 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bmc5d"] Feb 20 10:10:11 crc kubenswrapper[4962]: I0220 10:10:11.658249 4962 scope.go:117] "RemoveContainer" containerID="6c77f9aad074c7d0979879b9e785f53f979f985cb5106a715d6af1150b39bd70" Feb 20 10:10:11 crc kubenswrapper[4962]: I0220 10:10:11.661328 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-bmc5d"] Feb 20 10:10:11 crc kubenswrapper[4962]: I0220 10:10:11.699356 4962 scope.go:117] "RemoveContainer" containerID="0234d320c25abd6ff2d0801e4880e1a5ae71776ca5bfec75242e1d30a7b437a5" Feb 20 10:10:11 crc kubenswrapper[4962]: E0220 10:10:11.699823 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0234d320c25abd6ff2d0801e4880e1a5ae71776ca5bfec75242e1d30a7b437a5\": container with ID starting with 0234d320c25abd6ff2d0801e4880e1a5ae71776ca5bfec75242e1d30a7b437a5 not found: ID does not exist" containerID="0234d320c25abd6ff2d0801e4880e1a5ae71776ca5bfec75242e1d30a7b437a5" Feb 20 10:10:11 crc kubenswrapper[4962]: I0220 10:10:11.699946 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0234d320c25abd6ff2d0801e4880e1a5ae71776ca5bfec75242e1d30a7b437a5"} err="failed to get container status \"0234d320c25abd6ff2d0801e4880e1a5ae71776ca5bfec75242e1d30a7b437a5\": rpc error: code = NotFound desc = could not find container \"0234d320c25abd6ff2d0801e4880e1a5ae71776ca5bfec75242e1d30a7b437a5\": container with ID starting with 0234d320c25abd6ff2d0801e4880e1a5ae71776ca5bfec75242e1d30a7b437a5 not found: ID does not exist" Feb 20 10:10:11 crc kubenswrapper[4962]: I0220 10:10:11.700039 4962 scope.go:117] "RemoveContainer" containerID="da7c55ed4910c6b3b219ba59d6689e76154634f3c86bff6e54e098f8efa2d5b8" Feb 20 10:10:11 crc kubenswrapper[4962]: E0220 10:10:11.700282 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da7c55ed4910c6b3b219ba59d6689e76154634f3c86bff6e54e098f8efa2d5b8\": container with ID starting with da7c55ed4910c6b3b219ba59d6689e76154634f3c86bff6e54e098f8efa2d5b8 not found: ID does not exist" containerID="da7c55ed4910c6b3b219ba59d6689e76154634f3c86bff6e54e098f8efa2d5b8" Feb 20 10:10:11 crc kubenswrapper[4962]: I0220 10:10:11.700358 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da7c55ed4910c6b3b219ba59d6689e76154634f3c86bff6e54e098f8efa2d5b8"} err="failed to get container status \"da7c55ed4910c6b3b219ba59d6689e76154634f3c86bff6e54e098f8efa2d5b8\": rpc error: code = NotFound desc = could not find container \"da7c55ed4910c6b3b219ba59d6689e76154634f3c86bff6e54e098f8efa2d5b8\": container with ID starting with da7c55ed4910c6b3b219ba59d6689e76154634f3c86bff6e54e098f8efa2d5b8 not found: ID does not exist" Feb 20 10:10:11 crc kubenswrapper[4962]: I0220 10:10:11.700427 4962 scope.go:117] "RemoveContainer" containerID="6c77f9aad074c7d0979879b9e785f53f979f985cb5106a715d6af1150b39bd70" Feb 20 10:10:11 crc kubenswrapper[4962]: E0220 10:10:11.700871 4962 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"6c77f9aad074c7d0979879b9e785f53f979f985cb5106a715d6af1150b39bd70\": container with ID starting with 6c77f9aad074c7d0979879b9e785f53f979f985cb5106a715d6af1150b39bd70 not found: ID does not exist" containerID="6c77f9aad074c7d0979879b9e785f53f979f985cb5106a715d6af1150b39bd70" Feb 20 10:10:11 crc kubenswrapper[4962]: I0220 10:10:11.700966 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c77f9aad074c7d0979879b9e785f53f979f985cb5106a715d6af1150b39bd70"} err="failed to get container status \"6c77f9aad074c7d0979879b9e785f53f979f985cb5106a715d6af1150b39bd70\": rpc error: code = NotFound desc = could not find container \"6c77f9aad074c7d0979879b9e785f53f979f985cb5106a715d6af1150b39bd70\": container with ID starting with 6c77f9aad074c7d0979879b9e785f53f979f985cb5106a715d6af1150b39bd70 not found: ID does not exist" Feb 20 10:10:13 crc kubenswrapper[4962]: I0220 10:10:13.156300 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f794364-dcf5-4d81-9edd-69f7a415540c" path="/var/lib/kubelet/pods/7f794364-dcf5-4d81-9edd-69f7a415540c/volumes" Feb 20 10:10:14 crc kubenswrapper[4962]: I0220 10:10:14.109529 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-q557r" Feb 20 10:10:14 crc kubenswrapper[4962]: I0220 10:10:14.109628 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-q557r" Feb 20 10:10:14 crc kubenswrapper[4962]: I0220 10:10:14.161379 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-q557r" Feb 20 10:10:14 crc kubenswrapper[4962]: I0220 10:10:14.659244 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-q557r" Feb 20 10:10:15 crc kubenswrapper[4962]: I0220 10:10:15.151375 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-q557r"] Feb 20 10:10:16 crc kubenswrapper[4962]: I0220 10:10:16.626490 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-q557r" podUID="05518aab-48c4-4826-89d9-080858755a80" containerName="registry-server" containerID="cri-o://df112fccab1113ad1acffa6a56df4a3481e97c74c1053e58daaeed37e468b13e" gracePeriod=2 Feb 20 10:10:17 crc kubenswrapper[4962]: I0220 10:10:17.204750 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-q557r" Feb 20 10:10:17 crc kubenswrapper[4962]: I0220 10:10:17.353525 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtrc7\" (UniqueName: \"kubernetes.io/projected/05518aab-48c4-4826-89d9-080858755a80-kube-api-access-mtrc7\") pod \"05518aab-48c4-4826-89d9-080858755a80\" (UID: \"05518aab-48c4-4826-89d9-080858755a80\") " Feb 20 10:10:17 crc kubenswrapper[4962]: I0220 10:10:17.353620 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05518aab-48c4-4826-89d9-080858755a80-catalog-content\") pod \"05518aab-48c4-4826-89d9-080858755a80\" (UID: \"05518aab-48c4-4826-89d9-080858755a80\") " Feb 20 10:10:17 crc kubenswrapper[4962]: I0220 10:10:17.353666 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05518aab-48c4-4826-89d9-080858755a80-utilities\") pod \"05518aab-48c4-4826-89d9-080858755a80\" (UID: \"05518aab-48c4-4826-89d9-080858755a80\") " Feb 20 10:10:17 crc kubenswrapper[4962]: I0220 10:10:17.354932 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05518aab-48c4-4826-89d9-080858755a80-utilities" (OuterVolumeSpecName: "utilities") pod "05518aab-48c4-4826-89d9-080858755a80" (UID: "05518aab-48c4-4826-89d9-080858755a80"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:10:17 crc kubenswrapper[4962]: I0220 10:10:17.363946 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05518aab-48c4-4826-89d9-080858755a80-kube-api-access-mtrc7" (OuterVolumeSpecName: "kube-api-access-mtrc7") pod "05518aab-48c4-4826-89d9-080858755a80" (UID: "05518aab-48c4-4826-89d9-080858755a80"). InnerVolumeSpecName "kube-api-access-mtrc7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:10:17 crc kubenswrapper[4962]: I0220 10:10:17.455578 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mtrc7\" (UniqueName: \"kubernetes.io/projected/05518aab-48c4-4826-89d9-080858755a80-kube-api-access-mtrc7\") on node \"crc\" DevicePath \"\"" Feb 20 10:10:17 crc kubenswrapper[4962]: I0220 10:10:17.455647 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05518aab-48c4-4826-89d9-080858755a80-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 10:10:17 crc kubenswrapper[4962]: I0220 10:10:17.644356 4962 generic.go:334] "Generic (PLEG): container finished" podID="05518aab-48c4-4826-89d9-080858755a80" containerID="df112fccab1113ad1acffa6a56df4a3481e97c74c1053e58daaeed37e468b13e" exitCode=0 Feb 20 10:10:17 crc kubenswrapper[4962]: I0220 10:10:17.644429 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q557r" event={"ID":"05518aab-48c4-4826-89d9-080858755a80","Type":"ContainerDied","Data":"df112fccab1113ad1acffa6a56df4a3481e97c74c1053e58daaeed37e468b13e"} Feb 20 10:10:17 crc kubenswrapper[4962]: I0220 10:10:17.644472 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q557r" event={"ID":"05518aab-48c4-4826-89d9-080858755a80","Type":"ContainerDied","Data":"798745ded523c335a02cd6d817703b0235b24b33fe3407309e3c81f693c97266"} Feb 20 10:10:17 crc kubenswrapper[4962]: I0220 10:10:17.644528 4962 scope.go:117] "RemoveContainer" containerID="df112fccab1113ad1acffa6a56df4a3481e97c74c1053e58daaeed37e468b13e" Feb 20 10:10:17 crc kubenswrapper[4962]: I0220 10:10:17.644749 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-q557r" Feb 20 10:10:17 crc kubenswrapper[4962]: I0220 10:10:17.678891 4962 scope.go:117] "RemoveContainer" containerID="d24dac6dc4bdbb2930fde288ac1d991b38456e83a7f368939dc9d231250338fc" Feb 20 10:10:17 crc kubenswrapper[4962]: I0220 10:10:17.714279 4962 scope.go:117] "RemoveContainer" containerID="3ddabf3a8ceca050c0560ce3a870e21bac05d2d3efefbcdadbc80bdfcaa30e76" Feb 20 10:10:17 crc kubenswrapper[4962]: I0220 10:10:17.742729 4962 scope.go:117] "RemoveContainer" containerID="df112fccab1113ad1acffa6a56df4a3481e97c74c1053e58daaeed37e468b13e" Feb 20 10:10:17 crc kubenswrapper[4962]: E0220 10:10:17.743612 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df112fccab1113ad1acffa6a56df4a3481e97c74c1053e58daaeed37e468b13e\": container with ID starting with df112fccab1113ad1acffa6a56df4a3481e97c74c1053e58daaeed37e468b13e not found: ID does not exist" containerID="df112fccab1113ad1acffa6a56df4a3481e97c74c1053e58daaeed37e468b13e" Feb 20 10:10:17 crc kubenswrapper[4962]: I0220 10:10:17.743670 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df112fccab1113ad1acffa6a56df4a3481e97c74c1053e58daaeed37e468b13e"} err="failed to get container status \"df112fccab1113ad1acffa6a56df4a3481e97c74c1053e58daaeed37e468b13e\": rpc error: code = NotFound desc = could not find container \"df112fccab1113ad1acffa6a56df4a3481e97c74c1053e58daaeed37e468b13e\": container with ID starting with df112fccab1113ad1acffa6a56df4a3481e97c74c1053e58daaeed37e468b13e not found: ID does not exist" Feb 20 10:10:17 crc kubenswrapper[4962]: I0220 10:10:17.743709 4962 scope.go:117] "RemoveContainer" containerID="d24dac6dc4bdbb2930fde288ac1d991b38456e83a7f368939dc9d231250338fc" Feb 20 10:10:17 crc kubenswrapper[4962]: E0220 10:10:17.744437 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d24dac6dc4bdbb2930fde288ac1d991b38456e83a7f368939dc9d231250338fc\": container with ID starting with d24dac6dc4bdbb2930fde288ac1d991b38456e83a7f368939dc9d231250338fc not found: ID does not exist" containerID="d24dac6dc4bdbb2930fde288ac1d991b38456e83a7f368939dc9d231250338fc" Feb 20 10:10:17 crc kubenswrapper[4962]: I0220 10:10:17.744577 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d24dac6dc4bdbb2930fde288ac1d991b38456e83a7f368939dc9d231250338fc"} err="failed to get container status \"d24dac6dc4bdbb2930fde288ac1d991b38456e83a7f368939dc9d231250338fc\": rpc error: code = NotFound desc = could not find container \"d24dac6dc4bdbb2930fde288ac1d991b38456e83a7f368939dc9d231250338fc\": container with ID starting with d24dac6dc4bdbb2930fde288ac1d991b38456e83a7f368939dc9d231250338fc not found: ID does not exist" Feb 20 10:10:17 crc kubenswrapper[4962]: I0220 10:10:17.744678 4962 scope.go:117] "RemoveContainer" containerID="3ddabf3a8ceca050c0560ce3a870e21bac05d2d3efefbcdadbc80bdfcaa30e76" Feb 20 10:10:17 crc kubenswrapper[4962]: E0220 10:10:17.745878 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ddabf3a8ceca050c0560ce3a870e21bac05d2d3efefbcdadbc80bdfcaa30e76\": container with ID starting with 3ddabf3a8ceca050c0560ce3a870e21bac05d2d3efefbcdadbc80bdfcaa30e76 not found: ID does not exist" containerID="3ddabf3a8ceca050c0560ce3a870e21bac05d2d3efefbcdadbc80bdfcaa30e76" 
Feb 20 10:10:17 crc kubenswrapper[4962]: I0220 10:10:17.745950 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ddabf3a8ceca050c0560ce3a870e21bac05d2d3efefbcdadbc80bdfcaa30e76"} err="failed to get container status \"3ddabf3a8ceca050c0560ce3a870e21bac05d2d3efefbcdadbc80bdfcaa30e76\": rpc error: code = NotFound desc = could not find container \"3ddabf3a8ceca050c0560ce3a870e21bac05d2d3efefbcdadbc80bdfcaa30e76\": container with ID starting with 3ddabf3a8ceca050c0560ce3a870e21bac05d2d3efefbcdadbc80bdfcaa30e76 not found: ID does not exist" Feb 20 10:10:18 crc kubenswrapper[4962]: I0220 10:10:18.143869 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05518aab-48c4-4826-89d9-080858755a80-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "05518aab-48c4-4826-89d9-080858755a80" (UID: "05518aab-48c4-4826-89d9-080858755a80"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:10:18 crc kubenswrapper[4962]: I0220 10:10:18.168044 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05518aab-48c4-4826-89d9-080858755a80-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 10:10:18 crc kubenswrapper[4962]: I0220 10:10:18.287653 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-q557r"] Feb 20 10:10:18 crc kubenswrapper[4962]: I0220 10:10:18.291362 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-q557r"] Feb 20 10:10:19 crc kubenswrapper[4962]: I0220 10:10:19.149709 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05518aab-48c4-4826-89d9-080858755a80" path="/var/lib/kubelet/pods/05518aab-48c4-4826-89d9-080858755a80/volumes" Feb 20 10:10:19 crc kubenswrapper[4962]: I0220 10:10:19.617976 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79678nmbg"] Feb 20 10:10:19 crc kubenswrapper[4962]: E0220 10:10:19.618496 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f794364-dcf5-4d81-9edd-69f7a415540c" containerName="extract-content" Feb 20 10:10:19 crc kubenswrapper[4962]: I0220 10:10:19.618508 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f794364-dcf5-4d81-9edd-69f7a415540c" containerName="extract-content" Feb 20 10:10:19 crc kubenswrapper[4962]: E0220 10:10:19.618519 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f794364-dcf5-4d81-9edd-69f7a415540c" containerName="registry-server" Feb 20 10:10:19 crc kubenswrapper[4962]: I0220 10:10:19.618538 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f794364-dcf5-4d81-9edd-69f7a415540c" containerName="registry-server" Feb 20 10:10:19 crc kubenswrapper[4962]: E0220 10:10:19.618549 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05518aab-48c4-4826-89d9-080858755a80" containerName="extract-utilities" Feb 20 10:10:19 crc kubenswrapper[4962]: I0220 10:10:19.618556 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="05518aab-48c4-4826-89d9-080858755a80" containerName="extract-utilities" Feb 20 10:10:19 crc kubenswrapper[4962]: E0220 10:10:19.618567 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f794364-dcf5-4d81-9edd-69f7a415540c" containerName="extract-utilities" Feb 20 10:10:19 crc kubenswrapper[4962]: 
I0220 10:10:19.618572 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f794364-dcf5-4d81-9edd-69f7a415540c" containerName="extract-utilities" Feb 20 10:10:19 crc kubenswrapper[4962]: E0220 10:10:19.618583 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05518aab-48c4-4826-89d9-080858755a80" containerName="extract-content" Feb 20 10:10:19 crc kubenswrapper[4962]: I0220 10:10:19.618604 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="05518aab-48c4-4826-89d9-080858755a80" containerName="extract-content" Feb 20 10:10:19 crc kubenswrapper[4962]: E0220 10:10:19.618614 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05518aab-48c4-4826-89d9-080858755a80" containerName="registry-server" Feb 20 10:10:19 crc kubenswrapper[4962]: I0220 10:10:19.618620 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="05518aab-48c4-4826-89d9-080858755a80" containerName="registry-server" Feb 20 10:10:19 crc kubenswrapper[4962]: I0220 10:10:19.618859 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f794364-dcf5-4d81-9edd-69f7a415540c" containerName="registry-server" Feb 20 10:10:19 crc kubenswrapper[4962]: I0220 10:10:19.618874 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="05518aab-48c4-4826-89d9-080858755a80" containerName="registry-server" Feb 20 10:10:19 crc kubenswrapper[4962]: I0220 10:10:19.619723 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79678nmbg" Feb 20 10:10:19 crc kubenswrapper[4962]: I0220 10:10:19.624024 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-jv45q" Feb 20 10:10:19 crc kubenswrapper[4962]: I0220 10:10:19.639655 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79678nmbg"] Feb 20 10:10:19 crc kubenswrapper[4962]: I0220 10:10:19.693013 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdrp7\" (UniqueName: \"kubernetes.io/projected/3db1f907-b4ac-45b1-9f38-93727dfde270-kube-api-access-hdrp7\") pod \"8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79678nmbg\" (UID: \"3db1f907-b4ac-45b1-9f38-93727dfde270\") " pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79678nmbg" Feb 20 10:10:19 crc kubenswrapper[4962]: I0220 10:10:19.693088 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3db1f907-b4ac-45b1-9f38-93727dfde270-bundle\") pod \"8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79678nmbg\" (UID: \"3db1f907-b4ac-45b1-9f38-93727dfde270\") " pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79678nmbg" Feb 20 10:10:19 crc kubenswrapper[4962]: I0220 10:10:19.693231 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3db1f907-b4ac-45b1-9f38-93727dfde270-util\") pod \"8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79678nmbg\" (UID: \"3db1f907-b4ac-45b1-9f38-93727dfde270\") " pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79678nmbg" Feb 20 10:10:19 crc kubenswrapper[4962]: I0220 10:10:19.794832 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"util\" (UniqueName: \"kubernetes.io/empty-dir/3db1f907-b4ac-45b1-9f38-93727dfde270-util\") pod \"8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79678nmbg\" (UID: \"3db1f907-b4ac-45b1-9f38-93727dfde270\") " pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79678nmbg" Feb 20 10:10:19 crc kubenswrapper[4962]: I0220 10:10:19.794940 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdrp7\" (UniqueName: \"kubernetes.io/projected/3db1f907-b4ac-45b1-9f38-93727dfde270-kube-api-access-hdrp7\") pod \"8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79678nmbg\" (UID: \"3db1f907-b4ac-45b1-9f38-93727dfde270\") " pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79678nmbg" Feb 20 10:10:19 crc kubenswrapper[4962]: I0220 10:10:19.795003 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3db1f907-b4ac-45b1-9f38-93727dfde270-bundle\") pod \"8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79678nmbg\" (UID: \"3db1f907-b4ac-45b1-9f38-93727dfde270\") " pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79678nmbg" Feb 20 10:10:19 crc kubenswrapper[4962]: I0220 10:10:19.795672 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3db1f907-b4ac-45b1-9f38-93727dfde270-bundle\") pod \"8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79678nmbg\" (UID: \"3db1f907-b4ac-45b1-9f38-93727dfde270\") " pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79678nmbg" Feb 20 10:10:19 crc kubenswrapper[4962]: I0220 10:10:19.795904 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3db1f907-b4ac-45b1-9f38-93727dfde270-util\") pod \"8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79678nmbg\" (UID: \"3db1f907-b4ac-45b1-9f38-93727dfde270\") " pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79678nmbg" Feb 20 10:10:19 crc kubenswrapper[4962]: I0220 10:10:19.821102 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdrp7\" (UniqueName: \"kubernetes.io/projected/3db1f907-b4ac-45b1-9f38-93727dfde270-kube-api-access-hdrp7\") pod \"8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79678nmbg\" (UID: \"3db1f907-b4ac-45b1-9f38-93727dfde270\") " pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79678nmbg" Feb 20 10:10:19 crc kubenswrapper[4962]: I0220 10:10:19.941222 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79678nmbg" Feb 20 10:10:20 crc kubenswrapper[4962]: I0220 10:10:20.401218 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79678nmbg"] Feb 20 10:10:20 crc kubenswrapper[4962]: W0220 10:10:20.407179 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3db1f907_b4ac_45b1_9f38_93727dfde270.slice/crio-cea4057f785f9404912e3d0af7ea54f26d4046e199b222c808dadd82217cb460 WatchSource:0}: Error finding container cea4057f785f9404912e3d0af7ea54f26d4046e199b222c808dadd82217cb460: Status 404 returned error can't find the container with id cea4057f785f9404912e3d0af7ea54f26d4046e199b222c808dadd82217cb460 Feb 20 10:10:20 crc kubenswrapper[4962]: I0220 10:10:20.708184 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79678nmbg" event={"ID":"3db1f907-b4ac-45b1-9f38-93727dfde270","Type":"ContainerStarted","Data":"835f023c2b9927593635cc48239ac460d0bfa52240f057d6beb80de9047703e6"} Feb 20 10:10:20 crc kubenswrapper[4962]: I0220 10:10:20.708235 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79678nmbg" event={"ID":"3db1f907-b4ac-45b1-9f38-93727dfde270","Type":"ContainerStarted","Data":"cea4057f785f9404912e3d0af7ea54f26d4046e199b222c808dadd82217cb460"} Feb 20 10:10:21 crc kubenswrapper[4962]: I0220 10:10:21.362672 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-f74q5"] Feb 20 10:10:21 crc kubenswrapper[4962]: I0220 10:10:21.366030 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f74q5" Feb 20 10:10:21 crc kubenswrapper[4962]: I0220 10:10:21.373341 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f74q5"] Feb 20 10:10:21 crc kubenswrapper[4962]: I0220 10:10:21.419631 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bea00b9c-e00f-4cec-b1bf-9955dd868c9c-catalog-content\") pod \"community-operators-f74q5\" (UID: \"bea00b9c-e00f-4cec-b1bf-9955dd868c9c\") " pod="openshift-marketplace/community-operators-f74q5" Feb 20 10:10:21 crc kubenswrapper[4962]: I0220 10:10:21.419685 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqb4j\" (UniqueName: \"kubernetes.io/projected/bea00b9c-e00f-4cec-b1bf-9955dd868c9c-kube-api-access-pqb4j\") pod \"community-operators-f74q5\" (UID: \"bea00b9c-e00f-4cec-b1bf-9955dd868c9c\") " pod="openshift-marketplace/community-operators-f74q5" Feb 20 10:10:21 crc kubenswrapper[4962]: I0220 10:10:21.419721 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bea00b9c-e00f-4cec-b1bf-9955dd868c9c-utilities\") pod \"community-operators-f74q5\" (UID: \"bea00b9c-e00f-4cec-b1bf-9955dd868c9c\") " pod="openshift-marketplace/community-operators-f74q5" Feb 20 10:10:21 crc kubenswrapper[4962]: I0220 10:10:21.521476 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqb4j\" (UniqueName: \"kubernetes.io/projected/bea00b9c-e00f-4cec-b1bf-9955dd868c9c-kube-api-access-pqb4j\") pod \"community-operators-f74q5\" (UID: \"bea00b9c-e00f-4cec-b1bf-9955dd868c9c\") " pod="openshift-marketplace/community-operators-f74q5" Feb 20 10:10:21 crc kubenswrapper[4962]: I0220 10:10:21.521524 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bea00b9c-e00f-4cec-b1bf-9955dd868c9c-catalog-content\") pod \"community-operators-f74q5\" (UID: \"bea00b9c-e00f-4cec-b1bf-9955dd868c9c\") " pod="openshift-marketplace/community-operators-f74q5" Feb 20 10:10:21 crc kubenswrapper[4962]: I0220 10:10:21.521554 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bea00b9c-e00f-4cec-b1bf-9955dd868c9c-utilities\") pod \"community-operators-f74q5\" (UID: \"bea00b9c-e00f-4cec-b1bf-9955dd868c9c\") " pod="openshift-marketplace/community-operators-f74q5" Feb 20 10:10:21 crc kubenswrapper[4962]: I0220 10:10:21.522066 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bea00b9c-e00f-4cec-b1bf-9955dd868c9c-utilities\") pod \"community-operators-f74q5\" (UID: \"bea00b9c-e00f-4cec-b1bf-9955dd868c9c\") " pod="openshift-marketplace/community-operators-f74q5" Feb 20 10:10:21 crc kubenswrapper[4962]: I0220 10:10:21.522630 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bea00b9c-e00f-4cec-b1bf-9955dd868c9c-catalog-content\") pod \"community-operators-f74q5\" (UID: \"bea00b9c-e00f-4cec-b1bf-9955dd868c9c\") " pod="openshift-marketplace/community-operators-f74q5" Feb 20 10:10:21 crc kubenswrapper[4962]: I0220 10:10:21.558916 4962 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-pqb4j\" (UniqueName: \"kubernetes.io/projected/bea00b9c-e00f-4cec-b1bf-9955dd868c9c-kube-api-access-pqb4j\") pod \"community-operators-f74q5\" (UID: \"bea00b9c-e00f-4cec-b1bf-9955dd868c9c\") " pod="openshift-marketplace/community-operators-f74q5" Feb 20 10:10:21 crc kubenswrapper[4962]: I0220 10:10:21.689041 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f74q5" Feb 20 10:10:21 crc kubenswrapper[4962]: I0220 10:10:21.715347 4962 generic.go:334] "Generic (PLEG): container finished" podID="3db1f907-b4ac-45b1-9f38-93727dfde270" containerID="835f023c2b9927593635cc48239ac460d0bfa52240f057d6beb80de9047703e6" exitCode=0 Feb 20 10:10:21 crc kubenswrapper[4962]: I0220 10:10:21.715499 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79678nmbg" event={"ID":"3db1f907-b4ac-45b1-9f38-93727dfde270","Type":"ContainerDied","Data":"835f023c2b9927593635cc48239ac460d0bfa52240f057d6beb80de9047703e6"} Feb 20 10:10:22 crc kubenswrapper[4962]: I0220 10:10:22.231777 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f74q5"] Feb 20 10:10:22 crc kubenswrapper[4962]: I0220 10:10:22.722327 4962 generic.go:334] "Generic (PLEG): container finished" podID="bea00b9c-e00f-4cec-b1bf-9955dd868c9c" containerID="be29628502f44ad742fdea9b89cfd76dc2d24a07de3ceac73a86bb38b030d055" exitCode=0 Feb 20 10:10:22 crc kubenswrapper[4962]: I0220 10:10:22.722384 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f74q5" event={"ID":"bea00b9c-e00f-4cec-b1bf-9955dd868c9c","Type":"ContainerDied","Data":"be29628502f44ad742fdea9b89cfd76dc2d24a07de3ceac73a86bb38b030d055"} Feb 20 10:10:22 crc kubenswrapper[4962]: I0220 10:10:22.723863 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f74q5" event={"ID":"bea00b9c-e00f-4cec-b1bf-9955dd868c9c","Type":"ContainerStarted","Data":"b0bd9dbd43b0d997d93a5c2fdb222f2ed322bda2a2ba98ad81b098c65e32686b"} Feb 20 10:10:22 crc kubenswrapper[4962]: I0220 10:10:22.725658 4962 generic.go:334] "Generic (PLEG): container finished" podID="3db1f907-b4ac-45b1-9f38-93727dfde270" containerID="88dafe47aab03aec8975185374016013491b0dc14ba99fb3f5d221f0618853e5" exitCode=0 Feb 20 10:10:22 crc kubenswrapper[4962]: I0220 10:10:22.725691 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79678nmbg" event={"ID":"3db1f907-b4ac-45b1-9f38-93727dfde270","Type":"ContainerDied","Data":"88dafe47aab03aec8975185374016013491b0dc14ba99fb3f5d221f0618853e5"} Feb 20 10:10:23 crc kubenswrapper[4962]: I0220 10:10:23.738738 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f74q5" event={"ID":"bea00b9c-e00f-4cec-b1bf-9955dd868c9c","Type":"ContainerStarted","Data":"4111992f8db5a1e1f632ae5114b9cac03a3e43e48caa50440e306831e89af536"} Feb 20 10:10:23 crc kubenswrapper[4962]: I0220 10:10:23.746917 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79678nmbg" event={"ID":"3db1f907-b4ac-45b1-9f38-93727dfde270","Type":"ContainerDied","Data":"304066fbfec48ab67b477f579d2649c382a5c17b23511952ed6af8766db7a80c"} Feb 20 10:10:23 crc kubenswrapper[4962]: I0220 10:10:23.746743 4962 
generic.go:334] "Generic (PLEG): container finished" podID="3db1f907-b4ac-45b1-9f38-93727dfde270" containerID="304066fbfec48ab67b477f579d2649c382a5c17b23511952ed6af8766db7a80c" exitCode=0 Feb 20 10:10:24 crc kubenswrapper[4962]: I0220 10:10:24.757784 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f74q5" event={"ID":"bea00b9c-e00f-4cec-b1bf-9955dd868c9c","Type":"ContainerDied","Data":"4111992f8db5a1e1f632ae5114b9cac03a3e43e48caa50440e306831e89af536"} Feb 20 10:10:24 crc kubenswrapper[4962]: I0220 10:10:24.757674 4962 generic.go:334] "Generic (PLEG): container finished" podID="bea00b9c-e00f-4cec-b1bf-9955dd868c9c" containerID="4111992f8db5a1e1f632ae5114b9cac03a3e43e48caa50440e306831e89af536" exitCode=0 Feb 20 10:10:25 crc kubenswrapper[4962]: I0220 10:10:25.118116 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79678nmbg" Feb 20 10:10:25 crc kubenswrapper[4962]: I0220 10:10:25.178145 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hdrp7\" (UniqueName: \"kubernetes.io/projected/3db1f907-b4ac-45b1-9f38-93727dfde270-kube-api-access-hdrp7\") pod \"3db1f907-b4ac-45b1-9f38-93727dfde270\" (UID: \"3db1f907-b4ac-45b1-9f38-93727dfde270\") " Feb 20 10:10:25 crc kubenswrapper[4962]: I0220 10:10:25.178189 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3db1f907-b4ac-45b1-9f38-93727dfde270-util\") pod \"3db1f907-b4ac-45b1-9f38-93727dfde270\" (UID: \"3db1f907-b4ac-45b1-9f38-93727dfde270\") " Feb 20 10:10:25 crc kubenswrapper[4962]: I0220 10:10:25.178254 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3db1f907-b4ac-45b1-9f38-93727dfde270-bundle\") pod \"3db1f907-b4ac-45b1-9f38-93727dfde270\" (UID: \"3db1f907-b4ac-45b1-9f38-93727dfde270\") " Feb 20 10:10:25 crc kubenswrapper[4962]: I0220 10:10:25.180387 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3db1f907-b4ac-45b1-9f38-93727dfde270-bundle" (OuterVolumeSpecName: "bundle") pod "3db1f907-b4ac-45b1-9f38-93727dfde270" (UID: "3db1f907-b4ac-45b1-9f38-93727dfde270"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:10:25 crc kubenswrapper[4962]: I0220 10:10:25.184806 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3db1f907-b4ac-45b1-9f38-93727dfde270-kube-api-access-hdrp7" (OuterVolumeSpecName: "kube-api-access-hdrp7") pod "3db1f907-b4ac-45b1-9f38-93727dfde270" (UID: "3db1f907-b4ac-45b1-9f38-93727dfde270"). InnerVolumeSpecName "kube-api-access-hdrp7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:10:25 crc kubenswrapper[4962]: I0220 10:10:25.191850 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3db1f907-b4ac-45b1-9f38-93727dfde270-util" (OuterVolumeSpecName: "util") pod "3db1f907-b4ac-45b1-9f38-93727dfde270" (UID: "3db1f907-b4ac-45b1-9f38-93727dfde270"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:10:25 crc kubenswrapper[4962]: I0220 10:10:25.279453 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hdrp7\" (UniqueName: \"kubernetes.io/projected/3db1f907-b4ac-45b1-9f38-93727dfde270-kube-api-access-hdrp7\") on node \"crc\" DevicePath \"\"" Feb 20 10:10:25 crc kubenswrapper[4962]: I0220 10:10:25.279495 4962 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3db1f907-b4ac-45b1-9f38-93727dfde270-util\") on node \"crc\" DevicePath \"\"" Feb 20 10:10:25 crc kubenswrapper[4962]: I0220 10:10:25.279508 4962 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3db1f907-b4ac-45b1-9f38-93727dfde270-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:10:25 crc kubenswrapper[4962]: I0220 10:10:25.771527 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79678nmbg" event={"ID":"3db1f907-b4ac-45b1-9f38-93727dfde270","Type":"ContainerDied","Data":"cea4057f785f9404912e3d0af7ea54f26d4046e199b222c808dadd82217cb460"} Feb 20 10:10:25 crc kubenswrapper[4962]: I0220 10:10:25.772008 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cea4057f785f9404912e3d0af7ea54f26d4046e199b222c808dadd82217cb460" Feb 20 10:10:25 crc kubenswrapper[4962]: I0220 10:10:25.771564 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79678nmbg" Feb 20 10:10:25 crc kubenswrapper[4962]: I0220 10:10:25.775118 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f74q5" event={"ID":"bea00b9c-e00f-4cec-b1bf-9955dd868c9c","Type":"ContainerStarted","Data":"789904d9f8123c27e15533db5d9e26eded2a7f107b9c3247b2da1c85ad4d1237"} Feb 20 10:10:25 crc kubenswrapper[4962]: I0220 10:10:25.803263 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-f74q5" podStartSLOduration=2.3540036349999998 podStartE2EDuration="4.80324475s" podCreationTimestamp="2026-02-20 10:10:21 +0000 UTC" firstStartedPulling="2026-02-20 10:10:22.723736519 +0000 UTC m=+914.306208365" lastFinishedPulling="2026-02-20 10:10:25.172977634 +0000 UTC m=+916.755449480" observedRunningTime="2026-02-20 10:10:25.800736342 +0000 UTC m=+917.383208198" watchObservedRunningTime="2026-02-20 10:10:25.80324475 +0000 UTC m=+917.385716606" Feb 20 10:10:29 crc kubenswrapper[4962]: I0220 10:10:29.726849 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-6679bf9b57-n5hm2"] Feb 20 10:10:29 crc kubenswrapper[4962]: E0220 10:10:29.727188 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3db1f907-b4ac-45b1-9f38-93727dfde270" containerName="extract" Feb 20 10:10:29 crc kubenswrapper[4962]: I0220 10:10:29.727205 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="3db1f907-b4ac-45b1-9f38-93727dfde270" containerName="extract" Feb 20 10:10:29 crc kubenswrapper[4962]: E0220 10:10:29.727221 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3db1f907-b4ac-45b1-9f38-93727dfde270" containerName="util" Feb 20 10:10:29 crc kubenswrapper[4962]: I0220 10:10:29.727229 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="3db1f907-b4ac-45b1-9f38-93727dfde270" 
containerName="util" Feb 20 10:10:29 crc kubenswrapper[4962]: E0220 10:10:29.727246 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3db1f907-b4ac-45b1-9f38-93727dfde270" containerName="pull" Feb 20 10:10:29 crc kubenswrapper[4962]: I0220 10:10:29.727255 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="3db1f907-b4ac-45b1-9f38-93727dfde270" containerName="pull" Feb 20 10:10:29 crc kubenswrapper[4962]: I0220 10:10:29.727399 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="3db1f907-b4ac-45b1-9f38-93727dfde270" containerName="extract" Feb 20 10:10:29 crc kubenswrapper[4962]: I0220 10:10:29.727919 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-n5hm2" Feb 20 10:10:29 crc kubenswrapper[4962]: I0220 10:10:29.732282 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-rw5vz" Feb 20 10:10:29 crc kubenswrapper[4962]: I0220 10:10:29.764105 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6679bf9b57-n5hm2"] Feb 20 10:10:29 crc kubenswrapper[4962]: I0220 10:10:29.848662 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrdvb\" (UniqueName: \"kubernetes.io/projected/ad363690-9ad6-4f45-ac02-d51ec41d213b-kube-api-access-xrdvb\") pod \"openstack-operator-controller-init-6679bf9b57-n5hm2\" (UID: \"ad363690-9ad6-4f45-ac02-d51ec41d213b\") " pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-n5hm2" Feb 20 10:10:29 crc kubenswrapper[4962]: I0220 10:10:29.950220 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrdvb\" (UniqueName: \"kubernetes.io/projected/ad363690-9ad6-4f45-ac02-d51ec41d213b-kube-api-access-xrdvb\") pod \"openstack-operator-controller-init-6679bf9b57-n5hm2\" (UID: \"ad363690-9ad6-4f45-ac02-d51ec41d213b\") " pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-n5hm2" Feb 20 10:10:29 crc kubenswrapper[4962]: I0220 10:10:29.975549 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrdvb\" (UniqueName: \"kubernetes.io/projected/ad363690-9ad6-4f45-ac02-d51ec41d213b-kube-api-access-xrdvb\") pod \"openstack-operator-controller-init-6679bf9b57-n5hm2\" (UID: \"ad363690-9ad6-4f45-ac02-d51ec41d213b\") " pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-n5hm2" Feb 20 10:10:30 crc kubenswrapper[4962]: I0220 10:10:30.044580 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-n5hm2" Feb 20 10:10:30 crc kubenswrapper[4962]: I0220 10:10:30.491384 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6679bf9b57-n5hm2"] Feb 20 10:10:30 crc kubenswrapper[4962]: I0220 10:10:30.809857 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-n5hm2" event={"ID":"ad363690-9ad6-4f45-ac02-d51ec41d213b","Type":"ContainerStarted","Data":"f27c2da4df93f2a4e89660a26a4a7dc15f6c65172f7b6c3b247bf33b3636e709"} Feb 20 10:10:31 crc kubenswrapper[4962]: I0220 10:10:31.689636 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-f74q5" Feb 20 10:10:31 crc kubenswrapper[4962]: I0220 10:10:31.689709 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-f74q5" Feb 20 10:10:31 crc kubenswrapper[4962]: I0220 10:10:31.748254 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-f74q5" Feb 20 10:10:31 crc kubenswrapper[4962]: I0220 10:10:31.865740 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-f74q5" Feb 20 10:10:34 crc kubenswrapper[4962]: I0220 10:10:34.353905 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-f74q5"] Feb 20 10:10:34 crc kubenswrapper[4962]: I0220 10:10:34.354832 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-f74q5" podUID="bea00b9c-e00f-4cec-b1bf-9955dd868c9c" containerName="registry-server" containerID="cri-o://789904d9f8123c27e15533db5d9e26eded2a7f107b9c3247b2da1c85ad4d1237" gracePeriod=2 Feb 20 10:10:35 crc kubenswrapper[4962]: I0220 10:10:35.422145 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f74q5" Feb 20 10:10:35 crc kubenswrapper[4962]: I0220 10:10:35.562892 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bea00b9c-e00f-4cec-b1bf-9955dd868c9c-utilities\") pod \"bea00b9c-e00f-4cec-b1bf-9955dd868c9c\" (UID: \"bea00b9c-e00f-4cec-b1bf-9955dd868c9c\") " Feb 20 10:10:35 crc kubenswrapper[4962]: I0220 10:10:35.563004 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bea00b9c-e00f-4cec-b1bf-9955dd868c9c-catalog-content\") pod \"bea00b9c-e00f-4cec-b1bf-9955dd868c9c\" (UID: \"bea00b9c-e00f-4cec-b1bf-9955dd868c9c\") " Feb 20 10:10:35 crc kubenswrapper[4962]: I0220 10:10:35.563080 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pqb4j\" (UniqueName: \"kubernetes.io/projected/bea00b9c-e00f-4cec-b1bf-9955dd868c9c-kube-api-access-pqb4j\") pod \"bea00b9c-e00f-4cec-b1bf-9955dd868c9c\" (UID: \"bea00b9c-e00f-4cec-b1bf-9955dd868c9c\") " Feb 20 10:10:35 crc kubenswrapper[4962]: I0220 10:10:35.563897 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bea00b9c-e00f-4cec-b1bf-9955dd868c9c-utilities" (OuterVolumeSpecName: "utilities") pod "bea00b9c-e00f-4cec-b1bf-9955dd868c9c" (UID: "bea00b9c-e00f-4cec-b1bf-9955dd868c9c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:10:35 crc kubenswrapper[4962]: I0220 10:10:35.579670 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bea00b9c-e00f-4cec-b1bf-9955dd868c9c-kube-api-access-pqb4j" (OuterVolumeSpecName: "kube-api-access-pqb4j") pod "bea00b9c-e00f-4cec-b1bf-9955dd868c9c" (UID: "bea00b9c-e00f-4cec-b1bf-9955dd868c9c"). InnerVolumeSpecName "kube-api-access-pqb4j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:10:35 crc kubenswrapper[4962]: I0220 10:10:35.620271 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bea00b9c-e00f-4cec-b1bf-9955dd868c9c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bea00b9c-e00f-4cec-b1bf-9955dd868c9c" (UID: "bea00b9c-e00f-4cec-b1bf-9955dd868c9c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:10:35 crc kubenswrapper[4962]: I0220 10:10:35.664909 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bea00b9c-e00f-4cec-b1bf-9955dd868c9c-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 10:10:35 crc kubenswrapper[4962]: I0220 10:10:35.664941 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bea00b9c-e00f-4cec-b1bf-9955dd868c9c-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 10:10:35 crc kubenswrapper[4962]: I0220 10:10:35.664951 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pqb4j\" (UniqueName: \"kubernetes.io/projected/bea00b9c-e00f-4cec-b1bf-9955dd868c9c-kube-api-access-pqb4j\") on node \"crc\" DevicePath \"\"" Feb 20 10:10:35 crc kubenswrapper[4962]: I0220 10:10:35.862175 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-n5hm2" event={"ID":"ad363690-9ad6-4f45-ac02-d51ec41d213b","Type":"ContainerStarted","Data":"ee0600b1964e6d09850f03e96e548348a6c0d60851d33fdd4e668f460bbd691b"} Feb 20 10:10:35 crc kubenswrapper[4962]: I0220 10:10:35.862578 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-n5hm2" Feb 20 10:10:35 crc kubenswrapper[4962]: I0220 10:10:35.864410 4962 generic.go:334] "Generic (PLEG): container finished" podID="bea00b9c-e00f-4cec-b1bf-9955dd868c9c" containerID="789904d9f8123c27e15533db5d9e26eded2a7f107b9c3247b2da1c85ad4d1237" exitCode=0 Feb 20 10:10:35 crc kubenswrapper[4962]: I0220 10:10:35.864459 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f74q5" event={"ID":"bea00b9c-e00f-4cec-b1bf-9955dd868c9c","Type":"ContainerDied","Data":"789904d9f8123c27e15533db5d9e26eded2a7f107b9c3247b2da1c85ad4d1237"} Feb 20 10:10:35 crc kubenswrapper[4962]: I0220 10:10:35.864489 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f74q5" event={"ID":"bea00b9c-e00f-4cec-b1bf-9955dd868c9c","Type":"ContainerDied","Data":"b0bd9dbd43b0d997d93a5c2fdb222f2ed322bda2a2ba98ad81b098c65e32686b"} Feb 20 10:10:35 crc kubenswrapper[4962]: I0220 10:10:35.864516 4962 scope.go:117] "RemoveContainer" containerID="789904d9f8123c27e15533db5d9e26eded2a7f107b9c3247b2da1c85ad4d1237" Feb 20 10:10:35 crc kubenswrapper[4962]: I0220 10:10:35.864729 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f74q5" Feb 20 10:10:35 crc kubenswrapper[4962]: I0220 10:10:35.887043 4962 scope.go:117] "RemoveContainer" containerID="4111992f8db5a1e1f632ae5114b9cac03a3e43e48caa50440e306831e89af536" Feb 20 10:10:35 crc kubenswrapper[4962]: I0220 10:10:35.898342 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-n5hm2" podStartSLOduration=2.220170803 podStartE2EDuration="6.898324801s" podCreationTimestamp="2026-02-20 10:10:29 +0000 UTC" firstStartedPulling="2026-02-20 10:10:30.527298062 +0000 UTC m=+922.109769928" lastFinishedPulling="2026-02-20 10:10:35.20545208 +0000 UTC m=+926.787923926" observedRunningTime="2026-02-20 10:10:35.898171527 +0000 UTC m=+927.480643373" watchObservedRunningTime="2026-02-20 10:10:35.898324801 +0000 UTC m=+927.480796647" Feb 20 10:10:35 crc kubenswrapper[4962]: I0220 10:10:35.911955 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-f74q5"] Feb 20 10:10:35 crc kubenswrapper[4962]: I0220 10:10:35.916059 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-f74q5"] Feb 20 10:10:35 crc kubenswrapper[4962]: I0220 10:10:35.919761 4962 scope.go:117] "RemoveContainer" containerID="be29628502f44ad742fdea9b89cfd76dc2d24a07de3ceac73a86bb38b030d055" Feb 20 10:10:35 crc kubenswrapper[4962]: I0220 10:10:35.933967 4962 scope.go:117] "RemoveContainer" containerID="789904d9f8123c27e15533db5d9e26eded2a7f107b9c3247b2da1c85ad4d1237" Feb 20 10:10:35 crc kubenswrapper[4962]: E0220 10:10:35.934480 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"789904d9f8123c27e15533db5d9e26eded2a7f107b9c3247b2da1c85ad4d1237\": container with ID starting with 789904d9f8123c27e15533db5d9e26eded2a7f107b9c3247b2da1c85ad4d1237 not found: ID does not exist" containerID="789904d9f8123c27e15533db5d9e26eded2a7f107b9c3247b2da1c85ad4d1237" Feb 20 10:10:35 crc kubenswrapper[4962]: I0220 10:10:35.934526 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"789904d9f8123c27e15533db5d9e26eded2a7f107b9c3247b2da1c85ad4d1237"} err="failed to get container status \"789904d9f8123c27e15533db5d9e26eded2a7f107b9c3247b2da1c85ad4d1237\": rpc error: code = NotFound desc = could not find container \"789904d9f8123c27e15533db5d9e26eded2a7f107b9c3247b2da1c85ad4d1237\": container with ID starting with 789904d9f8123c27e15533db5d9e26eded2a7f107b9c3247b2da1c85ad4d1237 not found: ID does not exist" Feb 20 10:10:35 crc kubenswrapper[4962]: I0220 10:10:35.934553 4962 scope.go:117] "RemoveContainer" containerID="4111992f8db5a1e1f632ae5114b9cac03a3e43e48caa50440e306831e89af536" Feb 20 10:10:35 crc kubenswrapper[4962]: E0220 10:10:35.934945 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4111992f8db5a1e1f632ae5114b9cac03a3e43e48caa50440e306831e89af536\": container with ID starting with 4111992f8db5a1e1f632ae5114b9cac03a3e43e48caa50440e306831e89af536 not found: ID does not exist" containerID="4111992f8db5a1e1f632ae5114b9cac03a3e43e48caa50440e306831e89af536" Feb 20 10:10:35 crc kubenswrapper[4962]: I0220 10:10:35.934967 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4111992f8db5a1e1f632ae5114b9cac03a3e43e48caa50440e306831e89af536"} 
err="failed to get container status \"4111992f8db5a1e1f632ae5114b9cac03a3e43e48caa50440e306831e89af536\": rpc error: code = NotFound desc = could not find container \"4111992f8db5a1e1f632ae5114b9cac03a3e43e48caa50440e306831e89af536\": container with ID starting with 4111992f8db5a1e1f632ae5114b9cac03a3e43e48caa50440e306831e89af536 not found: ID does not exist" Feb 20 10:10:35 crc kubenswrapper[4962]: I0220 10:10:35.934981 4962 scope.go:117] "RemoveContainer" containerID="be29628502f44ad742fdea9b89cfd76dc2d24a07de3ceac73a86bb38b030d055" Feb 20 10:10:35 crc kubenswrapper[4962]: E0220 10:10:35.935245 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be29628502f44ad742fdea9b89cfd76dc2d24a07de3ceac73a86bb38b030d055\": container with ID starting with be29628502f44ad742fdea9b89cfd76dc2d24a07de3ceac73a86bb38b030d055 not found: ID does not exist" containerID="be29628502f44ad742fdea9b89cfd76dc2d24a07de3ceac73a86bb38b030d055" Feb 20 10:10:35 crc kubenswrapper[4962]: I0220 10:10:35.935264 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be29628502f44ad742fdea9b89cfd76dc2d24a07de3ceac73a86bb38b030d055"} err="failed to get container status \"be29628502f44ad742fdea9b89cfd76dc2d24a07de3ceac73a86bb38b030d055\": rpc error: code = NotFound desc = could not find container \"be29628502f44ad742fdea9b89cfd76dc2d24a07de3ceac73a86bb38b030d055\": container with ID starting with be29628502f44ad742fdea9b89cfd76dc2d24a07de3ceac73a86bb38b030d055 not found: ID does not exist" Feb 20 10:10:37 crc kubenswrapper[4962]: I0220 10:10:37.147966 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bea00b9c-e00f-4cec-b1bf-9955dd868c9c" path="/var/lib/kubelet/pods/bea00b9c-e00f-4cec-b1bf-9955dd868c9c/volumes" Feb 20 10:10:40 crc kubenswrapper[4962]: I0220 10:10:40.047327 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-n5hm2" Feb 20 10:10:41 crc kubenswrapper[4962]: I0220 10:10:41.508556 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 10:10:41 crc kubenswrapper[4962]: I0220 10:10:41.509215 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 10:10:41 crc kubenswrapper[4962]: I0220 10:10:41.509318 4962 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" Feb 20 10:10:41 crc kubenswrapper[4962]: I0220 10:10:41.510314 4962 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"00c783abd2aaed9d0c1eb9c41c798ffe19fb999487c2907db1de61e5a49afcce"} pod="openshift-machine-config-operator/machine-config-daemon-m9d46" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 10:10:41 crc kubenswrapper[4962]: I0220 10:10:41.510419 4962 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" containerID="cri-o://00c783abd2aaed9d0c1eb9c41c798ffe19fb999487c2907db1de61e5a49afcce" gracePeriod=600 Feb 20 10:10:41 crc kubenswrapper[4962]: E0220 10:10:41.684859 4962 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod751d5e0b_919c_4777_8475_ed7214f7647f.slice/crio-conmon-00c783abd2aaed9d0c1eb9c41c798ffe19fb999487c2907db1de61e5a49afcce.scope\": RecentStats: unable to find data in memory cache]" Feb 20 10:10:41 crc kubenswrapper[4962]: I0220 10:10:41.904357 4962 generic.go:334] "Generic (PLEG): container finished" podID="751d5e0b-919c-4777-8475-ed7214f7647f" containerID="00c783abd2aaed9d0c1eb9c41c798ffe19fb999487c2907db1de61e5a49afcce" exitCode=0 Feb 20 10:10:41 crc kubenswrapper[4962]: I0220 10:10:41.904422 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" event={"ID":"751d5e0b-919c-4777-8475-ed7214f7647f","Type":"ContainerDied","Data":"00c783abd2aaed9d0c1eb9c41c798ffe19fb999487c2907db1de61e5a49afcce"} Feb 20 10:10:41 crc kubenswrapper[4962]: I0220 10:10:41.904692 4962 scope.go:117] "RemoveContainer" containerID="f2df44fd178e1ec428f4f1c5bbae3c8b24f98950b6fec19e9719325e0843ea14" Feb 20 10:10:42 crc kubenswrapper[4962]: I0220 10:10:42.916300 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" event={"ID":"751d5e0b-919c-4777-8475-ed7214f7647f","Type":"ContainerStarted","Data":"d1c3b246abfce789c57c63406e0ffd34b8624c7398251d713e463cbaf4c363e1"} Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.309993 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-nhpg5"] Feb 20 10:11:11 crc kubenswrapper[4962]: E0220 10:11:11.310927 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bea00b9c-e00f-4cec-b1bf-9955dd868c9c" containerName="registry-server" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.310943 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="bea00b9c-e00f-4cec-b1bf-9955dd868c9c" containerName="registry-server" Feb 20 10:11:11 crc kubenswrapper[4962]: E0220 10:11:11.310952 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bea00b9c-e00f-4cec-b1bf-9955dd868c9c" containerName="extract-content" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.310959 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="bea00b9c-e00f-4cec-b1bf-9955dd868c9c" containerName="extract-content" Feb 20 10:11:11 crc kubenswrapper[4962]: E0220 10:11:11.310986 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bea00b9c-e00f-4cec-b1bf-9955dd868c9c" containerName="extract-utilities" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.310992 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="bea00b9c-e00f-4cec-b1bf-9955dd868c9c" containerName="extract-utilities" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.311106 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="bea00b9c-e00f-4cec-b1bf-9955dd868c9c" containerName="registry-server" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.311613 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-nhpg5" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.314299 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-lchp8" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.329826 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-nhpg5"] Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.347989 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tdbn\" (UniqueName: \"kubernetes.io/projected/e0560856-ed00-4ea8-8ce7-a801f1d46489-kube-api-access-8tdbn\") pod \"barbican-operator-controller-manager-868647ff47-nhpg5\" (UID: \"e0560856-ed00-4ea8-8ce7-a801f1d46489\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-nhpg5" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.352663 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5d946d989d-bsq9n"] Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.353684 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-bsq9n" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.356925 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-qd5jx" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.357156 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-r2t72"] Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.358274 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-r2t72" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.361378 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-d2jgz" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.382691 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5d946d989d-bsq9n"] Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.389670 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987464f4-wcqzf"] Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.390841 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987464f4-wcqzf" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.400305 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-ldhhr" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.423355 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-75vx4"] Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.424412 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-75vx4" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.428246 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-lhcl7" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.438659 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987464f4-wcqzf"] Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.450109 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sr5ts\" (UniqueName: \"kubernetes.io/projected/ea986843-26e4-4410-a65e-ae51c02dc04c-kube-api-access-sr5ts\") pod \"glance-operator-controller-manager-77987464f4-wcqzf\" (UID: \"ea986843-26e4-4410-a65e-ae51c02dc04c\") " pod="openstack-operators/glance-operator-controller-manager-77987464f4-wcqzf" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.450167 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxzwg\" (UniqueName: \"kubernetes.io/projected/ac33f7ed-c3f8-487d-89dc-4a614d357b86-kube-api-access-xxzwg\") pod \"cinder-operator-controller-manager-5d946d989d-bsq9n\" (UID: \"ac33f7ed-c3f8-487d-89dc-4a614d357b86\") " pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-bsq9n" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.450217 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tdbn\" (UniqueName: \"kubernetes.io/projected/e0560856-ed00-4ea8-8ce7-a801f1d46489-kube-api-access-8tdbn\") pod \"barbican-operator-controller-manager-868647ff47-nhpg5\" (UID: \"e0560856-ed00-4ea8-8ce7-a801f1d46489\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-nhpg5" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.450258 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5qxh\" (UniqueName: \"kubernetes.io/projected/fee6970c-0ad7-46ea-ab75-dcb7d552ffbb-kube-api-access-s5qxh\") pod \"heat-operator-controller-manager-69f49c598c-75vx4\" (UID: \"fee6970c-0ad7-46ea-ab75-dcb7d552ffbb\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-75vx4" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.450284 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h72m9\" (UniqueName: \"kubernetes.io/projected/cf0e10ba-c175-44c3-9011-6646f21ba334-kube-api-access-h72m9\") pod \"designate-operator-controller-manager-6d8bf5c495-r2t72\" (UID: \"cf0e10ba-c175-44c3-9011-6646f21ba334\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-r2t72" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.461063 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-r2t72"] Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.470816 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-75vx4"] Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.482950 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-rhhc7"] Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.483970 4962 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-rhhc7" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.489855 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-4rnhn"] Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.490610 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79d975b745-4rnhn" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.499787 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-fprm6" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.500051 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.500222 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-r9l78" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.500254 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tdbn\" (UniqueName: \"kubernetes.io/projected/e0560856-ed00-4ea8-8ce7-a801f1d46489-kube-api-access-8tdbn\") pod \"barbican-operator-controller-manager-868647ff47-nhpg5\" (UID: \"e0560856-ed00-4ea8-8ce7-a801f1d46489\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-nhpg5" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.504684 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-2hg4n"] Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.505799 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-2hg4n" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.507989 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-rcb6k" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.512087 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-rhhc7"] Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.518655 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-4rnhn"] Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.523298 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-2hg4n"] Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.531663 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-jjbwt"] Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.534335 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-jjbwt" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.538626 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-jjbwt"] Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.542125 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-65srs" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.551319 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0c8c62e9-0201-43a4-b823-82af87a0977e-cert\") pod \"infra-operator-controller-manager-79d975b745-4rnhn\" (UID: \"0c8c62e9-0201-43a4-b823-82af87a0977e\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-4rnhn" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.551384 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dmxc\" (UniqueName: \"kubernetes.io/projected/12f33757-f329-47a6-9273-bdeb1558a4d7-kube-api-access-5dmxc\") pod \"horizon-operator-controller-manager-5b9b8895d5-rhhc7\" (UID: \"12f33757-f329-47a6-9273-bdeb1558a4d7\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-rhhc7" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.551408 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kd9qj\" (UniqueName: \"kubernetes.io/projected/5fec06f1-8ccf-403c-88de-2b581f056802-kube-api-access-kd9qj\") pod \"ironic-operator-controller-manager-554564d7fc-2hg4n\" (UID: \"5fec06f1-8ccf-403c-88de-2b581f056802\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-2hg4n" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.551433 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5qxh\" (UniqueName: \"kubernetes.io/projected/fee6970c-0ad7-46ea-ab75-dcb7d552ffbb-kube-api-access-s5qxh\") pod \"heat-operator-controller-manager-69f49c598c-75vx4\" (UID: \"fee6970c-0ad7-46ea-ab75-dcb7d552ffbb\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-75vx4" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.551456 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h72m9\" (UniqueName: \"kubernetes.io/projected/cf0e10ba-c175-44c3-9011-6646f21ba334-kube-api-access-h72m9\") pod \"designate-operator-controller-manager-6d8bf5c495-r2t72\" (UID: \"cf0e10ba-c175-44c3-9011-6646f21ba334\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-r2t72" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.551494 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rw8bc\" (UniqueName: \"kubernetes.io/projected/7afb870a-75a4-42d5-9704-5cef14dd3ce9-kube-api-access-rw8bc\") pod \"keystone-operator-controller-manager-b4d948c87-jjbwt\" (UID: \"7afb870a-75a4-42d5-9704-5cef14dd3ce9\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-jjbwt" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.551529 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c85cx\" (UniqueName: 
\"kubernetes.io/projected/0c8c62e9-0201-43a4-b823-82af87a0977e-kube-api-access-c85cx\") pod \"infra-operator-controller-manager-79d975b745-4rnhn\" (UID: \"0c8c62e9-0201-43a4-b823-82af87a0977e\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-4rnhn" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.551560 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sr5ts\" (UniqueName: \"kubernetes.io/projected/ea986843-26e4-4410-a65e-ae51c02dc04c-kube-api-access-sr5ts\") pod \"glance-operator-controller-manager-77987464f4-wcqzf\" (UID: \"ea986843-26e4-4410-a65e-ae51c02dc04c\") " pod="openstack-operators/glance-operator-controller-manager-77987464f4-wcqzf" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.551587 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxzwg\" (UniqueName: \"kubernetes.io/projected/ac33f7ed-c3f8-487d-89dc-4a614d357b86-kube-api-access-xxzwg\") pod \"cinder-operator-controller-manager-5d946d989d-bsq9n\" (UID: \"ac33f7ed-c3f8-487d-89dc-4a614d357b86\") " pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-bsq9n" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.558196 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-54f6768c69-6lvhz"] Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.559458 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-6lvhz" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.562991 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-q9kkm" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.570332 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-54f6768c69-6lvhz"] Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.578054 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxzwg\" (UniqueName: \"kubernetes.io/projected/ac33f7ed-c3f8-487d-89dc-4a614d357b86-kube-api-access-xxzwg\") pod \"cinder-operator-controller-manager-5d946d989d-bsq9n\" (UID: \"ac33f7ed-c3f8-487d-89dc-4a614d357b86\") " pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-bsq9n" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.584108 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5qxh\" (UniqueName: \"kubernetes.io/projected/fee6970c-0ad7-46ea-ab75-dcb7d552ffbb-kube-api-access-s5qxh\") pod \"heat-operator-controller-manager-69f49c598c-75vx4\" (UID: \"fee6970c-0ad7-46ea-ab75-dcb7d552ffbb\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-75vx4" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.590973 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-wn92v"] Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.592805 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-wn92v" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.597488 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-s97hl" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.599204 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sr5ts\" (UniqueName: \"kubernetes.io/projected/ea986843-26e4-4410-a65e-ae51c02dc04c-kube-api-access-sr5ts\") pod \"glance-operator-controller-manager-77987464f4-wcqzf\" (UID: \"ea986843-26e4-4410-a65e-ae51c02dc04c\") " pod="openstack-operators/glance-operator-controller-manager-77987464f4-wcqzf" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.606357 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h72m9\" (UniqueName: \"kubernetes.io/projected/cf0e10ba-c175-44c3-9011-6646f21ba334-kube-api-access-h72m9\") pod \"designate-operator-controller-manager-6d8bf5c495-r2t72\" (UID: \"cf0e10ba-c175-44c3-9011-6646f21ba334\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-r2t72" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.618170 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-wn92v"] Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.634467 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-nhpg5" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.644766 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64ddbf8bb-knwp9"] Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.647799 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-knwp9" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.659529 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0c8c62e9-0201-43a4-b823-82af87a0977e-cert\") pod \"infra-operator-controller-manager-79d975b745-4rnhn\" (UID: \"0c8c62e9-0201-43a4-b823-82af87a0977e\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-4rnhn" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.659647 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dmxc\" (UniqueName: \"kubernetes.io/projected/12f33757-f329-47a6-9273-bdeb1558a4d7-kube-api-access-5dmxc\") pod \"horizon-operator-controller-manager-5b9b8895d5-rhhc7\" (UID: \"12f33757-f329-47a6-9273-bdeb1558a4d7\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-rhhc7" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.660041 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-jds7h" Feb 20 10:11:11 crc kubenswrapper[4962]: E0220 10:11:11.660393 4962 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 20 10:11:11 crc kubenswrapper[4962]: E0220 10:11:11.660455 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c8c62e9-0201-43a4-b823-82af87a0977e-cert podName:0c8c62e9-0201-43a4-b823-82af87a0977e nodeName:}" failed. No retries permitted until 2026-02-20 10:11:12.160432457 +0000 UTC m=+963.742904303 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0c8c62e9-0201-43a4-b823-82af87a0977e-cert") pod "infra-operator-controller-manager-79d975b745-4rnhn" (UID: "0c8c62e9-0201-43a4-b823-82af87a0977e") : secret "infra-operator-webhook-server-cert" not found Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.668453 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kd9qj\" (UniqueName: \"kubernetes.io/projected/5fec06f1-8ccf-403c-88de-2b581f056802-kube-api-access-kd9qj\") pod \"ironic-operator-controller-manager-554564d7fc-2hg4n\" (UID: \"5fec06f1-8ccf-403c-88de-2b581f056802\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-2hg4n" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.668656 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rw8bc\" (UniqueName: \"kubernetes.io/projected/7afb870a-75a4-42d5-9704-5cef14dd3ce9-kube-api-access-rw8bc\") pod \"keystone-operator-controller-manager-b4d948c87-jjbwt\" (UID: \"7afb870a-75a4-42d5-9704-5cef14dd3ce9\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-jjbwt" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.677837 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c85cx\" (UniqueName: \"kubernetes.io/projected/0c8c62e9-0201-43a4-b823-82af87a0977e-kube-api-access-c85cx\") pod \"infra-operator-controller-manager-79d975b745-4rnhn\" (UID: \"0c8c62e9-0201-43a4-b823-82af87a0977e\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-4rnhn" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.704446 4962 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-kd9qj\" (UniqueName: \"kubernetes.io/projected/5fec06f1-8ccf-403c-88de-2b581f056802-kube-api-access-kd9qj\") pod \"ironic-operator-controller-manager-554564d7fc-2hg4n\" (UID: \"5fec06f1-8ccf-403c-88de-2b581f056802\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-2hg4n" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.707776 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-bsq9n" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.708565 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-r2t72" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.711364 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dmxc\" (UniqueName: \"kubernetes.io/projected/12f33757-f329-47a6-9273-bdeb1558a4d7-kube-api-access-5dmxc\") pod \"horizon-operator-controller-manager-5b9b8895d5-rhhc7\" (UID: \"12f33757-f329-47a6-9273-bdeb1558a4d7\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-rhhc7" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.716290 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rw8bc\" (UniqueName: \"kubernetes.io/projected/7afb870a-75a4-42d5-9704-5cef14dd3ce9-kube-api-access-rw8bc\") pod \"keystone-operator-controller-manager-b4d948c87-jjbwt\" (UID: \"7afb870a-75a4-42d5-9704-5cef14dd3ce9\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-jjbwt" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.717672 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64ddbf8bb-knwp9"] Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.728946 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987464f4-wcqzf" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.756515 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-75vx4" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.765738 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c85cx\" (UniqueName: \"kubernetes.io/projected/0c8c62e9-0201-43a4-b823-82af87a0977e-kube-api-access-c85cx\") pod \"infra-operator-controller-manager-79d975b745-4rnhn\" (UID: \"0c8c62e9-0201-43a4-b823-82af87a0977e\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-4rnhn" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.769242 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-d2clq"] Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.777302 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-d2clq" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.777458 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f8888797-ln4sp"] Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.778810 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-ln4sp" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.779662 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4s6mt\" (UniqueName: \"kubernetes.io/projected/a9979be5-6650-425b-a748-51e2cb552413-kube-api-access-4s6mt\") pod \"manila-operator-controller-manager-54f6768c69-6lvhz\" (UID: \"a9979be5-6650-425b-a748-51e2cb552413\") " pod="openstack-operators/manila-operator-controller-manager-54f6768c69-6lvhz" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.779792 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzq8g\" (UniqueName: \"kubernetes.io/projected/6fdeab3e-de35-4d69-9e67-e5d8257bc25d-kube-api-access-rzq8g\") pod \"neutron-operator-controller-manager-64ddbf8bb-knwp9\" (UID: \"6fdeab3e-de35-4d69-9e67-e5d8257bc25d\") " pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-knwp9" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.779895 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smwnb\" (UniqueName: \"kubernetes.io/projected/f8f1dca9-8b83-469d-b834-3f11376576c9-kube-api-access-smwnb\") pod \"mariadb-operator-controller-manager-6994f66f48-wn92v\" (UID: \"f8f1dca9-8b83-469d-b834-3f11376576c9\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-wn92v" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.787158 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-5xxdq" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.787411 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-knql5" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.788079 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-d2clq"] Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.797437 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f8888797-ln4sp"] Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.810434 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-d44cf6b75-nlq5k"] Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.811554 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-nlq5k" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.817883 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-gb5fm" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.818994 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-68trf"] Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.822189 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-68trf" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.825471 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-khxcn" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.827331 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.828333 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-x4gh4"] Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.834064 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-x4gh4" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.844015 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-8d99h" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.845982 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-rhhc7" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.876674 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-9pxbg"] Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.883646 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4s6mt\" (UniqueName: \"kubernetes.io/projected/a9979be5-6650-425b-a748-51e2cb552413-kube-api-access-4s6mt\") pod \"manila-operator-controller-manager-54f6768c69-6lvhz\" (UID: \"a9979be5-6650-425b-a748-51e2cb552413\") " pod="openstack-operators/manila-operator-controller-manager-54f6768c69-6lvhz" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.883724 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlfbq\" (UniqueName: \"kubernetes.io/projected/34cb38e0-7c0a-4f00-89e9-9be7b394585d-kube-api-access-wlfbq\") pod \"placement-operator-controller-manager-8497b45c89-x4gh4\" (UID: \"34cb38e0-7c0a-4f00-89e9-9be7b394585d\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-x4gh4" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.883772 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cn9n\" (UniqueName: \"kubernetes.io/projected/72728d52-a8e9-4689-8da0-871f250f7664-kube-api-access-8cn9n\") pod \"ovn-operator-controller-manager-d44cf6b75-nlq5k\" (UID: \"72728d52-a8e9-4689-8da0-871f250f7664\") " pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-nlq5k" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.883792 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzq8g\" (UniqueName: \"kubernetes.io/projected/6fdeab3e-de35-4d69-9e67-e5d8257bc25d-kube-api-access-rzq8g\") pod \"neutron-operator-controller-manager-64ddbf8bb-knwp9\" (UID: \"6fdeab3e-de35-4d69-9e67-e5d8257bc25d\") " pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-knwp9" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.883811 4962 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7dkp\" (UniqueName: \"kubernetes.io/projected/f8de466d-f069-4a8e-8598-72a163525c24-kube-api-access-x7dkp\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-68trf\" (UID: \"f8de466d-f069-4a8e-8598-72a163525c24\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-68trf" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.883841 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smwnb\" (UniqueName: \"kubernetes.io/projected/f8f1dca9-8b83-469d-b834-3f11376576c9-kube-api-access-smwnb\") pod \"mariadb-operator-controller-manager-6994f66f48-wn92v\" (UID: \"f8f1dca9-8b83-469d-b834-3f11376576c9\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-wn92v" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.883861 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g57r5\" (UniqueName: \"kubernetes.io/projected/14efe385-5147-49ed-a42f-804b91438a55-kube-api-access-g57r5\") pod \"octavia-operator-controller-manager-69f8888797-ln4sp\" (UID: \"14efe385-5147-49ed-a42f-804b91438a55\") " pod="openstack-operators/octavia-operator-controller-manager-69f8888797-ln4sp" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.883895 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f8de466d-f069-4a8e-8598-72a163525c24-cert\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-68trf\" (UID: \"f8de466d-f069-4a8e-8598-72a163525c24\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-68trf" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.884802 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5t22\" (UniqueName: \"kubernetes.io/projected/4e2614ed-ea7a-430e-af7b-4d66f05f7b96-kube-api-access-c5t22\") pod \"nova-operator-controller-manager-567668f5cf-d2clq\" (UID: \"4e2614ed-ea7a-430e-af7b-4d66f05f7b96\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-d2clq" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.885388 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-2hg4n" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.885607 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68f46476f-9pxbg" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.911009 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-d44cf6b75-nlq5k"] Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.914243 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-qn79d" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.924017 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-9pxbg"] Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.930768 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4s6mt\" (UniqueName: \"kubernetes.io/projected/a9979be5-6650-425b-a748-51e2cb552413-kube-api-access-4s6mt\") pod \"manila-operator-controller-manager-54f6768c69-6lvhz\" (UID: \"a9979be5-6650-425b-a748-51e2cb552413\") " pod="openstack-operators/manila-operator-controller-manager-54f6768c69-6lvhz" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.930796 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smwnb\" (UniqueName: \"kubernetes.io/projected/f8f1dca9-8b83-469d-b834-3f11376576c9-kube-api-access-smwnb\") pod \"mariadb-operator-controller-manager-6994f66f48-wn92v\" (UID: \"f8f1dca9-8b83-469d-b834-3f11376576c9\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-wn92v" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.938312 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzq8g\" (UniqueName: \"kubernetes.io/projected/6fdeab3e-de35-4d69-9e67-e5d8257bc25d-kube-api-access-rzq8g\") pod \"neutron-operator-controller-manager-64ddbf8bb-knwp9\" (UID: \"6fdeab3e-de35-4d69-9e67-e5d8257bc25d\") " pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-knwp9" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.941114 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-x4gh4"] Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.978249 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-jjbwt" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.985655 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-68trf"] Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.986696 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jsp6\" (UniqueName: \"kubernetes.io/projected/4a325f02-ddda-49e9-9ef0-40fd4726b09f-kube-api-access-7jsp6\") pod \"swift-operator-controller-manager-68f46476f-9pxbg\" (UID: \"4a325f02-ddda-49e9-9ef0-40fd4726b09f\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-9pxbg" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.986736 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g57r5\" (UniqueName: \"kubernetes.io/projected/14efe385-5147-49ed-a42f-804b91438a55-kube-api-access-g57r5\") pod \"octavia-operator-controller-manager-69f8888797-ln4sp\" (UID: \"14efe385-5147-49ed-a42f-804b91438a55\") " pod="openstack-operators/octavia-operator-controller-manager-69f8888797-ln4sp" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.986777 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f8de466d-f069-4a8e-8598-72a163525c24-cert\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-68trf\" (UID: \"f8de466d-f069-4a8e-8598-72a163525c24\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-68trf" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.986802 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5t22\" (UniqueName: \"kubernetes.io/projected/4e2614ed-ea7a-430e-af7b-4d66f05f7b96-kube-api-access-c5t22\") pod \"nova-operator-controller-manager-567668f5cf-d2clq\" (UID: \"4e2614ed-ea7a-430e-af7b-4d66f05f7b96\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-d2clq" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.986853 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlfbq\" (UniqueName: \"kubernetes.io/projected/34cb38e0-7c0a-4f00-89e9-9be7b394585d-kube-api-access-wlfbq\") pod \"placement-operator-controller-manager-8497b45c89-x4gh4\" (UID: \"34cb38e0-7c0a-4f00-89e9-9be7b394585d\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-x4gh4" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.986901 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cn9n\" (UniqueName: \"kubernetes.io/projected/72728d52-a8e9-4689-8da0-871f250f7664-kube-api-access-8cn9n\") pod \"ovn-operator-controller-manager-d44cf6b75-nlq5k\" (UID: \"72728d52-a8e9-4689-8da0-871f250f7664\") " pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-nlq5k" Feb 20 10:11:11 crc kubenswrapper[4962]: I0220 10:11:11.986944 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7dkp\" (UniqueName: \"kubernetes.io/projected/f8de466d-f069-4a8e-8598-72a163525c24-kube-api-access-x7dkp\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-68trf\" (UID: \"f8de466d-f069-4a8e-8598-72a163525c24\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-68trf" Feb 20 
10:11:11 crc kubenswrapper[4962]: E0220 10:11:11.987235 4962 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 20 10:11:11 crc kubenswrapper[4962]: E0220 10:11:11.987295 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8de466d-f069-4a8e-8598-72a163525c24-cert podName:f8de466d-f069-4a8e-8598-72a163525c24 nodeName:}" failed. No retries permitted until 2026-02-20 10:11:12.487279909 +0000 UTC m=+964.069751755 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f8de466d-f069-4a8e-8598-72a163525c24-cert") pod "openstack-baremetal-operator-controller-manager-fb5fcc5b8-68trf" (UID: "f8de466d-f069-4a8e-8598-72a163525c24") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.000798 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-mfpm9"] Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.002112 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-mfpm9" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.005696 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-dck5b" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.006969 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-mfpm9"] Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.012270 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlfbq\" (UniqueName: \"kubernetes.io/projected/34cb38e0-7c0a-4f00-89e9-9be7b394585d-kube-api-access-wlfbq\") pod \"placement-operator-controller-manager-8497b45c89-x4gh4\" (UID: \"34cb38e0-7c0a-4f00-89e9-9be7b394585d\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-x4gh4" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.020555 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5t22\" (UniqueName: \"kubernetes.io/projected/4e2614ed-ea7a-430e-af7b-4d66f05f7b96-kube-api-access-c5t22\") pod \"nova-operator-controller-manager-567668f5cf-d2clq\" (UID: \"4e2614ed-ea7a-430e-af7b-4d66f05f7b96\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-d2clq" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.026102 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g57r5\" (UniqueName: \"kubernetes.io/projected/14efe385-5147-49ed-a42f-804b91438a55-kube-api-access-g57r5\") pod \"octavia-operator-controller-manager-69f8888797-ln4sp\" (UID: \"14efe385-5147-49ed-a42f-804b91438a55\") " pod="openstack-operators/octavia-operator-controller-manager-69f8888797-ln4sp" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.026796 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7dkp\" (UniqueName: \"kubernetes.io/projected/f8de466d-f069-4a8e-8598-72a163525c24-kube-api-access-x7dkp\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-68trf\" (UID: \"f8de466d-f069-4a8e-8598-72a163525c24\") " 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-68trf" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.026887 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-7866795846-lxl4x"] Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.028747 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-7866795846-lxl4x" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.034700 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-kszqw" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.037722 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-7866795846-lxl4x"] Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.060937 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cn9n\" (UniqueName: \"kubernetes.io/projected/72728d52-a8e9-4689-8da0-871f250f7664-kube-api-access-8cn9n\") pod \"ovn-operator-controller-manager-d44cf6b75-nlq5k\" (UID: \"72728d52-a8e9-4689-8da0-871f250f7664\") " pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-nlq5k" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.073650 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-6lvhz" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.074459 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-wn92v" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.087951 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jsp6\" (UniqueName: \"kubernetes.io/projected/4a325f02-ddda-49e9-9ef0-40fd4726b09f-kube-api-access-7jsp6\") pod \"swift-operator-controller-manager-68f46476f-9pxbg\" (UID: \"4a325f02-ddda-49e9-9ef0-40fd4726b09f\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-9pxbg" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.088122 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p72nh\" (UniqueName: \"kubernetes.io/projected/32d42cbd-4ea1-49cc-b9d4-33fe5f655a16-kube-api-access-p72nh\") pod \"test-operator-controller-manager-7866795846-lxl4x\" (UID: \"32d42cbd-4ea1-49cc-b9d4-33fe5f655a16\") " pod="openstack-operators/test-operator-controller-manager-7866795846-lxl4x" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.088297 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpn7z\" (UniqueName: \"kubernetes.io/projected/7d077bc6-8a1e-426a-9b2d-8e6b2a5eb084-kube-api-access-bpn7z\") pod \"telemetry-operator-controller-manager-7f45b4ff68-mfpm9\" (UID: \"7d077bc6-8a1e-426a-9b2d-8e6b2a5eb084\") " pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-mfpm9" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.097795 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-knwp9" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.113865 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5db88f68c-kthxs"] Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.114388 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jsp6\" (UniqueName: \"kubernetes.io/projected/4a325f02-ddda-49e9-9ef0-40fd4726b09f-kube-api-access-7jsp6\") pod \"swift-operator-controller-manager-68f46476f-9pxbg\" (UID: \"4a325f02-ddda-49e9-9ef0-40fd4726b09f\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-9pxbg" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.119155 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-kthxs" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.125359 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5db88f68c-kthxs"] Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.131730 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-5rxms" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.145923 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-ln4sp" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.157179 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-nhpg5"] Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.167451 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-d2clq" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.195709 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-nlq5k" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.199382 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpn7z\" (UniqueName: \"kubernetes.io/projected/7d077bc6-8a1e-426a-9b2d-8e6b2a5eb084-kube-api-access-bpn7z\") pod \"telemetry-operator-controller-manager-7f45b4ff68-mfpm9\" (UID: \"7d077bc6-8a1e-426a-9b2d-8e6b2a5eb084\") " pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-mfpm9" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.199541 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0c8c62e9-0201-43a4-b823-82af87a0977e-cert\") pod \"infra-operator-controller-manager-79d975b745-4rnhn\" (UID: \"0c8c62e9-0201-43a4-b823-82af87a0977e\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-4rnhn" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.199721 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjxdt\" (UniqueName: \"kubernetes.io/projected/4c8bff11-1a85-4f9b-8fb2-defd04ac22d1-kube-api-access-bjxdt\") pod \"watcher-operator-controller-manager-5db88f68c-kthxs\" (UID: \"4c8bff11-1a85-4f9b-8fb2-defd04ac22d1\") " pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-kthxs" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.199780 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p72nh\" (UniqueName: \"kubernetes.io/projected/32d42cbd-4ea1-49cc-b9d4-33fe5f655a16-kube-api-access-p72nh\") pod \"test-operator-controller-manager-7866795846-lxl4x\" (UID: \"32d42cbd-4ea1-49cc-b9d4-33fe5f655a16\") " pod="openstack-operators/test-operator-controller-manager-7866795846-lxl4x" Feb 20 10:11:12 crc kubenswrapper[4962]: E0220 10:11:12.200228 4962 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 20 10:11:12 crc kubenswrapper[4962]: E0220 10:11:12.200354 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c8c62e9-0201-43a4-b823-82af87a0977e-cert podName:0c8c62e9-0201-43a4-b823-82af87a0977e nodeName:}" failed. No retries permitted until 2026-02-20 10:11:13.200315598 +0000 UTC m=+964.782787444 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0c8c62e9-0201-43a4-b823-82af87a0977e-cert") pod "infra-operator-controller-manager-79d975b745-4rnhn" (UID: "0c8c62e9-0201-43a4-b823-82af87a0977e") : secret "infra-operator-webhook-server-cert" not found Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.210576 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-69ff7bc449-rqmzz"] Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.214039 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-rqmzz" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.220318 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p72nh\" (UniqueName: \"kubernetes.io/projected/32d42cbd-4ea1-49cc-b9d4-33fe5f655a16-kube-api-access-p72nh\") pod \"test-operator-controller-manager-7866795846-lxl4x\" (UID: \"32d42cbd-4ea1-49cc-b9d4-33fe5f655a16\") " pod="openstack-operators/test-operator-controller-manager-7866795846-lxl4x" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.221373 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.221434 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.222246 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-nl4z6" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.227795 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpn7z\" (UniqueName: \"kubernetes.io/projected/7d077bc6-8a1e-426a-9b2d-8e6b2a5eb084-kube-api-access-bpn7z\") pod \"telemetry-operator-controller-manager-7f45b4ff68-mfpm9\" (UID: \"7d077bc6-8a1e-426a-9b2d-8e6b2a5eb084\") " pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-mfpm9" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.235998 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-69ff7bc449-rqmzz"] Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.273572 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5mrjv"] Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.274652 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5mrjv"] Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.274808 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5mrjv" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.289574 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-785xk" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.290144 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-x4gh4" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.306385 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjxdt\" (UniqueName: \"kubernetes.io/projected/4c8bff11-1a85-4f9b-8fb2-defd04ac22d1-kube-api-access-bjxdt\") pod \"watcher-operator-controller-manager-5db88f68c-kthxs\" (UID: \"4c8bff11-1a85-4f9b-8fb2-defd04ac22d1\") " pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-kthxs" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.306512 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/98bbcdbd-382d-48ca-aa14-3e9ba4b63c98-metrics-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-rqmzz\" (UID: \"98bbcdbd-382d-48ca-aa14-3e9ba4b63c98\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-rqmzz" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.306564 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgfhj\" (UniqueName: \"kubernetes.io/projected/98bbcdbd-382d-48ca-aa14-3e9ba4b63c98-kube-api-access-qgfhj\") pod \"openstack-operator-controller-manager-69ff7bc449-rqmzz\" (UID: \"98bbcdbd-382d-48ca-aa14-3e9ba4b63c98\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-rqmzz" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.306775 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/98bbcdbd-382d-48ca-aa14-3e9ba4b63c98-webhook-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-rqmzz\" (UID: \"98bbcdbd-382d-48ca-aa14-3e9ba4b63c98\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-rqmzz" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.318554 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68f46476f-9pxbg" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.337821 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjxdt\" (UniqueName: \"kubernetes.io/projected/4c8bff11-1a85-4f9b-8fb2-defd04ac22d1-kube-api-access-bjxdt\") pod \"watcher-operator-controller-manager-5db88f68c-kthxs\" (UID: \"4c8bff11-1a85-4f9b-8fb2-defd04ac22d1\") " pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-kthxs" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.338219 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-mfpm9" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.363742 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-kthxs" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.401350 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-7866795846-lxl4x" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.408110 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g586b\" (UniqueName: \"kubernetes.io/projected/5691d6ef-dedb-4a46-a1b6-0435e9f6db0a-kube-api-access-g586b\") pod \"rabbitmq-cluster-operator-manager-668c99d594-5mrjv\" (UID: \"5691d6ef-dedb-4a46-a1b6-0435e9f6db0a\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5mrjv" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.408186 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/98bbcdbd-382d-48ca-aa14-3e9ba4b63c98-metrics-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-rqmzz\" (UID: \"98bbcdbd-382d-48ca-aa14-3e9ba4b63c98\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-rqmzz" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.408212 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgfhj\" (UniqueName: \"kubernetes.io/projected/98bbcdbd-382d-48ca-aa14-3e9ba4b63c98-kube-api-access-qgfhj\") pod \"openstack-operator-controller-manager-69ff7bc449-rqmzz\" (UID: \"98bbcdbd-382d-48ca-aa14-3e9ba4b63c98\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-rqmzz" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.408267 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/98bbcdbd-382d-48ca-aa14-3e9ba4b63c98-webhook-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-rqmzz\" (UID: \"98bbcdbd-382d-48ca-aa14-3e9ba4b63c98\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-rqmzz" Feb 20 10:11:12 crc kubenswrapper[4962]: E0220 10:11:12.410754 4962 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 20 10:11:12 crc kubenswrapper[4962]: E0220 10:11:12.410841 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98bbcdbd-382d-48ca-aa14-3e9ba4b63c98-webhook-certs podName:98bbcdbd-382d-48ca-aa14-3e9ba4b63c98 nodeName:}" failed. No retries permitted until 2026-02-20 10:11:12.910819798 +0000 UTC m=+964.493291644 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/98bbcdbd-382d-48ca-aa14-3e9ba4b63c98-webhook-certs") pod "openstack-operator-controller-manager-69ff7bc449-rqmzz" (UID: "98bbcdbd-382d-48ca-aa14-3e9ba4b63c98") : secret "webhook-server-cert" not found Feb 20 10:11:12 crc kubenswrapper[4962]: E0220 10:11:12.410921 4962 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 20 10:11:12 crc kubenswrapper[4962]: E0220 10:11:12.410947 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98bbcdbd-382d-48ca-aa14-3e9ba4b63c98-metrics-certs podName:98bbcdbd-382d-48ca-aa14-3e9ba4b63c98 nodeName:}" failed. No retries permitted until 2026-02-20 10:11:12.910940662 +0000 UTC m=+964.493412508 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/98bbcdbd-382d-48ca-aa14-3e9ba4b63c98-metrics-certs") pod "openstack-operator-controller-manager-69ff7bc449-rqmzz" (UID: "98bbcdbd-382d-48ca-aa14-3e9ba4b63c98") : secret "metrics-server-cert" not found Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.433463 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgfhj\" (UniqueName: \"kubernetes.io/projected/98bbcdbd-382d-48ca-aa14-3e9ba4b63c98-kube-api-access-qgfhj\") pod \"openstack-operator-controller-manager-69ff7bc449-rqmzz\" (UID: \"98bbcdbd-382d-48ca-aa14-3e9ba4b63c98\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-rqmzz" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.510053 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g586b\" (UniqueName: \"kubernetes.io/projected/5691d6ef-dedb-4a46-a1b6-0435e9f6db0a-kube-api-access-g586b\") pod \"rabbitmq-cluster-operator-manager-668c99d594-5mrjv\" (UID: \"5691d6ef-dedb-4a46-a1b6-0435e9f6db0a\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5mrjv" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.510112 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f8de466d-f069-4a8e-8598-72a163525c24-cert\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-68trf\" (UID: \"f8de466d-f069-4a8e-8598-72a163525c24\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-68trf" Feb 20 10:11:12 crc kubenswrapper[4962]: E0220 10:11:12.511303 4962 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 20 10:11:12 crc kubenswrapper[4962]: E0220 10:11:12.511353 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8de466d-f069-4a8e-8598-72a163525c24-cert podName:f8de466d-f069-4a8e-8598-72a163525c24 nodeName:}" failed. No retries permitted until 2026-02-20 10:11:13.511334294 +0000 UTC m=+965.093806150 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f8de466d-f069-4a8e-8598-72a163525c24-cert") pod "openstack-baremetal-operator-controller-manager-fb5fcc5b8-68trf" (UID: "f8de466d-f069-4a8e-8598-72a163525c24") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.534580 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g586b\" (UniqueName: \"kubernetes.io/projected/5691d6ef-dedb-4a46-a1b6-0435e9f6db0a-kube-api-access-g586b\") pod \"rabbitmq-cluster-operator-manager-668c99d594-5mrjv\" (UID: \"5691d6ef-dedb-4a46-a1b6-0435e9f6db0a\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5mrjv" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.691995 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5mrjv" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.921765 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/98bbcdbd-382d-48ca-aa14-3e9ba4b63c98-metrics-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-rqmzz\" (UID: \"98bbcdbd-382d-48ca-aa14-3e9ba4b63c98\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-rqmzz" Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.921864 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/98bbcdbd-382d-48ca-aa14-3e9ba4b63c98-webhook-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-rqmzz\" (UID: \"98bbcdbd-382d-48ca-aa14-3e9ba4b63c98\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-rqmzz" Feb 20 10:11:12 crc kubenswrapper[4962]: E0220 10:11:12.922047 4962 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 20 10:11:12 crc kubenswrapper[4962]: E0220 10:11:12.922124 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98bbcdbd-382d-48ca-aa14-3e9ba4b63c98-webhook-certs podName:98bbcdbd-382d-48ca-aa14-3e9ba4b63c98 nodeName:}" failed. No retries permitted until 2026-02-20 10:11:13.922097383 +0000 UTC m=+965.504569229 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/98bbcdbd-382d-48ca-aa14-3e9ba4b63c98-webhook-certs") pod "openstack-operator-controller-manager-69ff7bc449-rqmzz" (UID: "98bbcdbd-382d-48ca-aa14-3e9ba4b63c98") : secret "webhook-server-cert" not found Feb 20 10:11:12 crc kubenswrapper[4962]: E0220 10:11:12.922677 4962 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 20 10:11:12 crc kubenswrapper[4962]: E0220 10:11:12.922710 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98bbcdbd-382d-48ca-aa14-3e9ba4b63c98-metrics-certs podName:98bbcdbd-382d-48ca-aa14-3e9ba4b63c98 nodeName:}" failed. No retries permitted until 2026-02-20 10:11:13.922702162 +0000 UTC m=+965.505174008 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/98bbcdbd-382d-48ca-aa14-3e9ba4b63c98-metrics-certs") pod "openstack-operator-controller-manager-69ff7bc449-rqmzz" (UID: "98bbcdbd-382d-48ca-aa14-3e9ba4b63c98") : secret "metrics-server-cert" not found Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.930886 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-nhpg5" event={"ID":"e0560856-ed00-4ea8-8ce7-a801f1d46489","Type":"ContainerStarted","Data":"de2121809761692983b3cd6a346cd8b6ff6b8bfe36194b1bba368bf6b4f129b6"} Feb 20 10:11:12 crc kubenswrapper[4962]: I0220 10:11:12.938988 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987464f4-wcqzf"] Feb 20 10:11:13 crc kubenswrapper[4962]: I0220 10:11:13.234196 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0c8c62e9-0201-43a4-b823-82af87a0977e-cert\") pod \"infra-operator-controller-manager-79d975b745-4rnhn\" (UID: \"0c8c62e9-0201-43a4-b823-82af87a0977e\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-4rnhn" Feb 20 10:11:13 crc kubenswrapper[4962]: E0220 10:11:13.234450 4962 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 20 10:11:13 crc kubenswrapper[4962]: E0220 10:11:13.234568 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c8c62e9-0201-43a4-b823-82af87a0977e-cert podName:0c8c62e9-0201-43a4-b823-82af87a0977e nodeName:}" failed. No retries permitted until 2026-02-20 10:11:15.234543174 +0000 UTC m=+966.817015020 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0c8c62e9-0201-43a4-b823-82af87a0977e-cert") pod "infra-operator-controller-manager-79d975b745-4rnhn" (UID: "0c8c62e9-0201-43a4-b823-82af87a0977e") : secret "infra-operator-webhook-server-cert" not found Feb 20 10:11:13 crc kubenswrapper[4962]: I0220 10:11:13.416433 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5d946d989d-bsq9n"] Feb 20 10:11:13 crc kubenswrapper[4962]: I0220 10:11:13.422053 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-75vx4"] Feb 20 10:11:13 crc kubenswrapper[4962]: W0220 10:11:13.438493 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfee6970c_0ad7_46ea_ab75_dcb7d552ffbb.slice/crio-a198d009dc06fd2574b0341274512a78ce0a49b9ab0f9ecd050e235fde8d052a WatchSource:0}: Error finding container a198d009dc06fd2574b0341274512a78ce0a49b9ab0f9ecd050e235fde8d052a: Status 404 returned error can't find the container with id a198d009dc06fd2574b0341274512a78ce0a49b9ab0f9ecd050e235fde8d052a Feb 20 10:11:13 crc kubenswrapper[4962]: I0220 10:11:13.442492 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-r2t72"] Feb 20 10:11:13 crc kubenswrapper[4962]: I0220 10:11:13.452404 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-d2clq"] Feb 20 10:11:13 crc kubenswrapper[4962]: I0220 10:11:13.461673 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-2hg4n"] Feb 20 10:11:13 crc kubenswrapper[4962]: I0220 10:11:13.470196 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-rhhc7"] Feb 20 10:11:13 crc kubenswrapper[4962]: W0220 10:11:13.471110 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e2614ed_ea7a_430e_af7b_4d66f05f7b96.slice/crio-34a0ddda37deba07bfa7c2eadf86ddb51fdf0e7845e15d4a1f3606dc9e698f97 WatchSource:0}: Error finding container 34a0ddda37deba07bfa7c2eadf86ddb51fdf0e7845e15d4a1f3606dc9e698f97: Status 404 returned error can't find the container with id 34a0ddda37deba07bfa7c2eadf86ddb51fdf0e7845e15d4a1f3606dc9e698f97 Feb 20 10:11:13 crc kubenswrapper[4962]: W0220 10:11:13.472952 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12f33757_f329_47a6_9273_bdeb1558a4d7.slice/crio-fadd321de794e991e652070b984f3e3492dc4841091abdab10691e3fb7b80e21 WatchSource:0}: Error finding container fadd321de794e991e652070b984f3e3492dc4841091abdab10691e3fb7b80e21: Status 404 returned error can't find the container with id fadd321de794e991e652070b984f3e3492dc4841091abdab10691e3fb7b80e21 Feb 20 10:11:13 crc kubenswrapper[4962]: I0220 10:11:13.543069 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f8de466d-f069-4a8e-8598-72a163525c24-cert\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-68trf\" (UID: \"f8de466d-f069-4a8e-8598-72a163525c24\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-68trf" Feb 20 10:11:13 crc 
kubenswrapper[4962]: E0220 10:11:13.543939 4962 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 20 10:11:13 crc kubenswrapper[4962]: E0220 10:11:13.544051 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8de466d-f069-4a8e-8598-72a163525c24-cert podName:f8de466d-f069-4a8e-8598-72a163525c24 nodeName:}" failed. No retries permitted until 2026-02-20 10:11:15.544019394 +0000 UTC m=+967.126491240 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f8de466d-f069-4a8e-8598-72a163525c24-cert") pod "openstack-baremetal-operator-controller-manager-fb5fcc5b8-68trf" (UID: "f8de466d-f069-4a8e-8598-72a163525c24") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 20 10:11:13 crc kubenswrapper[4962]: I0220 10:11:13.619560 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-x4gh4"] Feb 20 10:11:13 crc kubenswrapper[4962]: I0220 10:11:13.629911 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-54f6768c69-6lvhz"] Feb 20 10:11:13 crc kubenswrapper[4962]: W0220 10:11:13.658009 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34cb38e0_7c0a_4f00_89e9_9be7b394585d.slice/crio-06d9c97aac8445abca447e2451adb01f4d52ca1060ccd8903d705d5a5b10a597 WatchSource:0}: Error finding container 06d9c97aac8445abca447e2451adb01f4d52ca1060ccd8903d705d5a5b10a597: Status 404 returned error can't find the container with id 06d9c97aac8445abca447e2451adb01f4d52ca1060ccd8903d705d5a5b10a597 Feb 20 10:11:13 crc kubenswrapper[4962]: W0220 10:11:13.661957 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9979be5_6650_425b_a748_51e2cb552413.slice/crio-820f77e77b7c7eaaf13d56d1af4e62815ea5b2f8611fa0b67213ae0ef2f50252 WatchSource:0}: Error finding container 820f77e77b7c7eaaf13d56d1af4e62815ea5b2f8611fa0b67213ae0ef2f50252: Status 404 returned error can't find the container with id 820f77e77b7c7eaaf13d56d1af4e62815ea5b2f8611fa0b67213ae0ef2f50252 Feb 20 10:11:13 crc kubenswrapper[4962]: I0220 10:11:13.826250 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64ddbf8bb-knwp9"] Feb 20 10:11:13 crc kubenswrapper[4962]: I0220 10:11:13.842638 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-wn92v"] Feb 20 10:11:13 crc kubenswrapper[4962]: I0220 10:11:13.847146 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-jjbwt"] Feb 20 10:11:13 crc kubenswrapper[4962]: W0220 10:11:13.857761 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8f1dca9_8b83_469d_b834_3f11376576c9.slice/crio-aae002b2366c0e5cb3f0062fc257979c206da0e6321811b91a215ffa06672dd2 WatchSource:0}: Error finding container aae002b2366c0e5cb3f0062fc257979c206da0e6321811b91a215ffa06672dd2: Status 404 returned error can't find the container with id aae002b2366c0e5cb3f0062fc257979c206da0e6321811b91a215ffa06672dd2 Feb 20 10:11:13 crc kubenswrapper[4962]: I0220 
10:11:13.953914 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-r2t72" event={"ID":"cf0e10ba-c175-44c3-9011-6646f21ba334","Type":"ContainerStarted","Data":"6234a297a899024702f26c5ce04c827d4d29f60d147c6493842e9dfd573eec3d"} Feb 20 10:11:13 crc kubenswrapper[4962]: I0220 10:11:13.959568 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/98bbcdbd-382d-48ca-aa14-3e9ba4b63c98-metrics-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-rqmzz\" (UID: \"98bbcdbd-382d-48ca-aa14-3e9ba4b63c98\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-rqmzz" Feb 20 10:11:13 crc kubenswrapper[4962]: I0220 10:11:13.959804 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/98bbcdbd-382d-48ca-aa14-3e9ba4b63c98-webhook-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-rqmzz\" (UID: \"98bbcdbd-382d-48ca-aa14-3e9ba4b63c98\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-rqmzz" Feb 20 10:11:13 crc kubenswrapper[4962]: I0220 10:11:13.959885 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-bsq9n" event={"ID":"ac33f7ed-c3f8-487d-89dc-4a614d357b86","Type":"ContainerStarted","Data":"f6b8e8d13e2962c4b74341b6b7c64c5cabeb00aa2859bb8701a3cfd8464d79d5"} Feb 20 10:11:13 crc kubenswrapper[4962]: E0220 10:11:13.960066 4962 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 20 10:11:13 crc kubenswrapper[4962]: E0220 10:11:13.960139 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98bbcdbd-382d-48ca-aa14-3e9ba4b63c98-metrics-certs podName:98bbcdbd-382d-48ca-aa14-3e9ba4b63c98 nodeName:}" failed. No retries permitted until 2026-02-20 10:11:15.960119736 +0000 UTC m=+967.542591572 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/98bbcdbd-382d-48ca-aa14-3e9ba4b63c98-metrics-certs") pod "openstack-operator-controller-manager-69ff7bc449-rqmzz" (UID: "98bbcdbd-382d-48ca-aa14-3e9ba4b63c98") : secret "metrics-server-cert" not found Feb 20 10:11:13 crc kubenswrapper[4962]: E0220 10:11:13.960151 4962 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 20 10:11:13 crc kubenswrapper[4962]: E0220 10:11:13.960210 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98bbcdbd-382d-48ca-aa14-3e9ba4b63c98-webhook-certs podName:98bbcdbd-382d-48ca-aa14-3e9ba4b63c98 nodeName:}" failed. No retries permitted until 2026-02-20 10:11:15.960190078 +0000 UTC m=+967.542661924 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/98bbcdbd-382d-48ca-aa14-3e9ba4b63c98-webhook-certs") pod "openstack-operator-controller-manager-69ff7bc449-rqmzz" (UID: "98bbcdbd-382d-48ca-aa14-3e9ba4b63c98") : secret "webhook-server-cert" not found Feb 20 10:11:13 crc kubenswrapper[4962]: I0220 10:11:13.963091 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987464f4-wcqzf" event={"ID":"ea986843-26e4-4410-a65e-ae51c02dc04c","Type":"ContainerStarted","Data":"b0b1de95d5af32b9a4d58731192b6b17db0019a768e764e75f0de147ab865dc4"} Feb 20 10:11:13 crc kubenswrapper[4962]: I0220 10:11:13.965291 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-2hg4n" event={"ID":"5fec06f1-8ccf-403c-88de-2b581f056802","Type":"ContainerStarted","Data":"67d77cc22a959c91c108f2f6a7c834e9424ac410177673a61e232f10a3a001b2"} Feb 20 10:11:13 crc kubenswrapper[4962]: I0220 10:11:13.970031 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-wn92v" event={"ID":"f8f1dca9-8b83-469d-b834-3f11376576c9","Type":"ContainerStarted","Data":"aae002b2366c0e5cb3f0062fc257979c206da0e6321811b91a215ffa06672dd2"} Feb 20 10:11:13 crc kubenswrapper[4962]: I0220 10:11:13.993934 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-jjbwt" event={"ID":"7afb870a-75a4-42d5-9704-5cef14dd3ce9","Type":"ContainerStarted","Data":"d4d22f66bafe4ea03724a1734229c454926f1f805e14f6c6b482418891ded9f8"} Feb 20 10:11:14 crc kubenswrapper[4962]: I0220 10:11:14.000523 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-d2clq" event={"ID":"4e2614ed-ea7a-430e-af7b-4d66f05f7b96","Type":"ContainerStarted","Data":"34a0ddda37deba07bfa7c2eadf86ddb51fdf0e7845e15d4a1f3606dc9e698f97"} Feb 20 10:11:14 crc kubenswrapper[4962]: I0220 10:11:14.001641 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-x4gh4" event={"ID":"34cb38e0-7c0a-4f00-89e9-9be7b394585d","Type":"ContainerStarted","Data":"06d9c97aac8445abca447e2451adb01f4d52ca1060ccd8903d705d5a5b10a597"} Feb 20 10:11:14 crc kubenswrapper[4962]: I0220 10:11:14.003322 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-6lvhz" event={"ID":"a9979be5-6650-425b-a748-51e2cb552413","Type":"ContainerStarted","Data":"820f77e77b7c7eaaf13d56d1af4e62815ea5b2f8611fa0b67213ae0ef2f50252"} Feb 20 10:11:14 crc kubenswrapper[4962]: I0220 10:11:14.026130 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5mrjv"] Feb 20 10:11:14 crc kubenswrapper[4962]: I0220 10:11:14.035925 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f8888797-ln4sp"] Feb 20 10:11:14 crc kubenswrapper[4962]: I0220 10:11:14.038781 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-rhhc7" event={"ID":"12f33757-f329-47a6-9273-bdeb1558a4d7","Type":"ContainerStarted","Data":"fadd321de794e991e652070b984f3e3492dc4841091abdab10691e3fb7b80e21"} Feb 20 10:11:14 crc kubenswrapper[4962]: W0220 10:11:14.044385 4962 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d077bc6_8a1e_426a_9b2d_8e6b2a5eb084.slice/crio-f4efa0c9ab050b7e1f9103c61606c2e72de3fc0893312327f09c91267efd3b79 WatchSource:0}: Error finding container f4efa0c9ab050b7e1f9103c61606c2e72de3fc0893312327f09c91267efd3b79: Status 404 returned error can't find the container with id f4efa0c9ab050b7e1f9103c61606c2e72de3fc0893312327f09c91267efd3b79 Feb 20 10:11:14 crc kubenswrapper[4962]: I0220 10:11:14.051647 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-knwp9" event={"ID":"6fdeab3e-de35-4d69-9e67-e5d8257bc25d","Type":"ContainerStarted","Data":"6652b8540b54ec3d0190a09e12606a00ef4a19eb8a27c4440400287cf40c8aeb"} Feb 20 10:11:14 crc kubenswrapper[4962]: W0220 10:11:14.054646 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72728d52_a8e9_4689_8da0_871f250f7664.slice/crio-f3eb52c2dde70e86b51a7691452a3f33e1785c97aa6f9a48a63171874655361d WatchSource:0}: Error finding container f3eb52c2dde70e86b51a7691452a3f33e1785c97aa6f9a48a63171874655361d: Status 404 returned error can't find the container with id f3eb52c2dde70e86b51a7691452a3f33e1785c97aa6f9a48a63171874655361d Feb 20 10:11:14 crc kubenswrapper[4962]: I0220 10:11:14.054782 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-mfpm9"] Feb 20 10:11:14 crc kubenswrapper[4962]: I0220 10:11:14.057491 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-75vx4" event={"ID":"fee6970c-0ad7-46ea-ab75-dcb7d552ffbb","Type":"ContainerStarted","Data":"a198d009dc06fd2574b0341274512a78ce0a49b9ab0f9ecd050e235fde8d052a"} Feb 20 10:11:14 crc kubenswrapper[4962]: E0220 10:11:14.062428 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:229fc8c8d94dd4102d2151cd4ec1eaaa09d897c2b396d06e903f61ea29c1fa34,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-g57r5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-69f8888797-ln4sp_openstack-operators(14efe385-5147-49ed-a42f-804b91438a55): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 20 10:11:14 crc kubenswrapper[4962]: I0220 10:11:14.062581 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-d44cf6b75-nlq5k"] Feb 20 10:11:14 crc kubenswrapper[4962]: E0220 10:11:14.063549 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-ln4sp" podUID="14efe385-5147-49ed-a42f-804b91438a55" Feb 20 10:11:14 crc kubenswrapper[4962]: E0220 10:11:14.065790 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:3d676f1281e24ef07de617570d2f7fbf625032e41866d1551a856c052248bb04,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7jsp6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-68f46476f-9pxbg_openstack-operators(4a325f02-ddda-49e9-9ef0-40fd4726b09f): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 20 10:11:14 crc kubenswrapper[4962]: E0220 10:11:14.065969 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-g586b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-5mrjv_openstack-operators(5691d6ef-dedb-4a46-a1b6-0435e9f6db0a): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 20 10:11:14 crc kubenswrapper[4962]: E0220 10:11:14.067110 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5mrjv" podUID="5691d6ef-dedb-4a46-a1b6-0435e9f6db0a" Feb 20 10:11:14 crc kubenswrapper[4962]: E0220 10:11:14.067164 
4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-68f46476f-9pxbg" podUID="4a325f02-ddda-49e9-9ef0-40fd4726b09f" Feb 20 10:11:14 crc kubenswrapper[4962]: I0220 10:11:14.071158 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5db88f68c-kthxs"] Feb 20 10:11:14 crc kubenswrapper[4962]: E0220 10:11:14.075739 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:f0fabdf79095def0f8b1c0442925548a94ca94bed4de2d3b171277129f8079e6,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-p72nh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-7866795846-lxl4x_openstack-operators(32d42cbd-4ea1-49cc-b9d4-33fe5f655a16): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 20 10:11:14 crc kubenswrapper[4962]: E0220 10:11:14.077495 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-7866795846-lxl4x" podUID="32d42cbd-4ea1-49cc-b9d4-33fe5f655a16" Feb 20 10:11:14 crc kubenswrapper[4962]: I0220 10:11:14.078360 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-7866795846-lxl4x"] Feb 20 10:11:14 
crc kubenswrapper[4962]: W0220 10:11:14.082074 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c8bff11_1a85_4f9b_8fb2_defd04ac22d1.slice/crio-dbec4d7343fe353d6b3092d5154c5793b1670d4c354758151623388b677647ca WatchSource:0}: Error finding container dbec4d7343fe353d6b3092d5154c5793b1670d4c354758151623388b677647ca: Status 404 returned error can't find the container with id dbec4d7343fe353d6b3092d5154c5793b1670d4c354758151623388b677647ca Feb 20 10:11:14 crc kubenswrapper[4962]: E0220 10:11:14.085512 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:d01ae848290e880c09127d5297418dea40fc7f090fdab9bf2c578c7e7f53aec0,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bjxdt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-5db88f68c-kthxs_openstack-operators(4c8bff11-1a85-4f9b-8fb2-defd04ac22d1): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 20 10:11:14 crc kubenswrapper[4962]: E0220 10:11:14.086745 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-kthxs" podUID="4c8bff11-1a85-4f9b-8fb2-defd04ac22d1" Feb 20 10:11:14 crc kubenswrapper[4962]: I0220 10:11:14.093255 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/swift-operator-controller-manager-68f46476f-9pxbg"] Feb 20 10:11:15 crc kubenswrapper[4962]: I0220 10:11:15.068715 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-mfpm9" event={"ID":"7d077bc6-8a1e-426a-9b2d-8e6b2a5eb084","Type":"ContainerStarted","Data":"f4efa0c9ab050b7e1f9103c61606c2e72de3fc0893312327f09c91267efd3b79"} Feb 20 10:11:15 crc kubenswrapper[4962]: I0220 10:11:15.071757 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-nlq5k" event={"ID":"72728d52-a8e9-4689-8da0-871f250f7664","Type":"ContainerStarted","Data":"f3eb52c2dde70e86b51a7691452a3f33e1785c97aa6f9a48a63171874655361d"} Feb 20 10:11:15 crc kubenswrapper[4962]: I0220 10:11:15.074957 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-ln4sp" event={"ID":"14efe385-5147-49ed-a42f-804b91438a55","Type":"ContainerStarted","Data":"78af8ed74838ca01a42875f1e67f992fae04fded77b2a1aa0226f965681dca04"} Feb 20 10:11:15 crc kubenswrapper[4962]: I0220 10:11:15.081078 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5mrjv" event={"ID":"5691d6ef-dedb-4a46-a1b6-0435e9f6db0a","Type":"ContainerStarted","Data":"40dd1a45cd28ca81348e73822285443e43967275c254cf1ffeeb5d0fb1350c2b"} Feb 20 10:11:15 crc kubenswrapper[4962]: E0220 10:11:15.082342 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:229fc8c8d94dd4102d2151cd4ec1eaaa09d897c2b396d06e903f61ea29c1fa34\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-ln4sp" podUID="14efe385-5147-49ed-a42f-804b91438a55" Feb 20 10:11:15 crc kubenswrapper[4962]: E0220 10:11:15.083150 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5mrjv" podUID="5691d6ef-dedb-4a46-a1b6-0435e9f6db0a" Feb 20 10:11:15 crc kubenswrapper[4962]: E0220 10:11:15.148024 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:3d676f1281e24ef07de617570d2f7fbf625032e41866d1551a856c052248bb04\\\"\"" pod="openstack-operators/swift-operator-controller-manager-68f46476f-9pxbg" podUID="4a325f02-ddda-49e9-9ef0-40fd4726b09f" Feb 20 10:11:15 crc kubenswrapper[4962]: I0220 10:11:15.149099 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68f46476f-9pxbg" event={"ID":"4a325f02-ddda-49e9-9ef0-40fd4726b09f","Type":"ContainerStarted","Data":"cbee5d17816c8b4621117d17f860f5fa123589b5fbd03ea826686f8ebd2a55a4"} Feb 20 10:11:15 crc kubenswrapper[4962]: I0220 10:11:15.149246 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-7866795846-lxl4x" 
event={"ID":"32d42cbd-4ea1-49cc-b9d4-33fe5f655a16","Type":"ContainerStarted","Data":"b976381d23336d023db12c345f3987af523011b25e1cd393ab8dcdad4bb365dc"} Feb 20 10:11:15 crc kubenswrapper[4962]: E0220 10:11:15.150637 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:f0fabdf79095def0f8b1c0442925548a94ca94bed4de2d3b171277129f8079e6\\\"\"" pod="openstack-operators/test-operator-controller-manager-7866795846-lxl4x" podUID="32d42cbd-4ea1-49cc-b9d4-33fe5f655a16" Feb 20 10:11:15 crc kubenswrapper[4962]: I0220 10:11:15.150864 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-kthxs" event={"ID":"4c8bff11-1a85-4f9b-8fb2-defd04ac22d1","Type":"ContainerStarted","Data":"dbec4d7343fe353d6b3092d5154c5793b1670d4c354758151623388b677647ca"} Feb 20 10:11:15 crc kubenswrapper[4962]: E0220 10:11:15.156971 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d01ae848290e880c09127d5297418dea40fc7f090fdab9bf2c578c7e7f53aec0\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-kthxs" podUID="4c8bff11-1a85-4f9b-8fb2-defd04ac22d1" Feb 20 10:11:15 crc kubenswrapper[4962]: I0220 10:11:15.285517 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0c8c62e9-0201-43a4-b823-82af87a0977e-cert\") pod \"infra-operator-controller-manager-79d975b745-4rnhn\" (UID: \"0c8c62e9-0201-43a4-b823-82af87a0977e\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-4rnhn" Feb 20 10:11:15 crc kubenswrapper[4962]: E0220 10:11:15.285758 4962 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 20 10:11:15 crc kubenswrapper[4962]: E0220 10:11:15.285849 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c8c62e9-0201-43a4-b823-82af87a0977e-cert podName:0c8c62e9-0201-43a4-b823-82af87a0977e nodeName:}" failed. No retries permitted until 2026-02-20 10:11:19.285824401 +0000 UTC m=+970.868296247 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0c8c62e9-0201-43a4-b823-82af87a0977e-cert") pod "infra-operator-controller-manager-79d975b745-4rnhn" (UID: "0c8c62e9-0201-43a4-b823-82af87a0977e") : secret "infra-operator-webhook-server-cert" not found Feb 20 10:11:15 crc kubenswrapper[4962]: I0220 10:11:15.591031 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f8de466d-f069-4a8e-8598-72a163525c24-cert\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-68trf\" (UID: \"f8de466d-f069-4a8e-8598-72a163525c24\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-68trf" Feb 20 10:11:15 crc kubenswrapper[4962]: E0220 10:11:15.591305 4962 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 20 10:11:15 crc kubenswrapper[4962]: E0220 10:11:15.591423 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8de466d-f069-4a8e-8598-72a163525c24-cert podName:f8de466d-f069-4a8e-8598-72a163525c24 nodeName:}" failed. No retries permitted until 2026-02-20 10:11:19.591392742 +0000 UTC m=+971.173864588 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f8de466d-f069-4a8e-8598-72a163525c24-cert") pod "openstack-baremetal-operator-controller-manager-fb5fcc5b8-68trf" (UID: "f8de466d-f069-4a8e-8598-72a163525c24") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 20 10:11:16 crc kubenswrapper[4962]: I0220 10:11:16.005051 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/98bbcdbd-382d-48ca-aa14-3e9ba4b63c98-metrics-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-rqmzz\" (UID: \"98bbcdbd-382d-48ca-aa14-3e9ba4b63c98\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-rqmzz" Feb 20 10:11:16 crc kubenswrapper[4962]: E0220 10:11:16.005297 4962 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 20 10:11:16 crc kubenswrapper[4962]: I0220 10:11:16.005670 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/98bbcdbd-382d-48ca-aa14-3e9ba4b63c98-webhook-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-rqmzz\" (UID: \"98bbcdbd-382d-48ca-aa14-3e9ba4b63c98\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-rqmzz" Feb 20 10:11:16 crc kubenswrapper[4962]: E0220 10:11:16.005795 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98bbcdbd-382d-48ca-aa14-3e9ba4b63c98-metrics-certs podName:98bbcdbd-382d-48ca-aa14-3e9ba4b63c98 nodeName:}" failed. No retries permitted until 2026-02-20 10:11:20.00573064 +0000 UTC m=+971.588202486 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/98bbcdbd-382d-48ca-aa14-3e9ba4b63c98-metrics-certs") pod "openstack-operator-controller-manager-69ff7bc449-rqmzz" (UID: "98bbcdbd-382d-48ca-aa14-3e9ba4b63c98") : secret "metrics-server-cert" not found Feb 20 10:11:16 crc kubenswrapper[4962]: E0220 10:11:16.005904 4962 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 20 10:11:16 crc kubenswrapper[4962]: E0220 10:11:16.006312 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98bbcdbd-382d-48ca-aa14-3e9ba4b63c98-webhook-certs podName:98bbcdbd-382d-48ca-aa14-3e9ba4b63c98 nodeName:}" failed. No retries permitted until 2026-02-20 10:11:20.006282977 +0000 UTC m=+971.588754823 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/98bbcdbd-382d-48ca-aa14-3e9ba4b63c98-webhook-certs") pod "openstack-operator-controller-manager-69ff7bc449-rqmzz" (UID: "98bbcdbd-382d-48ca-aa14-3e9ba4b63c98") : secret "webhook-server-cert" not found Feb 20 10:11:16 crc kubenswrapper[4962]: E0220 10:11:16.166174 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:f0fabdf79095def0f8b1c0442925548a94ca94bed4de2d3b171277129f8079e6\\\"\"" pod="openstack-operators/test-operator-controller-manager-7866795846-lxl4x" podUID="32d42cbd-4ea1-49cc-b9d4-33fe5f655a16" Feb 20 10:11:16 crc kubenswrapper[4962]: E0220 10:11:16.166994 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5mrjv" podUID="5691d6ef-dedb-4a46-a1b6-0435e9f6db0a" Feb 20 10:11:16 crc kubenswrapper[4962]: E0220 10:11:16.167042 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:229fc8c8d94dd4102d2151cd4ec1eaaa09d897c2b396d06e903f61ea29c1fa34\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-ln4sp" podUID="14efe385-5147-49ed-a42f-804b91438a55" Feb 20 10:11:16 crc kubenswrapper[4962]: E0220 10:11:16.167096 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d01ae848290e880c09127d5297418dea40fc7f090fdab9bf2c578c7e7f53aec0\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-kthxs" podUID="4c8bff11-1a85-4f9b-8fb2-defd04ac22d1" Feb 20 10:11:16 crc kubenswrapper[4962]: E0220 10:11:16.167358 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:3d676f1281e24ef07de617570d2f7fbf625032e41866d1551a856c052248bb04\\\"\"" pod="openstack-operators/swift-operator-controller-manager-68f46476f-9pxbg" podUID="4a325f02-ddda-49e9-9ef0-40fd4726b09f" Feb 20 10:11:17 crc kubenswrapper[4962]: 
E0220 10:11:17.171945 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:3d676f1281e24ef07de617570d2f7fbf625032e41866d1551a856c052248bb04\\\"\"" pod="openstack-operators/swift-operator-controller-manager-68f46476f-9pxbg" podUID="4a325f02-ddda-49e9-9ef0-40fd4726b09f" Feb 20 10:11:19 crc kubenswrapper[4962]: I0220 10:11:19.369248 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0c8c62e9-0201-43a4-b823-82af87a0977e-cert\") pod \"infra-operator-controller-manager-79d975b745-4rnhn\" (UID: \"0c8c62e9-0201-43a4-b823-82af87a0977e\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-4rnhn" Feb 20 10:11:19 crc kubenswrapper[4962]: E0220 10:11:19.369455 4962 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 20 10:11:19 crc kubenswrapper[4962]: E0220 10:11:19.369979 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c8c62e9-0201-43a4-b823-82af87a0977e-cert podName:0c8c62e9-0201-43a4-b823-82af87a0977e nodeName:}" failed. No retries permitted until 2026-02-20 10:11:27.369952614 +0000 UTC m=+978.952424470 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0c8c62e9-0201-43a4-b823-82af87a0977e-cert") pod "infra-operator-controller-manager-79d975b745-4rnhn" (UID: "0c8c62e9-0201-43a4-b823-82af87a0977e") : secret "infra-operator-webhook-server-cert" not found Feb 20 10:11:19 crc kubenswrapper[4962]: I0220 10:11:19.674024 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f8de466d-f069-4a8e-8598-72a163525c24-cert\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-68trf\" (UID: \"f8de466d-f069-4a8e-8598-72a163525c24\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-68trf" Feb 20 10:11:19 crc kubenswrapper[4962]: E0220 10:11:19.674300 4962 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 20 10:11:19 crc kubenswrapper[4962]: E0220 10:11:19.674418 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8de466d-f069-4a8e-8598-72a163525c24-cert podName:f8de466d-f069-4a8e-8598-72a163525c24 nodeName:}" failed. No retries permitted until 2026-02-20 10:11:27.674388839 +0000 UTC m=+979.256860685 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f8de466d-f069-4a8e-8598-72a163525c24-cert") pod "openstack-baremetal-operator-controller-manager-fb5fcc5b8-68trf" (UID: "f8de466d-f069-4a8e-8598-72a163525c24") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 20 10:11:20 crc kubenswrapper[4962]: I0220 10:11:20.080706 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/98bbcdbd-382d-48ca-aa14-3e9ba4b63c98-metrics-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-rqmzz\" (UID: \"98bbcdbd-382d-48ca-aa14-3e9ba4b63c98\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-rqmzz" Feb 20 10:11:20 crc kubenswrapper[4962]: I0220 10:11:20.080812 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/98bbcdbd-382d-48ca-aa14-3e9ba4b63c98-webhook-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-rqmzz\" (UID: \"98bbcdbd-382d-48ca-aa14-3e9ba4b63c98\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-rqmzz" Feb 20 10:11:20 crc kubenswrapper[4962]: E0220 10:11:20.081041 4962 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 20 10:11:20 crc kubenswrapper[4962]: E0220 10:11:20.081110 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98bbcdbd-382d-48ca-aa14-3e9ba4b63c98-webhook-certs podName:98bbcdbd-382d-48ca-aa14-3e9ba4b63c98 nodeName:}" failed. No retries permitted until 2026-02-20 10:11:28.081087724 +0000 UTC m=+979.663559570 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/98bbcdbd-382d-48ca-aa14-3e9ba4b63c98-webhook-certs") pod "openstack-operator-controller-manager-69ff7bc449-rqmzz" (UID: "98bbcdbd-382d-48ca-aa14-3e9ba4b63c98") : secret "webhook-server-cert" not found Feb 20 10:11:20 crc kubenswrapper[4962]: E0220 10:11:20.081421 4962 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 20 10:11:20 crc kubenswrapper[4962]: E0220 10:11:20.081547 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98bbcdbd-382d-48ca-aa14-3e9ba4b63c98-metrics-certs podName:98bbcdbd-382d-48ca-aa14-3e9ba4b63c98 nodeName:}" failed. No retries permitted until 2026-02-20 10:11:28.081513996 +0000 UTC m=+979.663985842 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/98bbcdbd-382d-48ca-aa14-3e9ba4b63c98-metrics-certs") pod "openstack-operator-controller-manager-69ff7bc449-rqmzz" (UID: "98bbcdbd-382d-48ca-aa14-3e9ba4b63c98") : secret "metrics-server-cert" not found Feb 20 10:11:26 crc kubenswrapper[4962]: E0220 10:11:26.603354 4962 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838" Feb 20 10:11:26 crc kubenswrapper[4962]: E0220 10:11:26.604318 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-c5t22,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-567668f5cf-d2clq_openstack-operators(4e2614ed-ea7a-430e-af7b-4d66f05f7b96): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 20 10:11:26 crc kubenswrapper[4962]: E0220 10:11:26.605654 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-d2clq" 
podUID="4e2614ed-ea7a-430e-af7b-4d66f05f7b96" Feb 20 10:11:27 crc kubenswrapper[4962]: I0220 10:11:27.249203 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-nhpg5" event={"ID":"e0560856-ed00-4ea8-8ce7-a801f1d46489","Type":"ContainerStarted","Data":"9587b4cc915bd223ced4ad688e359bc596394dd68782d13782774cb8edc40f9d"} Feb 20 10:11:27 crc kubenswrapper[4962]: I0220 10:11:27.250080 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-nhpg5" Feb 20 10:11:27 crc kubenswrapper[4962]: I0220 10:11:27.252241 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-bsq9n" event={"ID":"ac33f7ed-c3f8-487d-89dc-4a614d357b86","Type":"ContainerStarted","Data":"ab8ac08a352ba7c52306132a961dc005f5d97d4b5ad01d781149f0cc6c03268f"} Feb 20 10:11:27 crc kubenswrapper[4962]: I0220 10:11:27.253095 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-bsq9n" Feb 20 10:11:27 crc kubenswrapper[4962]: I0220 10:11:27.255358 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-nlq5k" event={"ID":"72728d52-a8e9-4689-8da0-871f250f7664","Type":"ContainerStarted","Data":"e12e05b18241c328b34665be7687a1cff988c8b8df28f1f75e8e73dc0a32bda6"} Feb 20 10:11:27 crc kubenswrapper[4962]: I0220 10:11:27.255872 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-nlq5k" Feb 20 10:11:27 crc kubenswrapper[4962]: I0220 10:11:27.259444 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-knwp9" event={"ID":"6fdeab3e-de35-4d69-9e67-e5d8257bc25d","Type":"ContainerStarted","Data":"d0379e1295a0d61c9b37a457180f1a8d19d6c9ccca5b1538c1481616858f6822"} Feb 20 10:11:27 crc kubenswrapper[4962]: I0220 10:11:27.259960 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-knwp9" Feb 20 10:11:27 crc kubenswrapper[4962]: I0220 10:11:27.261577 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-r2t72" event={"ID":"cf0e10ba-c175-44c3-9011-6646f21ba334","Type":"ContainerStarted","Data":"df4608f5e8ec2e972b257e396da911f7ff0e311d7a83103d4787622777e147ef"} Feb 20 10:11:27 crc kubenswrapper[4962]: I0220 10:11:27.262056 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-r2t72" Feb 20 10:11:27 crc kubenswrapper[4962]: I0220 10:11:27.263952 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-wn92v" event={"ID":"f8f1dca9-8b83-469d-b834-3f11376576c9","Type":"ContainerStarted","Data":"de8d7708d5d0fa3791692499a6e7bd93588a0fd28671b624f22bb642a0161890"} Feb 20 10:11:27 crc kubenswrapper[4962]: I0220 10:11:27.264406 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-wn92v" Feb 20 10:11:27 crc kubenswrapper[4962]: I0220 10:11:27.266411 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/placement-operator-controller-manager-8497b45c89-x4gh4" event={"ID":"34cb38e0-7c0a-4f00-89e9-9be7b394585d","Type":"ContainerStarted","Data":"0b6b01c300734b885eb40f42c6585da05685005a9c4200b81f1a5fa1b8247e48"} Feb 20 10:11:27 crc kubenswrapper[4962]: I0220 10:11:27.267130 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-x4gh4" Feb 20 10:11:27 crc kubenswrapper[4962]: I0220 10:11:27.268873 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-rhhc7" event={"ID":"12f33757-f329-47a6-9273-bdeb1558a4d7","Type":"ContainerStarted","Data":"05c962eff274c8bae15b00fd2f211bcb968c23ce0185a4df9d19741cd32e5916"} Feb 20 10:11:27 crc kubenswrapper[4962]: I0220 10:11:27.269289 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-rhhc7" Feb 20 10:11:27 crc kubenswrapper[4962]: I0220 10:11:27.270677 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-mfpm9" event={"ID":"7d077bc6-8a1e-426a-9b2d-8e6b2a5eb084","Type":"ContainerStarted","Data":"3d08eff0b78e216b6186846e28e49908982695e75945070513ae6a45211b316f"} Feb 20 10:11:27 crc kubenswrapper[4962]: I0220 10:11:27.271086 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-mfpm9" Feb 20 10:11:27 crc kubenswrapper[4962]: I0220 10:11:27.272400 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-jjbwt" event={"ID":"7afb870a-75a4-42d5-9704-5cef14dd3ce9","Type":"ContainerStarted","Data":"715bd46432611a2a2686bdeb2c0fe1cbfa0ddd178c6a1f4c23686ad07b8ab806"} Feb 20 10:11:27 crc kubenswrapper[4962]: I0220 10:11:27.272844 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-jjbwt" Feb 20 10:11:27 crc kubenswrapper[4962]: I0220 10:11:27.274259 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-6lvhz" event={"ID":"a9979be5-6650-425b-a748-51e2cb552413","Type":"ContainerStarted","Data":"93ab62be7513adf352b5e3d097b0ce58852ed7f79e8e5275cc808a99e66e3142"} Feb 20 10:11:27 crc kubenswrapper[4962]: I0220 10:11:27.274771 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-6lvhz" Feb 20 10:11:27 crc kubenswrapper[4962]: I0220 10:11:27.276277 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-2hg4n" event={"ID":"5fec06f1-8ccf-403c-88de-2b581f056802","Type":"ContainerStarted","Data":"00f242ee1435c337aa5cefd9c22a505457bb5e46910e32802ecbd403a383ad94"} Feb 20 10:11:27 crc kubenswrapper[4962]: I0220 10:11:27.276741 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-2hg4n" Feb 20 10:11:27 crc kubenswrapper[4962]: I0220 10:11:27.278219 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987464f4-wcqzf" 
event={"ID":"ea986843-26e4-4410-a65e-ae51c02dc04c","Type":"ContainerStarted","Data":"6f73ba4a8bd7cf78757487549e7c5547b7b7ba69a7c509f3697a9f9423047d0e"} Feb 20 10:11:27 crc kubenswrapper[4962]: I0220 10:11:27.278681 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-77987464f4-wcqzf" Feb 20 10:11:27 crc kubenswrapper[4962]: I0220 10:11:27.280369 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-75vx4" event={"ID":"fee6970c-0ad7-46ea-ab75-dcb7d552ffbb","Type":"ContainerStarted","Data":"701ff4fddbcab9a3e85b0ffcc55f73ae223df49b5e1484e6866fba8f1a7a60d2"} Feb 20 10:11:27 crc kubenswrapper[4962]: I0220 10:11:27.280455 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-75vx4" Feb 20 10:11:27 crc kubenswrapper[4962]: E0220 10:11:27.282122 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838\\\"\"" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-d2clq" podUID="4e2614ed-ea7a-430e-af7b-4d66f05f7b96" Feb 20 10:11:27 crc kubenswrapper[4962]: I0220 10:11:27.301841 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-nhpg5" podStartSLOduration=2.353980298 podStartE2EDuration="16.301818705s" podCreationTimestamp="2026-02-20 10:11:11 +0000 UTC" firstStartedPulling="2026-02-20 10:11:12.078943103 +0000 UTC m=+963.661414949" lastFinishedPulling="2026-02-20 10:11:26.02678151 +0000 UTC m=+977.609253356" observedRunningTime="2026-02-20 10:11:27.294835578 +0000 UTC m=+978.877307424" watchObservedRunningTime="2026-02-20 10:11:27.301818705 +0000 UTC m=+978.884290551" Feb 20 10:11:27 crc kubenswrapper[4962]: I0220 10:11:27.359301 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-nlq5k" podStartSLOduration=3.823041464 podStartE2EDuration="16.359281055s" podCreationTimestamp="2026-02-20 10:11:11 +0000 UTC" firstStartedPulling="2026-02-20 10:11:14.060870589 +0000 UTC m=+965.643342435" lastFinishedPulling="2026-02-20 10:11:26.59711018 +0000 UTC m=+978.179582026" observedRunningTime="2026-02-20 10:11:27.325326297 +0000 UTC m=+978.907798143" watchObservedRunningTime="2026-02-20 10:11:27.359281055 +0000 UTC m=+978.941752901" Feb 20 10:11:27 crc kubenswrapper[4962]: I0220 10:11:27.417121 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0c8c62e9-0201-43a4-b823-82af87a0977e-cert\") pod \"infra-operator-controller-manager-79d975b745-4rnhn\" (UID: \"0c8c62e9-0201-43a4-b823-82af87a0977e\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-4rnhn" Feb 20 10:11:27 crc kubenswrapper[4962]: E0220 10:11:27.418796 4962 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 20 10:11:27 crc kubenswrapper[4962]: E0220 10:11:27.418859 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c8c62e9-0201-43a4-b823-82af87a0977e-cert 
podName:0c8c62e9-0201-43a4-b823-82af87a0977e nodeName:}" failed. No retries permitted until 2026-02-20 10:11:43.41884288 +0000 UTC m=+995.001314726 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0c8c62e9-0201-43a4-b823-82af87a0977e-cert") pod "infra-operator-controller-manager-79d975b745-4rnhn" (UID: "0c8c62e9-0201-43a4-b823-82af87a0977e") : secret "infra-operator-webhook-server-cert" not found Feb 20 10:11:27 crc kubenswrapper[4962]: I0220 10:11:27.433163 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-75vx4" podStartSLOduration=3.323452742 podStartE2EDuration="16.433145704s" podCreationTimestamp="2026-02-20 10:11:11 +0000 UTC" firstStartedPulling="2026-02-20 10:11:13.457512997 +0000 UTC m=+965.039984843" lastFinishedPulling="2026-02-20 10:11:26.567205959 +0000 UTC m=+978.149677805" observedRunningTime="2026-02-20 10:11:27.429801641 +0000 UTC m=+979.012273487" watchObservedRunningTime="2026-02-20 10:11:27.433145704 +0000 UTC m=+979.015617550" Feb 20 10:11:27 crc kubenswrapper[4962]: I0220 10:11:27.433293 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-77987464f4-wcqzf" podStartSLOduration=3.502887066 podStartE2EDuration="16.433288749s" podCreationTimestamp="2026-02-20 10:11:11 +0000 UTC" firstStartedPulling="2026-02-20 10:11:13.096229422 +0000 UTC m=+964.678701268" lastFinishedPulling="2026-02-20 10:11:26.026631105 +0000 UTC m=+977.609102951" observedRunningTime="2026-02-20 10:11:27.374421706 +0000 UTC m=+978.956893562" watchObservedRunningTime="2026-02-20 10:11:27.433288749 +0000 UTC m=+979.015760595" Feb 20 10:11:27 crc kubenswrapper[4962]: I0220 10:11:27.571903 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-r2t72" podStartSLOduration=3.422013097 podStartE2EDuration="16.571887165s" podCreationTimestamp="2026-02-20 10:11:11 +0000 UTC" firstStartedPulling="2026-02-20 10:11:13.465099419 +0000 UTC m=+965.047571265" lastFinishedPulling="2026-02-20 10:11:26.614973497 +0000 UTC m=+978.197445333" observedRunningTime="2026-02-20 10:11:27.5707768 +0000 UTC m=+979.153248646" watchObservedRunningTime="2026-02-20 10:11:27.571887165 +0000 UTC m=+979.154359011" Feb 20 10:11:27 crc kubenswrapper[4962]: I0220 10:11:27.574896 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-mfpm9" podStartSLOduration=3.997273064 podStartE2EDuration="16.574888269s" podCreationTimestamp="2026-02-20 10:11:11 +0000 UTC" firstStartedPulling="2026-02-20 10:11:14.051565864 +0000 UTC m=+965.634037710" lastFinishedPulling="2026-02-20 10:11:26.629181059 +0000 UTC m=+978.211652915" observedRunningTime="2026-02-20 10:11:27.505630322 +0000 UTC m=+979.088102168" watchObservedRunningTime="2026-02-20 10:11:27.574888269 +0000 UTC m=+979.157360115" Feb 20 10:11:27 crc kubenswrapper[4962]: I0220 10:11:27.613384 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-2hg4n" podStartSLOduration=4.057935608 podStartE2EDuration="16.613366507s" podCreationTimestamp="2026-02-20 10:11:11 +0000 UTC" firstStartedPulling="2026-02-20 10:11:13.471184665 +0000 UTC m=+965.053656511" lastFinishedPulling="2026-02-20 10:11:26.026615574 
+0000 UTC m=+977.609087410" observedRunningTime="2026-02-20 10:11:27.609434364 +0000 UTC m=+979.191906210" watchObservedRunningTime="2026-02-20 10:11:27.613366507 +0000 UTC m=+979.195838353" Feb 20 10:11:27 crc kubenswrapper[4962]: I0220 10:11:27.649933 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-wn92v" podStartSLOduration=3.892971619 podStartE2EDuration="16.649916335s" podCreationTimestamp="2026-02-20 10:11:11 +0000 UTC" firstStartedPulling="2026-02-20 10:11:13.861784407 +0000 UTC m=+965.444256253" lastFinishedPulling="2026-02-20 10:11:26.618729123 +0000 UTC m=+978.201200969" observedRunningTime="2026-02-20 10:11:27.647957864 +0000 UTC m=+979.230429710" watchObservedRunningTime="2026-02-20 10:11:27.649916335 +0000 UTC m=+979.232388181" Feb 20 10:11:27 crc kubenswrapper[4962]: I0220 10:11:27.685350 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-x4gh4" podStartSLOduration=3.807338585 podStartE2EDuration="16.685332528s" podCreationTimestamp="2026-02-20 10:11:11 +0000 UTC" firstStartedPulling="2026-02-20 10:11:13.689682331 +0000 UTC m=+965.272154177" lastFinishedPulling="2026-02-20 10:11:26.567676274 +0000 UTC m=+978.150148120" observedRunningTime="2026-02-20 10:11:27.673532761 +0000 UTC m=+979.256004607" watchObservedRunningTime="2026-02-20 10:11:27.685332528 +0000 UTC m=+979.267804374" Feb 20 10:11:27 crc kubenswrapper[4962]: I0220 10:11:27.720369 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f8de466d-f069-4a8e-8598-72a163525c24-cert\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-68trf\" (UID: \"f8de466d-f069-4a8e-8598-72a163525c24\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-68trf" Feb 20 10:11:27 crc kubenswrapper[4962]: E0220 10:11:27.720853 4962 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 20 10:11:27 crc kubenswrapper[4962]: E0220 10:11:27.720899 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8de466d-f069-4a8e-8598-72a163525c24-cert podName:f8de466d-f069-4a8e-8598-72a163525c24 nodeName:}" failed. No retries permitted until 2026-02-20 10:11:43.720886706 +0000 UTC m=+995.303358552 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f8de466d-f069-4a8e-8598-72a163525c24-cert") pod "openstack-baremetal-operator-controller-manager-fb5fcc5b8-68trf" (UID: "f8de466d-f069-4a8e-8598-72a163525c24") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 20 10:11:27 crc kubenswrapper[4962]: I0220 10:11:27.772317 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-jjbwt" podStartSLOduration=4.0127478 podStartE2EDuration="16.772299066s" podCreationTimestamp="2026-02-20 10:11:11 +0000 UTC" firstStartedPulling="2026-02-20 10:11:13.859118935 +0000 UTC m=+965.441590781" lastFinishedPulling="2026-02-20 10:11:26.618670201 +0000 UTC m=+978.201142047" observedRunningTime="2026-02-20 10:11:27.731920849 +0000 UTC m=+979.314392695" watchObservedRunningTime="2026-02-20 10:11:27.772299066 +0000 UTC m=+979.354770912" Feb 20 10:11:27 crc kubenswrapper[4962]: I0220 10:11:27.774040 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-rhhc7" podStartSLOduration=4.227947407 podStartE2EDuration="16.774032481s" podCreationTimestamp="2026-02-20 10:11:11 +0000 UTC" firstStartedPulling="2026-02-20 10:11:13.481119169 +0000 UTC m=+965.063591015" lastFinishedPulling="2026-02-20 10:11:26.027204243 +0000 UTC m=+977.609676089" observedRunningTime="2026-02-20 10:11:27.769166798 +0000 UTC m=+979.351638644" watchObservedRunningTime="2026-02-20 10:11:27.774032481 +0000 UTC m=+979.356504327" Feb 20 10:11:27 crc kubenswrapper[4962]: I0220 10:11:27.853116 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-6lvhz" podStartSLOduration=3.932087269 podStartE2EDuration="16.853099392s" podCreationTimestamp="2026-02-20 10:11:11 +0000 UTC" firstStartedPulling="2026-02-20 10:11:13.671469183 +0000 UTC m=+965.253941029" lastFinishedPulling="2026-02-20 10:11:26.592481306 +0000 UTC m=+978.174953152" observedRunningTime="2026-02-20 10:11:27.852326639 +0000 UTC m=+979.434798485" watchObservedRunningTime="2026-02-20 10:11:27.853099392 +0000 UTC m=+979.435571238" Feb 20 10:11:27 crc kubenswrapper[4962]: I0220 10:11:27.909850 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-bsq9n" podStartSLOduration=3.776749261 podStartE2EDuration="16.909834029s" podCreationTimestamp="2026-02-20 10:11:11 +0000 UTC" firstStartedPulling="2026-02-20 10:11:13.434258375 +0000 UTC m=+965.016730221" lastFinishedPulling="2026-02-20 10:11:26.567343133 +0000 UTC m=+978.149814989" observedRunningTime="2026-02-20 10:11:27.891483418 +0000 UTC m=+979.473955264" watchObservedRunningTime="2026-02-20 10:11:27.909834029 +0000 UTC m=+979.492305875" Feb 20 10:11:27 crc kubenswrapper[4962]: I0220 10:11:27.912039 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-knwp9" podStartSLOduration=4.188378109 podStartE2EDuration="16.912032567s" podCreationTimestamp="2026-02-20 10:11:11 +0000 UTC" firstStartedPulling="2026-02-20 10:11:13.859074614 +0000 UTC m=+965.441546460" lastFinishedPulling="2026-02-20 10:11:26.582729082 +0000 UTC m=+978.165200918" observedRunningTime="2026-02-20 10:11:27.908778317 +0000 UTC m=+979.491250163" watchObservedRunningTime="2026-02-20 
10:11:27.912032567 +0000 UTC m=+979.494504413" Feb 20 10:11:28 crc kubenswrapper[4962]: I0220 10:11:28.131612 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/98bbcdbd-382d-48ca-aa14-3e9ba4b63c98-webhook-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-rqmzz\" (UID: \"98bbcdbd-382d-48ca-aa14-3e9ba4b63c98\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-rqmzz" Feb 20 10:11:28 crc kubenswrapper[4962]: I0220 10:11:28.131734 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/98bbcdbd-382d-48ca-aa14-3e9ba4b63c98-metrics-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-rqmzz\" (UID: \"98bbcdbd-382d-48ca-aa14-3e9ba4b63c98\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-rqmzz" Feb 20 10:11:28 crc kubenswrapper[4962]: E0220 10:11:28.131839 4962 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 20 10:11:28 crc kubenswrapper[4962]: E0220 10:11:28.131875 4962 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 20 10:11:28 crc kubenswrapper[4962]: E0220 10:11:28.131922 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98bbcdbd-382d-48ca-aa14-3e9ba4b63c98-metrics-certs podName:98bbcdbd-382d-48ca-aa14-3e9ba4b63c98 nodeName:}" failed. No retries permitted until 2026-02-20 10:11:44.131909685 +0000 UTC m=+995.714381531 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/98bbcdbd-382d-48ca-aa14-3e9ba4b63c98-metrics-certs") pod "openstack-operator-controller-manager-69ff7bc449-rqmzz" (UID: "98bbcdbd-382d-48ca-aa14-3e9ba4b63c98") : secret "metrics-server-cert" not found Feb 20 10:11:28 crc kubenswrapper[4962]: E0220 10:11:28.131939 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98bbcdbd-382d-48ca-aa14-3e9ba4b63c98-webhook-certs podName:98bbcdbd-382d-48ca-aa14-3e9ba4b63c98 nodeName:}" failed. No retries permitted until 2026-02-20 10:11:44.131931126 +0000 UTC m=+995.714402962 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/98bbcdbd-382d-48ca-aa14-3e9ba4b63c98-webhook-certs") pod "openstack-operator-controller-manager-69ff7bc449-rqmzz" (UID: "98bbcdbd-382d-48ca-aa14-3e9ba4b63c98") : secret "webhook-server-cert" not found Feb 20 10:11:31 crc kubenswrapper[4962]: I0220 10:11:31.321435 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-kthxs" event={"ID":"4c8bff11-1a85-4f9b-8fb2-defd04ac22d1","Type":"ContainerStarted","Data":"323f98ce1601de8894336c7bd61186e8d026d66d2bcdc26540de7ad29dbebbf7"} Feb 20 10:11:31 crc kubenswrapper[4962]: I0220 10:11:31.329739 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-kthxs" Feb 20 10:11:31 crc kubenswrapper[4962]: I0220 10:11:31.351354 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-kthxs" podStartSLOduration=4.242454111 podStartE2EDuration="20.35132573s" podCreationTimestamp="2026-02-20 10:11:11 +0000 UTC" firstStartedPulling="2026-02-20 10:11:14.085383539 +0000 UTC m=+965.667855385" lastFinishedPulling="2026-02-20 10:11:30.194255168 +0000 UTC m=+981.776727004" observedRunningTime="2026-02-20 10:11:31.348571304 +0000 UTC m=+982.931043150" watchObservedRunningTime="2026-02-20 10:11:31.35132573 +0000 UTC m=+982.933797586" Feb 20 10:11:31 crc kubenswrapper[4962]: I0220 10:11:31.735292 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-77987464f4-wcqzf" Feb 20 10:11:31 crc kubenswrapper[4962]: I0220 10:11:31.764181 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-75vx4" Feb 20 10:11:32 crc kubenswrapper[4962]: I0220 10:11:32.077730 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-wn92v" Feb 20 10:11:32 crc kubenswrapper[4962]: I0220 10:11:32.077820 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-6lvhz" Feb 20 10:11:32 crc kubenswrapper[4962]: I0220 10:11:32.143198 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-knwp9" Feb 20 10:11:32 crc kubenswrapper[4962]: I0220 10:11:32.200485 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-nlq5k" Feb 20 10:11:32 crc kubenswrapper[4962]: I0220 10:11:32.324548 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-x4gh4" Feb 20 10:11:32 crc kubenswrapper[4962]: I0220 10:11:32.346836 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-mfpm9" Feb 20 10:11:37 crc kubenswrapper[4962]: I0220 10:11:37.398043 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-7866795846-lxl4x" 
event={"ID":"32d42cbd-4ea1-49cc-b9d4-33fe5f655a16","Type":"ContainerStarted","Data":"06733ac5121ca8b1af3011160578bd66b9ee3b24ddfcd017df1578d5ef88dfb5"} Feb 20 10:11:37 crc kubenswrapper[4962]: I0220 10:11:37.399397 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-7866795846-lxl4x" Feb 20 10:11:37 crc kubenswrapper[4962]: I0220 10:11:37.400223 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-ln4sp" event={"ID":"14efe385-5147-49ed-a42f-804b91438a55","Type":"ContainerStarted","Data":"7265950e2e793defd3421e8394e26811a3289118f85d0010c7dbb0bf0eb1e2c9"} Feb 20 10:11:37 crc kubenswrapper[4962]: I0220 10:11:37.400482 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-ln4sp" Feb 20 10:11:37 crc kubenswrapper[4962]: I0220 10:11:37.402287 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5mrjv" event={"ID":"5691d6ef-dedb-4a46-a1b6-0435e9f6db0a","Type":"ContainerStarted","Data":"8081ce6b5031a3b98b6c204d892981565ae720e05862b06dca6f686aa3e02ddb"} Feb 20 10:11:37 crc kubenswrapper[4962]: I0220 10:11:37.404474 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68f46476f-9pxbg" event={"ID":"4a325f02-ddda-49e9-9ef0-40fd4726b09f","Type":"ContainerStarted","Data":"193e901d7eb4559e021415a15936acae9c3fff27b7cd8d4e6cce3b63f9a46a95"} Feb 20 10:11:37 crc kubenswrapper[4962]: I0220 10:11:37.404759 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-68f46476f-9pxbg" Feb 20 10:11:37 crc kubenswrapper[4962]: I0220 10:11:37.425288 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-7866795846-lxl4x" podStartSLOduration=3.851730459 podStartE2EDuration="26.425268137s" podCreationTimestamp="2026-02-20 10:11:11 +0000 UTC" firstStartedPulling="2026-02-20 10:11:14.075520577 +0000 UTC m=+965.657992423" lastFinishedPulling="2026-02-20 10:11:36.649058235 +0000 UTC m=+988.231530101" observedRunningTime="2026-02-20 10:11:37.419449516 +0000 UTC m=+989.001921362" watchObservedRunningTime="2026-02-20 10:11:37.425268137 +0000 UTC m=+989.007739983" Feb 20 10:11:37 crc kubenswrapper[4962]: I0220 10:11:37.438946 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-ln4sp" podStartSLOduration=3.834117148 podStartE2EDuration="26.438925972s" podCreationTimestamp="2026-02-20 10:11:11 +0000 UTC" firstStartedPulling="2026-02-20 10:11:14.06219551 +0000 UTC m=+965.644667356" lastFinishedPulling="2026-02-20 10:11:36.667004334 +0000 UTC m=+988.249476180" observedRunningTime="2026-02-20 10:11:37.437642403 +0000 UTC m=+989.020114249" watchObservedRunningTime="2026-02-20 10:11:37.438925972 +0000 UTC m=+989.021397818" Feb 20 10:11:37 crc kubenswrapper[4962]: I0220 10:11:37.468078 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5mrjv" podStartSLOduration=2.917444821 podStartE2EDuration="25.468049719s" podCreationTimestamp="2026-02-20 10:11:12 +0000 UTC" firstStartedPulling="2026-02-20 10:11:14.065904643 +0000 UTC m=+965.648376479" 
lastFinishedPulling="2026-02-20 10:11:36.616509531 +0000 UTC m=+988.198981377" observedRunningTime="2026-02-20 10:11:37.459502263 +0000 UTC m=+989.041974129" watchObservedRunningTime="2026-02-20 10:11:37.468049719 +0000 UTC m=+989.050521565" Feb 20 10:11:38 crc kubenswrapper[4962]: I0220 10:11:38.175250 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-68f46476f-9pxbg" podStartSLOduration=4.625262262 podStartE2EDuration="27.175210371s" podCreationTimestamp="2026-02-20 10:11:11 +0000 UTC" firstStartedPulling="2026-02-20 10:11:14.065536581 +0000 UTC m=+965.648008417" lastFinishedPulling="2026-02-20 10:11:36.61548464 +0000 UTC m=+988.197956526" observedRunningTime="2026-02-20 10:11:37.485335598 +0000 UTC m=+989.067807464" watchObservedRunningTime="2026-02-20 10:11:38.175210371 +0000 UTC m=+989.757682257" Feb 20 10:11:39 crc kubenswrapper[4962]: I0220 10:11:39.426023 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-d2clq" event={"ID":"4e2614ed-ea7a-430e-af7b-4d66f05f7b96","Type":"ContainerStarted","Data":"06a7b62b8e10b7a63bc55092190701acc90fd2479affaadec1a66e80ed258450"} Feb 20 10:11:39 crc kubenswrapper[4962]: I0220 10:11:39.426605 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-d2clq" Feb 20 10:11:39 crc kubenswrapper[4962]: I0220 10:11:39.459054 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-d2clq" podStartSLOduration=3.352429047 podStartE2EDuration="28.459038021s" podCreationTimestamp="2026-02-20 10:11:11 +0000 UTC" firstStartedPulling="2026-02-20 10:11:13.474681972 +0000 UTC m=+965.057153818" lastFinishedPulling="2026-02-20 10:11:38.581290936 +0000 UTC m=+990.163762792" observedRunningTime="2026-02-20 10:11:39.455981245 +0000 UTC m=+991.038453091" watchObservedRunningTime="2026-02-20 10:11:39.459038021 +0000 UTC m=+991.041509867" Feb 20 10:11:41 crc kubenswrapper[4962]: I0220 10:11:41.640040 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-nhpg5" Feb 20 10:11:41 crc kubenswrapper[4962]: I0220 10:11:41.713279 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-bsq9n" Feb 20 10:11:41 crc kubenswrapper[4962]: I0220 10:11:41.715673 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-r2t72" Feb 20 10:11:41 crc kubenswrapper[4962]: I0220 10:11:41.850018 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-rhhc7" Feb 20 10:11:41 crc kubenswrapper[4962]: I0220 10:11:41.888515 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-2hg4n" Feb 20 10:11:41 crc kubenswrapper[4962]: I0220 10:11:41.982052 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-jjbwt" Feb 20 10:11:42 crc kubenswrapper[4962]: I0220 10:11:42.149243 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/octavia-operator-controller-manager-69f8888797-ln4sp" Feb 20 10:11:42 crc kubenswrapper[4962]: I0220 10:11:42.323019 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-68f46476f-9pxbg" Feb 20 10:11:42 crc kubenswrapper[4962]: I0220 10:11:42.383221 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-kthxs" Feb 20 10:11:42 crc kubenswrapper[4962]: I0220 10:11:42.410142 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-7866795846-lxl4x" Feb 20 10:11:43 crc kubenswrapper[4962]: I0220 10:11:43.514443 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0c8c62e9-0201-43a4-b823-82af87a0977e-cert\") pod \"infra-operator-controller-manager-79d975b745-4rnhn\" (UID: \"0c8c62e9-0201-43a4-b823-82af87a0977e\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-4rnhn" Feb 20 10:11:43 crc kubenswrapper[4962]: I0220 10:11:43.522698 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0c8c62e9-0201-43a4-b823-82af87a0977e-cert\") pod \"infra-operator-controller-manager-79d975b745-4rnhn\" (UID: \"0c8c62e9-0201-43a4-b823-82af87a0977e\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-4rnhn" Feb 20 10:11:43 crc kubenswrapper[4962]: I0220 10:11:43.660695 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79d975b745-4rnhn" Feb 20 10:11:43 crc kubenswrapper[4962]: I0220 10:11:43.827290 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f8de466d-f069-4a8e-8598-72a163525c24-cert\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-68trf\" (UID: \"f8de466d-f069-4a8e-8598-72a163525c24\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-68trf" Feb 20 10:11:43 crc kubenswrapper[4962]: I0220 10:11:43.836083 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f8de466d-f069-4a8e-8598-72a163525c24-cert\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-68trf\" (UID: \"f8de466d-f069-4a8e-8598-72a163525c24\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-68trf" Feb 20 10:11:43 crc kubenswrapper[4962]: I0220 10:11:43.929869 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-4rnhn"] Feb 20 10:11:43 crc kubenswrapper[4962]: W0220 10:11:43.935347 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c8c62e9_0201_43a4_b823_82af87a0977e.slice/crio-d6cae2c92f4d956792e194551d455fbc10f25f338829b86088953eab0f7871e9 WatchSource:0}: Error finding container d6cae2c92f4d956792e194551d455fbc10f25f338829b86088953eab0f7871e9: Status 404 returned error can't find the container with id d6cae2c92f4d956792e194551d455fbc10f25f338829b86088953eab0f7871e9 Feb 20 10:11:44 crc kubenswrapper[4962]: I0220 10:11:44.030923 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-68trf" Feb 20 10:11:44 crc kubenswrapper[4962]: I0220 10:11:44.133982 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/98bbcdbd-382d-48ca-aa14-3e9ba4b63c98-metrics-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-rqmzz\" (UID: \"98bbcdbd-382d-48ca-aa14-3e9ba4b63c98\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-rqmzz" Feb 20 10:11:44 crc kubenswrapper[4962]: I0220 10:11:44.136992 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/98bbcdbd-382d-48ca-aa14-3e9ba4b63c98-webhook-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-rqmzz\" (UID: \"98bbcdbd-382d-48ca-aa14-3e9ba4b63c98\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-rqmzz" Feb 20 10:11:44 crc kubenswrapper[4962]: I0220 10:11:44.143066 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/98bbcdbd-382d-48ca-aa14-3e9ba4b63c98-metrics-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-rqmzz\" (UID: \"98bbcdbd-382d-48ca-aa14-3e9ba4b63c98\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-rqmzz" Feb 20 10:11:44 crc kubenswrapper[4962]: I0220 10:11:44.144148 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/98bbcdbd-382d-48ca-aa14-3e9ba4b63c98-webhook-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-rqmzz\" (UID: \"98bbcdbd-382d-48ca-aa14-3e9ba4b63c98\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-rqmzz" Feb 20 10:11:44 crc kubenswrapper[4962]: I0220 10:11:44.174968 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-rqmzz" Feb 20 10:11:44 crc kubenswrapper[4962]: I0220 10:11:44.358901 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-68trf"] Feb 20 10:11:44 crc kubenswrapper[4962]: I0220 10:11:44.462042 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79d975b745-4rnhn" event={"ID":"0c8c62e9-0201-43a4-b823-82af87a0977e","Type":"ContainerStarted","Data":"d6cae2c92f4d956792e194551d455fbc10f25f338829b86088953eab0f7871e9"} Feb 20 10:11:44 crc kubenswrapper[4962]: I0220 10:11:44.463909 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-68trf" event={"ID":"f8de466d-f069-4a8e-8598-72a163525c24","Type":"ContainerStarted","Data":"4dd45bf4f16770003ef342051144f96b6bd4f911216d98b705f544aee98ffb8a"} Feb 20 10:11:44 crc kubenswrapper[4962]: I0220 10:11:44.686053 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-69ff7bc449-rqmzz"] Feb 20 10:11:45 crc kubenswrapper[4962]: I0220 10:11:45.474833 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-rqmzz" event={"ID":"98bbcdbd-382d-48ca-aa14-3e9ba4b63c98","Type":"ContainerStarted","Data":"7de56f10d1f3bd38a08860549618e484c42477cb0396189ac3b1bcc230588146"} Feb 20 10:11:49 crc kubenswrapper[4962]: I0220 10:11:49.517276 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-rqmzz" event={"ID":"98bbcdbd-382d-48ca-aa14-3e9ba4b63c98","Type":"ContainerStarted","Data":"92f02ef4ba51657d3253e259e7fc434f227c78da0dc3958b52516ed83a52be41"} Feb 20 10:11:49 crc kubenswrapper[4962]: I0220 10:11:49.518264 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-rqmzz" Feb 20 10:11:49 crc kubenswrapper[4962]: I0220 10:11:49.582140 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-rqmzz" podStartSLOduration=37.582117032 podStartE2EDuration="37.582117032s" podCreationTimestamp="2026-02-20 10:11:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:11:49.569368174 +0000 UTC m=+1001.151840060" watchObservedRunningTime="2026-02-20 10:11:49.582117032 +0000 UTC m=+1001.164588878" Feb 20 10:11:52 crc kubenswrapper[4962]: I0220 10:11:52.170841 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-d2clq" Feb 20 10:11:52 crc kubenswrapper[4962]: I0220 10:11:52.546274 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-68trf" event={"ID":"f8de466d-f069-4a8e-8598-72a163525c24","Type":"ContainerStarted","Data":"d63620977efe9a7117d0118bf8371fbbec4315da5ba0d7b8d1c93e5502dc7733"} Feb 20 10:11:52 crc kubenswrapper[4962]: I0220 10:11:52.546394 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-68trf" Feb 20 
10:11:52 crc kubenswrapper[4962]: I0220 10:11:52.547939 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79d975b745-4rnhn" event={"ID":"0c8c62e9-0201-43a4-b823-82af87a0977e","Type":"ContainerStarted","Data":"652ad8c6326d73ac5d1f5c1080a1e90ea6fa94709991fe368150108388a3e850"} Feb 20 10:11:52 crc kubenswrapper[4962]: I0220 10:11:52.548089 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-79d975b745-4rnhn" Feb 20 10:11:52 crc kubenswrapper[4962]: I0220 10:11:52.608928 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-68trf" podStartSLOduration=34.436460643 podStartE2EDuration="41.608901999s" podCreationTimestamp="2026-02-20 10:11:11 +0000 UTC" firstStartedPulling="2026-02-20 10:11:44.376708001 +0000 UTC m=+995.959179847" lastFinishedPulling="2026-02-20 10:11:51.549149347 +0000 UTC m=+1003.131621203" observedRunningTime="2026-02-20 10:11:52.583876489 +0000 UTC m=+1004.166348345" watchObservedRunningTime="2026-02-20 10:11:52.608901999 +0000 UTC m=+1004.191373845" Feb 20 10:11:54 crc kubenswrapper[4962]: I0220 10:11:54.184801 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-rqmzz" Feb 20 10:11:54 crc kubenswrapper[4962]: I0220 10:11:54.229113 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-79d975b745-4rnhn" podStartSLOduration=35.645013658 podStartE2EDuration="43.229095772s" podCreationTimestamp="2026-02-20 10:11:11 +0000 UTC" firstStartedPulling="2026-02-20 10:11:43.93833514 +0000 UTC m=+995.520806976" lastFinishedPulling="2026-02-20 10:11:51.522417234 +0000 UTC m=+1003.104889090" observedRunningTime="2026-02-20 10:11:52.605064469 +0000 UTC m=+1004.187536315" watchObservedRunningTime="2026-02-20 10:11:54.229095772 +0000 UTC m=+1005.811567618" Feb 20 10:12:03 crc kubenswrapper[4962]: I0220 10:12:03.668519 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-79d975b745-4rnhn" Feb 20 10:12:04 crc kubenswrapper[4962]: I0220 10:12:04.041068 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-68trf" Feb 20 10:12:20 crc kubenswrapper[4962]: I0220 10:12:20.958050 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-855cbc58c5-jg2pc"] Feb 20 10:12:20 crc kubenswrapper[4962]: I0220 10:12:20.960335 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-855cbc58c5-jg2pc" Feb 20 10:12:20 crc kubenswrapper[4962]: I0220 10:12:20.965426 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Feb 20 10:12:20 crc kubenswrapper[4962]: I0220 10:12:20.965544 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Feb 20 10:12:20 crc kubenswrapper[4962]: I0220 10:12:20.965725 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Feb 20 10:12:20 crc kubenswrapper[4962]: I0220 10:12:20.965954 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-pbqd6" Feb 20 10:12:20 crc kubenswrapper[4962]: I0220 10:12:20.968024 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-855cbc58c5-jg2pc"] Feb 20 10:12:21 crc kubenswrapper[4962]: I0220 10:12:21.025475 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6fcf94d689-stkxf"] Feb 20 10:12:21 crc kubenswrapper[4962]: I0220 10:12:21.026923 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6fcf94d689-stkxf" Feb 20 10:12:21 crc kubenswrapper[4962]: I0220 10:12:21.031098 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Feb 20 10:12:21 crc kubenswrapper[4962]: I0220 10:12:21.042934 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6fcf94d689-stkxf"] Feb 20 10:12:21 crc kubenswrapper[4962]: I0220 10:12:21.112886 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qd6h\" (UniqueName: \"kubernetes.io/projected/d99be0c7-0310-4fa4-9426-63be765a9e85-kube-api-access-4qd6h\") pod \"dnsmasq-dns-6fcf94d689-stkxf\" (UID: \"d99be0c7-0310-4fa4-9426-63be765a9e85\") " pod="openstack/dnsmasq-dns-6fcf94d689-stkxf" Feb 20 10:12:21 crc kubenswrapper[4962]: I0220 10:12:21.112948 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35f03c4f-de3b-4981-9e78-b8d1a1d171b5-config\") pod \"dnsmasq-dns-855cbc58c5-jg2pc\" (UID: \"35f03c4f-de3b-4981-9e78-b8d1a1d171b5\") " pod="openstack/dnsmasq-dns-855cbc58c5-jg2pc" Feb 20 10:12:21 crc kubenswrapper[4962]: I0220 10:12:21.113056 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d99be0c7-0310-4fa4-9426-63be765a9e85-dns-svc\") pod \"dnsmasq-dns-6fcf94d689-stkxf\" (UID: \"d99be0c7-0310-4fa4-9426-63be765a9e85\") " pod="openstack/dnsmasq-dns-6fcf94d689-stkxf" Feb 20 10:12:21 crc kubenswrapper[4962]: I0220 10:12:21.113227 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4m85l\" (UniqueName: \"kubernetes.io/projected/35f03c4f-de3b-4981-9e78-b8d1a1d171b5-kube-api-access-4m85l\") pod \"dnsmasq-dns-855cbc58c5-jg2pc\" (UID: \"35f03c4f-de3b-4981-9e78-b8d1a1d171b5\") " pod="openstack/dnsmasq-dns-855cbc58c5-jg2pc" Feb 20 10:12:21 crc kubenswrapper[4962]: I0220 10:12:21.113330 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d99be0c7-0310-4fa4-9426-63be765a9e85-config\") pod \"dnsmasq-dns-6fcf94d689-stkxf\" (UID: \"d99be0c7-0310-4fa4-9426-63be765a9e85\") " 
pod="openstack/dnsmasq-dns-6fcf94d689-stkxf" Feb 20 10:12:21 crc kubenswrapper[4962]: I0220 10:12:21.215625 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qd6h\" (UniqueName: \"kubernetes.io/projected/d99be0c7-0310-4fa4-9426-63be765a9e85-kube-api-access-4qd6h\") pod \"dnsmasq-dns-6fcf94d689-stkxf\" (UID: \"d99be0c7-0310-4fa4-9426-63be765a9e85\") " pod="openstack/dnsmasq-dns-6fcf94d689-stkxf" Feb 20 10:12:21 crc kubenswrapper[4962]: I0220 10:12:21.215693 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35f03c4f-de3b-4981-9e78-b8d1a1d171b5-config\") pod \"dnsmasq-dns-855cbc58c5-jg2pc\" (UID: \"35f03c4f-de3b-4981-9e78-b8d1a1d171b5\") " pod="openstack/dnsmasq-dns-855cbc58c5-jg2pc" Feb 20 10:12:21 crc kubenswrapper[4962]: I0220 10:12:21.215734 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d99be0c7-0310-4fa4-9426-63be765a9e85-dns-svc\") pod \"dnsmasq-dns-6fcf94d689-stkxf\" (UID: \"d99be0c7-0310-4fa4-9426-63be765a9e85\") " pod="openstack/dnsmasq-dns-6fcf94d689-stkxf" Feb 20 10:12:21 crc kubenswrapper[4962]: I0220 10:12:21.215804 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4m85l\" (UniqueName: \"kubernetes.io/projected/35f03c4f-de3b-4981-9e78-b8d1a1d171b5-kube-api-access-4m85l\") pod \"dnsmasq-dns-855cbc58c5-jg2pc\" (UID: \"35f03c4f-de3b-4981-9e78-b8d1a1d171b5\") " pod="openstack/dnsmasq-dns-855cbc58c5-jg2pc" Feb 20 10:12:21 crc kubenswrapper[4962]: I0220 10:12:21.215936 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d99be0c7-0310-4fa4-9426-63be765a9e85-config\") pod \"dnsmasq-dns-6fcf94d689-stkxf\" (UID: \"d99be0c7-0310-4fa4-9426-63be765a9e85\") " pod="openstack/dnsmasq-dns-6fcf94d689-stkxf" Feb 20 10:12:21 crc kubenswrapper[4962]: I0220 10:12:21.216732 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35f03c4f-de3b-4981-9e78-b8d1a1d171b5-config\") pod \"dnsmasq-dns-855cbc58c5-jg2pc\" (UID: \"35f03c4f-de3b-4981-9e78-b8d1a1d171b5\") " pod="openstack/dnsmasq-dns-855cbc58c5-jg2pc" Feb 20 10:12:21 crc kubenswrapper[4962]: I0220 10:12:21.217323 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d99be0c7-0310-4fa4-9426-63be765a9e85-dns-svc\") pod \"dnsmasq-dns-6fcf94d689-stkxf\" (UID: \"d99be0c7-0310-4fa4-9426-63be765a9e85\") " pod="openstack/dnsmasq-dns-6fcf94d689-stkxf" Feb 20 10:12:21 crc kubenswrapper[4962]: I0220 10:12:21.217502 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d99be0c7-0310-4fa4-9426-63be765a9e85-config\") pod \"dnsmasq-dns-6fcf94d689-stkxf\" (UID: \"d99be0c7-0310-4fa4-9426-63be765a9e85\") " pod="openstack/dnsmasq-dns-6fcf94d689-stkxf" Feb 20 10:12:21 crc kubenswrapper[4962]: I0220 10:12:21.237581 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4m85l\" (UniqueName: \"kubernetes.io/projected/35f03c4f-de3b-4981-9e78-b8d1a1d171b5-kube-api-access-4m85l\") pod \"dnsmasq-dns-855cbc58c5-jg2pc\" (UID: \"35f03c4f-de3b-4981-9e78-b8d1a1d171b5\") " pod="openstack/dnsmasq-dns-855cbc58c5-jg2pc" Feb 20 10:12:21 crc kubenswrapper[4962]: I0220 10:12:21.247130 4962 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qd6h\" (UniqueName: \"kubernetes.io/projected/d99be0c7-0310-4fa4-9426-63be765a9e85-kube-api-access-4qd6h\") pod \"dnsmasq-dns-6fcf94d689-stkxf\" (UID: \"d99be0c7-0310-4fa4-9426-63be765a9e85\") " pod="openstack/dnsmasq-dns-6fcf94d689-stkxf" Feb 20 10:12:21 crc kubenswrapper[4962]: I0220 10:12:21.283506 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-855cbc58c5-jg2pc" Feb 20 10:12:21 crc kubenswrapper[4962]: I0220 10:12:21.353309 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6fcf94d689-stkxf" Feb 20 10:12:21 crc kubenswrapper[4962]: I0220 10:12:21.590087 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-855cbc58c5-jg2pc"] Feb 20 10:12:21 crc kubenswrapper[4962]: I0220 10:12:21.604684 4962 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 20 10:12:21 crc kubenswrapper[4962]: I0220 10:12:21.870453 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-855cbc58c5-jg2pc" event={"ID":"35f03c4f-de3b-4981-9e78-b8d1a1d171b5","Type":"ContainerStarted","Data":"153e2efb6d99e57bdd8c71d555149537a37f6ac8ec26c3492f416c36ef39e106"} Feb 20 10:12:21 crc kubenswrapper[4962]: I0220 10:12:21.889005 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6fcf94d689-stkxf"] Feb 20 10:12:21 crc kubenswrapper[4962]: W0220 10:12:21.903990 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd99be0c7_0310_4fa4_9426_63be765a9e85.slice/crio-c8edb9d91df0b22ff9505de5c7b80ee2daed2d5e1bd99677fbcf778baf5be2bc WatchSource:0}: Error finding container c8edb9d91df0b22ff9505de5c7b80ee2daed2d5e1bd99677fbcf778baf5be2bc: Status 404 returned error can't find the container with id c8edb9d91df0b22ff9505de5c7b80ee2daed2d5e1bd99677fbcf778baf5be2bc Feb 20 10:12:22 crc kubenswrapper[4962]: I0220 10:12:22.883227 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fcf94d689-stkxf" event={"ID":"d99be0c7-0310-4fa4-9426-63be765a9e85","Type":"ContainerStarted","Data":"c8edb9d91df0b22ff9505de5c7b80ee2daed2d5e1bd99677fbcf778baf5be2bc"} Feb 20 10:12:23 crc kubenswrapper[4962]: I0220 10:12:23.038557 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6fcf94d689-stkxf"] Feb 20 10:12:23 crc kubenswrapper[4962]: I0220 10:12:23.070481 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-f54874ffc-2drvh"] Feb 20 10:12:23 crc kubenswrapper[4962]: I0220 10:12:23.072735 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f54874ffc-2drvh" Feb 20 10:12:23 crc kubenswrapper[4962]: I0220 10:12:23.098842 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f54874ffc-2drvh"] Feb 20 10:12:23 crc kubenswrapper[4962]: I0220 10:12:23.155906 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/01d0cdce-fd47-471a-94af-ee68fed6a2aa-dns-svc\") pod \"dnsmasq-dns-f54874ffc-2drvh\" (UID: \"01d0cdce-fd47-471a-94af-ee68fed6a2aa\") " pod="openstack/dnsmasq-dns-f54874ffc-2drvh" Feb 20 10:12:23 crc kubenswrapper[4962]: I0220 10:12:23.155977 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dbrr\" (UniqueName: \"kubernetes.io/projected/01d0cdce-fd47-471a-94af-ee68fed6a2aa-kube-api-access-5dbrr\") pod \"dnsmasq-dns-f54874ffc-2drvh\" (UID: \"01d0cdce-fd47-471a-94af-ee68fed6a2aa\") " pod="openstack/dnsmasq-dns-f54874ffc-2drvh" Feb 20 10:12:23 crc kubenswrapper[4962]: I0220 10:12:23.156009 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01d0cdce-fd47-471a-94af-ee68fed6a2aa-config\") pod \"dnsmasq-dns-f54874ffc-2drvh\" (UID: \"01d0cdce-fd47-471a-94af-ee68fed6a2aa\") " pod="openstack/dnsmasq-dns-f54874ffc-2drvh" Feb 20 10:12:23 crc kubenswrapper[4962]: I0220 10:12:23.263312 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/01d0cdce-fd47-471a-94af-ee68fed6a2aa-dns-svc\") pod \"dnsmasq-dns-f54874ffc-2drvh\" (UID: \"01d0cdce-fd47-471a-94af-ee68fed6a2aa\") " pod="openstack/dnsmasq-dns-f54874ffc-2drvh" Feb 20 10:12:23 crc kubenswrapper[4962]: I0220 10:12:23.263370 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dbrr\" (UniqueName: \"kubernetes.io/projected/01d0cdce-fd47-471a-94af-ee68fed6a2aa-kube-api-access-5dbrr\") pod \"dnsmasq-dns-f54874ffc-2drvh\" (UID: \"01d0cdce-fd47-471a-94af-ee68fed6a2aa\") " pod="openstack/dnsmasq-dns-f54874ffc-2drvh" Feb 20 10:12:23 crc kubenswrapper[4962]: I0220 10:12:23.263409 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01d0cdce-fd47-471a-94af-ee68fed6a2aa-config\") pod \"dnsmasq-dns-f54874ffc-2drvh\" (UID: \"01d0cdce-fd47-471a-94af-ee68fed6a2aa\") " pod="openstack/dnsmasq-dns-f54874ffc-2drvh" Feb 20 10:12:23 crc kubenswrapper[4962]: I0220 10:12:23.264613 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01d0cdce-fd47-471a-94af-ee68fed6a2aa-config\") pod \"dnsmasq-dns-f54874ffc-2drvh\" (UID: \"01d0cdce-fd47-471a-94af-ee68fed6a2aa\") " pod="openstack/dnsmasq-dns-f54874ffc-2drvh" Feb 20 10:12:23 crc kubenswrapper[4962]: I0220 10:12:23.266221 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/01d0cdce-fd47-471a-94af-ee68fed6a2aa-dns-svc\") pod \"dnsmasq-dns-f54874ffc-2drvh\" (UID: \"01d0cdce-fd47-471a-94af-ee68fed6a2aa\") " pod="openstack/dnsmasq-dns-f54874ffc-2drvh" Feb 20 10:12:23 crc kubenswrapper[4962]: I0220 10:12:23.289818 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dbrr\" (UniqueName: 
\"kubernetes.io/projected/01d0cdce-fd47-471a-94af-ee68fed6a2aa-kube-api-access-5dbrr\") pod \"dnsmasq-dns-f54874ffc-2drvh\" (UID: \"01d0cdce-fd47-471a-94af-ee68fed6a2aa\") " pod="openstack/dnsmasq-dns-f54874ffc-2drvh" Feb 20 10:12:23 crc kubenswrapper[4962]: I0220 10:12:23.394187 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f54874ffc-2drvh" Feb 20 10:12:23 crc kubenswrapper[4962]: I0220 10:12:23.831129 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-855cbc58c5-jg2pc"] Feb 20 10:12:23 crc kubenswrapper[4962]: I0220 10:12:23.852838 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-67ff45466c-svjlj"] Feb 20 10:12:23 crc kubenswrapper[4962]: I0220 10:12:23.855526 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67ff45466c-svjlj" Feb 20 10:12:23 crc kubenswrapper[4962]: I0220 10:12:23.878452 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b061854a-f0c6-4754-a947-a7d5408f25db-config\") pod \"dnsmasq-dns-67ff45466c-svjlj\" (UID: \"b061854a-f0c6-4754-a947-a7d5408f25db\") " pod="openstack/dnsmasq-dns-67ff45466c-svjlj" Feb 20 10:12:23 crc kubenswrapper[4962]: I0220 10:12:23.878526 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b061854a-f0c6-4754-a947-a7d5408f25db-dns-svc\") pod \"dnsmasq-dns-67ff45466c-svjlj\" (UID: \"b061854a-f0c6-4754-a947-a7d5408f25db\") " pod="openstack/dnsmasq-dns-67ff45466c-svjlj" Feb 20 10:12:23 crc kubenswrapper[4962]: I0220 10:12:23.878561 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prlr9\" (UniqueName: \"kubernetes.io/projected/b061854a-f0c6-4754-a947-a7d5408f25db-kube-api-access-prlr9\") pod \"dnsmasq-dns-67ff45466c-svjlj\" (UID: \"b061854a-f0c6-4754-a947-a7d5408f25db\") " pod="openstack/dnsmasq-dns-67ff45466c-svjlj" Feb 20 10:12:23 crc kubenswrapper[4962]: I0220 10:12:23.882047 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67ff45466c-svjlj"] Feb 20 10:12:23 crc kubenswrapper[4962]: I0220 10:12:23.894675 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f54874ffc-2drvh"] Feb 20 10:12:23 crc kubenswrapper[4962]: I0220 10:12:23.980464 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b061854a-f0c6-4754-a947-a7d5408f25db-config\") pod \"dnsmasq-dns-67ff45466c-svjlj\" (UID: \"b061854a-f0c6-4754-a947-a7d5408f25db\") " pod="openstack/dnsmasq-dns-67ff45466c-svjlj" Feb 20 10:12:23 crc kubenswrapper[4962]: I0220 10:12:23.980581 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b061854a-f0c6-4754-a947-a7d5408f25db-dns-svc\") pod \"dnsmasq-dns-67ff45466c-svjlj\" (UID: \"b061854a-f0c6-4754-a947-a7d5408f25db\") " pod="openstack/dnsmasq-dns-67ff45466c-svjlj" Feb 20 10:12:23 crc kubenswrapper[4962]: I0220 10:12:23.980630 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prlr9\" (UniqueName: \"kubernetes.io/projected/b061854a-f0c6-4754-a947-a7d5408f25db-kube-api-access-prlr9\") pod \"dnsmasq-dns-67ff45466c-svjlj\" (UID: \"b061854a-f0c6-4754-a947-a7d5408f25db\") " 
pod="openstack/dnsmasq-dns-67ff45466c-svjlj" Feb 20 10:12:23 crc kubenswrapper[4962]: I0220 10:12:23.981897 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b061854a-f0c6-4754-a947-a7d5408f25db-config\") pod \"dnsmasq-dns-67ff45466c-svjlj\" (UID: \"b061854a-f0c6-4754-a947-a7d5408f25db\") " pod="openstack/dnsmasq-dns-67ff45466c-svjlj" Feb 20 10:12:23 crc kubenswrapper[4962]: I0220 10:12:23.984734 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b061854a-f0c6-4754-a947-a7d5408f25db-dns-svc\") pod \"dnsmasq-dns-67ff45466c-svjlj\" (UID: \"b061854a-f0c6-4754-a947-a7d5408f25db\") " pod="openstack/dnsmasq-dns-67ff45466c-svjlj" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.009368 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prlr9\" (UniqueName: \"kubernetes.io/projected/b061854a-f0c6-4754-a947-a7d5408f25db-kube-api-access-prlr9\") pod \"dnsmasq-dns-67ff45466c-svjlj\" (UID: \"b061854a-f0c6-4754-a947-a7d5408f25db\") " pod="openstack/dnsmasq-dns-67ff45466c-svjlj" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.192468 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67ff45466c-svjlj" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.223871 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.225111 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.229692 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.230015 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.231931 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.231973 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-d7jzr" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.232018 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.232298 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.232430 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.239666 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.285476 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2a8d652d-aea8-4a83-b33e-0d2522af0be8-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\") " pod="openstack/rabbitmq-server-0" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.285544 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\") " pod="openstack/rabbitmq-server-0" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.285578 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2a8d652d-aea8-4a83-b33e-0d2522af0be8-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\") " pod="openstack/rabbitmq-server-0" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.285612 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcvhk\" (UniqueName: \"kubernetes.io/projected/2a8d652d-aea8-4a83-b33e-0d2522af0be8-kube-api-access-rcvhk\") pod \"rabbitmq-server-0\" (UID: \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\") " pod="openstack/rabbitmq-server-0" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.285647 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2a8d652d-aea8-4a83-b33e-0d2522af0be8-pod-info\") pod \"rabbitmq-server-0\" (UID: \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\") " pod="openstack/rabbitmq-server-0" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.285675 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2a8d652d-aea8-4a83-b33e-0d2522af0be8-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\") " pod="openstack/rabbitmq-server-0" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.285711 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2a8d652d-aea8-4a83-b33e-0d2522af0be8-config-data\") pod \"rabbitmq-server-0\" (UID: \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\") " pod="openstack/rabbitmq-server-0" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.285734 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2a8d652d-aea8-4a83-b33e-0d2522af0be8-server-conf\") pod \"rabbitmq-server-0\" (UID: \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\") " pod="openstack/rabbitmq-server-0" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.285757 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2a8d652d-aea8-4a83-b33e-0d2522af0be8-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\") " pod="openstack/rabbitmq-server-0" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.285779 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2a8d652d-aea8-4a83-b33e-0d2522af0be8-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\") " pod="openstack/rabbitmq-server-0" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.285805 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/2a8d652d-aea8-4a83-b33e-0d2522af0be8-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\") " pod="openstack/rabbitmq-server-0" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.388062 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2a8d652d-aea8-4a83-b33e-0d2522af0be8-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\") " pod="openstack/rabbitmq-server-0" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.388694 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2a8d652d-aea8-4a83-b33e-0d2522af0be8-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\") " pod="openstack/rabbitmq-server-0" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.388719 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2a8d652d-aea8-4a83-b33e-0d2522af0be8-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\") " pod="openstack/rabbitmq-server-0" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.388754 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2a8d652d-aea8-4a83-b33e-0d2522af0be8-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\") " pod="openstack/rabbitmq-server-0" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.388804 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\") " pod="openstack/rabbitmq-server-0" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.388837 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2a8d652d-aea8-4a83-b33e-0d2522af0be8-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\") " pod="openstack/rabbitmq-server-0" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.388865 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcvhk\" (UniqueName: \"kubernetes.io/projected/2a8d652d-aea8-4a83-b33e-0d2522af0be8-kube-api-access-rcvhk\") pod \"rabbitmq-server-0\" (UID: \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\") " pod="openstack/rabbitmq-server-0" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.388917 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2a8d652d-aea8-4a83-b33e-0d2522af0be8-pod-info\") pod \"rabbitmq-server-0\" (UID: \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\") " pod="openstack/rabbitmq-server-0" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.388970 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2a8d652d-aea8-4a83-b33e-0d2522af0be8-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\") " pod="openstack/rabbitmq-server-0" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 
10:12:24.389013 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2a8d652d-aea8-4a83-b33e-0d2522af0be8-config-data\") pod \"rabbitmq-server-0\" (UID: \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\") " pod="openstack/rabbitmq-server-0" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.389047 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2a8d652d-aea8-4a83-b33e-0d2522af0be8-server-conf\") pod \"rabbitmq-server-0\" (UID: \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\") " pod="openstack/rabbitmq-server-0" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.389957 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2a8d652d-aea8-4a83-b33e-0d2522af0be8-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\") " pod="openstack/rabbitmq-server-0" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.390271 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2a8d652d-aea8-4a83-b33e-0d2522af0be8-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\") " pod="openstack/rabbitmq-server-0" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.390641 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2a8d652d-aea8-4a83-b33e-0d2522af0be8-server-conf\") pod \"rabbitmq-server-0\" (UID: \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\") " pod="openstack/rabbitmq-server-0" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.390993 4962 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-server-0" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.396172 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2a8d652d-aea8-4a83-b33e-0d2522af0be8-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\") " pod="openstack/rabbitmq-server-0" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.396260 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2a8d652d-aea8-4a83-b33e-0d2522af0be8-pod-info\") pod \"rabbitmq-server-0\" (UID: \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\") " pod="openstack/rabbitmq-server-0" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.398775 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2a8d652d-aea8-4a83-b33e-0d2522af0be8-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\") " pod="openstack/rabbitmq-server-0" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.399652 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2a8d652d-aea8-4a83-b33e-0d2522af0be8-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\") " 
pod="openstack/rabbitmq-server-0" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.399736 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2a8d652d-aea8-4a83-b33e-0d2522af0be8-config-data\") pod \"rabbitmq-server-0\" (UID: \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\") " pod="openstack/rabbitmq-server-0" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.400652 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2a8d652d-aea8-4a83-b33e-0d2522af0be8-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\") " pod="openstack/rabbitmq-server-0" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.412051 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcvhk\" (UniqueName: \"kubernetes.io/projected/2a8d652d-aea8-4a83-b33e-0d2522af0be8-kube-api-access-rcvhk\") pod \"rabbitmq-server-0\" (UID: \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\") " pod="openstack/rabbitmq-server-0" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.466836 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\") " pod="openstack/rabbitmq-server-0" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.572188 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.754304 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67ff45466c-svjlj"] Feb 20 10:12:24 crc kubenswrapper[4962]: W0220 10:12:24.783175 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb061854a_f0c6_4754_a947_a7d5408f25db.slice/crio-2cc17bab5a42964e336fc25e4f9d43353486628812f1bad5c2e3d1b82435adbf WatchSource:0}: Error finding container 2cc17bab5a42964e336fc25e4f9d43353486628812f1bad5c2e3d1b82435adbf: Status 404 returned error can't find the container with id 2cc17bab5a42964e336fc25e4f9d43353486628812f1bad5c2e3d1b82435adbf Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.934820 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67ff45466c-svjlj" event={"ID":"b061854a-f0c6-4754-a947-a7d5408f25db","Type":"ContainerStarted","Data":"2cc17bab5a42964e336fc25e4f9d43353486628812f1bad5c2e3d1b82435adbf"} Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.937715 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f54874ffc-2drvh" event={"ID":"01d0cdce-fd47-471a-94af-ee68fed6a2aa","Type":"ContainerStarted","Data":"6ee62349849ee2a01e9e7674d3fdcbef155f78a8a88598da3702e8fea9005811"} Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.977730 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.980369 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.983710 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.983873 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.983947 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.984907 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.985690 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.987445 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-tbhds" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.987706 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Feb 20 10:12:24 crc kubenswrapper[4962]: I0220 10:12:24.998997 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 20 10:12:25 crc kubenswrapper[4962]: I0220 10:12:25.001353 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/56a77dd3-ef10-46a6-a00d-ab38af0d4338-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 10:12:25 crc kubenswrapper[4962]: I0220 10:12:25.001391 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/56a77dd3-ef10-46a6-a00d-ab38af0d4338-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 10:12:25 crc kubenswrapper[4962]: I0220 10:12:25.001425 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/56a77dd3-ef10-46a6-a00d-ab38af0d4338-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 10:12:25 crc kubenswrapper[4962]: I0220 10:12:25.001457 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/56a77dd3-ef10-46a6-a00d-ab38af0d4338-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 10:12:25 crc kubenswrapper[4962]: I0220 10:12:25.001473 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/56a77dd3-ef10-46a6-a00d-ab38af0d4338-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 10:12:25 crc kubenswrapper[4962]: I0220 10:12:25.001506 4962 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/56a77dd3-ef10-46a6-a00d-ab38af0d4338-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 10:12:25 crc kubenswrapper[4962]: I0220 10:12:25.001525 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/56a77dd3-ef10-46a6-a00d-ab38af0d4338-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 10:12:25 crc kubenswrapper[4962]: I0220 10:12:25.003646 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/56a77dd3-ef10-46a6-a00d-ab38af0d4338-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 10:12:25 crc kubenswrapper[4962]: I0220 10:12:25.003756 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 10:12:25 crc kubenswrapper[4962]: I0220 10:12:25.003778 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hckp\" (UniqueName: \"kubernetes.io/projected/56a77dd3-ef10-46a6-a00d-ab38af0d4338-kube-api-access-2hckp\") pod \"rabbitmq-cell1-server-0\" (UID: \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 10:12:25 crc kubenswrapper[4962]: I0220 10:12:25.003801 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/56a77dd3-ef10-46a6-a00d-ab38af0d4338-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 10:12:25 crc kubenswrapper[4962]: I0220 10:12:25.085169 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 20 10:12:25 crc kubenswrapper[4962]: W0220 10:12:25.085725 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a8d652d_aea8_4a83_b33e_0d2522af0be8.slice/crio-b402e19dca07d8ba27eec1161345a129c0a3f56fa63c23ac6f8b1e82180c9e7c WatchSource:0}: Error finding container b402e19dca07d8ba27eec1161345a129c0a3f56fa63c23ac6f8b1e82180c9e7c: Status 404 returned error can't find the container with id b402e19dca07d8ba27eec1161345a129c0a3f56fa63c23ac6f8b1e82180c9e7c Feb 20 10:12:25 crc kubenswrapper[4962]: I0220 10:12:25.106456 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/56a77dd3-ef10-46a6-a00d-ab38af0d4338-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 10:12:25 crc kubenswrapper[4962]: I0220 10:12:25.106529 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/56a77dd3-ef10-46a6-a00d-ab38af0d4338-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 10:12:25 crc kubenswrapper[4962]: I0220 10:12:25.106603 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/56a77dd3-ef10-46a6-a00d-ab38af0d4338-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 10:12:25 crc kubenswrapper[4962]: I0220 10:12:25.106669 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 10:12:25 crc kubenswrapper[4962]: I0220 10:12:25.106693 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hckp\" (UniqueName: \"kubernetes.io/projected/56a77dd3-ef10-46a6-a00d-ab38af0d4338-kube-api-access-2hckp\") pod \"rabbitmq-cell1-server-0\" (UID: \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 10:12:25 crc kubenswrapper[4962]: I0220 10:12:25.106720 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/56a77dd3-ef10-46a6-a00d-ab38af0d4338-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 10:12:25 crc kubenswrapper[4962]: I0220 10:12:25.106757 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/56a77dd3-ef10-46a6-a00d-ab38af0d4338-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 10:12:25 crc kubenswrapper[4962]: I0220 10:12:25.106781 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/56a77dd3-ef10-46a6-a00d-ab38af0d4338-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 10:12:25 crc kubenswrapper[4962]: I0220 10:12:25.106803 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/56a77dd3-ef10-46a6-a00d-ab38af0d4338-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 10:12:25 crc kubenswrapper[4962]: I0220 10:12:25.106853 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/56a77dd3-ef10-46a6-a00d-ab38af0d4338-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 10:12:25 crc kubenswrapper[4962]: I0220 10:12:25.106875 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/56a77dd3-ef10-46a6-a00d-ab38af0d4338-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\") " 
pod="openstack/rabbitmq-cell1-server-0" Feb 20 10:12:25 crc kubenswrapper[4962]: I0220 10:12:25.109264 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/56a77dd3-ef10-46a6-a00d-ab38af0d4338-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 10:12:25 crc kubenswrapper[4962]: I0220 10:12:25.109285 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/56a77dd3-ef10-46a6-a00d-ab38af0d4338-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 10:12:25 crc kubenswrapper[4962]: I0220 10:12:25.110544 4962 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-cell1-server-0" Feb 20 10:12:25 crc kubenswrapper[4962]: I0220 10:12:25.110775 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/56a77dd3-ef10-46a6-a00d-ab38af0d4338-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 10:12:25 crc kubenswrapper[4962]: I0220 10:12:25.110887 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/56a77dd3-ef10-46a6-a00d-ab38af0d4338-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 10:12:25 crc kubenswrapper[4962]: I0220 10:12:25.113190 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/56a77dd3-ef10-46a6-a00d-ab38af0d4338-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 10:12:25 crc kubenswrapper[4962]: I0220 10:12:25.115833 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/56a77dd3-ef10-46a6-a00d-ab38af0d4338-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 10:12:25 crc kubenswrapper[4962]: I0220 10:12:25.119292 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/56a77dd3-ef10-46a6-a00d-ab38af0d4338-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 10:12:25 crc kubenswrapper[4962]: I0220 10:12:25.122749 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/56a77dd3-ef10-46a6-a00d-ab38af0d4338-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 10:12:25 crc kubenswrapper[4962]: I0220 10:12:25.122876 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/56a77dd3-ef10-46a6-a00d-ab38af0d4338-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 10:12:25 crc kubenswrapper[4962]: I0220 10:12:25.130438 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hckp\" (UniqueName: \"kubernetes.io/projected/56a77dd3-ef10-46a6-a00d-ab38af0d4338-kube-api-access-2hckp\") pod \"rabbitmq-cell1-server-0\" (UID: \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 10:12:25 crc kubenswrapper[4962]: I0220 10:12:25.149496 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 10:12:25 crc kubenswrapper[4962]: I0220 10:12:25.320156 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 20 10:12:25 crc kubenswrapper[4962]: I0220 10:12:25.935016 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 20 10:12:25 crc kubenswrapper[4962]: W0220 10:12:25.966047 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56a77dd3_ef10_46a6_a00d_ab38af0d4338.slice/crio-1af76abb62f306cbfdd579814518b0ea666f529247ac9e64fd984f69498132b5 WatchSource:0}: Error finding container 1af76abb62f306cbfdd579814518b0ea666f529247ac9e64fd984f69498132b5: Status 404 returned error can't find the container with id 1af76abb62f306cbfdd579814518b0ea666f529247ac9e64fd984f69498132b5 Feb 20 10:12:25 crc kubenswrapper[4962]: I0220 10:12:25.972794 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"2a8d652d-aea8-4a83-b33e-0d2522af0be8","Type":"ContainerStarted","Data":"b402e19dca07d8ba27eec1161345a129c0a3f56fa63c23ac6f8b1e82180c9e7c"} Feb 20 10:12:26 crc kubenswrapper[4962]: I0220 10:12:26.396745 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Feb 20 10:12:26 crc kubenswrapper[4962]: I0220 10:12:26.398367 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Feb 20 10:12:26 crc kubenswrapper[4962]: I0220 10:12:26.405806 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Feb 20 10:12:26 crc kubenswrapper[4962]: I0220 10:12:26.405926 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-898r2" Feb 20 10:12:26 crc kubenswrapper[4962]: I0220 10:12:26.407036 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Feb 20 10:12:26 crc kubenswrapper[4962]: I0220 10:12:26.408027 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Feb 20 10:12:26 crc kubenswrapper[4962]: I0220 10:12:26.410383 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Feb 20 10:12:26 crc kubenswrapper[4962]: I0220 10:12:26.411760 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 20 10:12:26 crc kubenswrapper[4962]: I0220 10:12:26.537447 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6e766bfd-869d-43ca-bf11-cf4ec9fa253a-operator-scripts\") pod \"openstack-galera-0\" (UID: \"6e766bfd-869d-43ca-bf11-cf4ec9fa253a\") " pod="openstack/openstack-galera-0" Feb 20 10:12:26 crc kubenswrapper[4962]: I0220 10:12:26.537536 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e766bfd-869d-43ca-bf11-cf4ec9fa253a-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"6e766bfd-869d-43ca-bf11-cf4ec9fa253a\") " pod="openstack/openstack-galera-0" Feb 20 10:12:26 crc kubenswrapper[4962]: I0220 10:12:26.537569 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6e766bfd-869d-43ca-bf11-cf4ec9fa253a-config-data-generated\") pod \"openstack-galera-0\" (UID: \"6e766bfd-869d-43ca-bf11-cf4ec9fa253a\") " pod="openstack/openstack-galera-0" Feb 20 10:12:26 crc kubenswrapper[4962]: I0220 10:12:26.537619 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: \"6e766bfd-869d-43ca-bf11-cf4ec9fa253a\") " pod="openstack/openstack-galera-0" Feb 20 10:12:26 crc kubenswrapper[4962]: I0220 10:12:26.537641 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e766bfd-869d-43ca-bf11-cf4ec9fa253a-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"6e766bfd-869d-43ca-bf11-cf4ec9fa253a\") " pod="openstack/openstack-galera-0" Feb 20 10:12:26 crc kubenswrapper[4962]: I0220 10:12:26.537669 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6e766bfd-869d-43ca-bf11-cf4ec9fa253a-config-data-default\") pod \"openstack-galera-0\" (UID: \"6e766bfd-869d-43ca-bf11-cf4ec9fa253a\") " pod="openstack/openstack-galera-0" Feb 20 10:12:26 crc kubenswrapper[4962]: I0220 10:12:26.537692 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6e766bfd-869d-43ca-bf11-cf4ec9fa253a-kolla-config\") pod \"openstack-galera-0\" (UID: \"6e766bfd-869d-43ca-bf11-cf4ec9fa253a\") " pod="openstack/openstack-galera-0" Feb 20 10:12:26 crc kubenswrapper[4962]: I0220 10:12:26.537720 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zkm4\" (UniqueName: \"kubernetes.io/projected/6e766bfd-869d-43ca-bf11-cf4ec9fa253a-kube-api-access-4zkm4\") pod \"openstack-galera-0\" (UID: \"6e766bfd-869d-43ca-bf11-cf4ec9fa253a\") " pod="openstack/openstack-galera-0" Feb 20 10:12:26 crc kubenswrapper[4962]: I0220 10:12:26.638702 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6e766bfd-869d-43ca-bf11-cf4ec9fa253a-config-data-default\") pod \"openstack-galera-0\" (UID: \"6e766bfd-869d-43ca-bf11-cf4ec9fa253a\") " pod="openstack/openstack-galera-0" Feb 20 10:12:26 crc kubenswrapper[4962]: I0220 10:12:26.638759 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6e766bfd-869d-43ca-bf11-cf4ec9fa253a-kolla-config\") pod \"openstack-galera-0\" (UID: \"6e766bfd-869d-43ca-bf11-cf4ec9fa253a\") " pod="openstack/openstack-galera-0" Feb 20 10:12:26 crc kubenswrapper[4962]: I0220 10:12:26.638793 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zkm4\" (UniqueName: \"kubernetes.io/projected/6e766bfd-869d-43ca-bf11-cf4ec9fa253a-kube-api-access-4zkm4\") pod \"openstack-galera-0\" (UID: \"6e766bfd-869d-43ca-bf11-cf4ec9fa253a\") " pod="openstack/openstack-galera-0" Feb 20 10:12:26 crc kubenswrapper[4962]: I0220 10:12:26.638817 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6e766bfd-869d-43ca-bf11-cf4ec9fa253a-operator-scripts\") pod \"openstack-galera-0\" (UID: \"6e766bfd-869d-43ca-bf11-cf4ec9fa253a\") " pod="openstack/openstack-galera-0" Feb 20 10:12:26 crc kubenswrapper[4962]: I0220 10:12:26.638868 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e766bfd-869d-43ca-bf11-cf4ec9fa253a-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"6e766bfd-869d-43ca-bf11-cf4ec9fa253a\") " pod="openstack/openstack-galera-0" Feb 20 10:12:26 crc kubenswrapper[4962]: I0220 10:12:26.638894 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6e766bfd-869d-43ca-bf11-cf4ec9fa253a-config-data-generated\") pod \"openstack-galera-0\" (UID: \"6e766bfd-869d-43ca-bf11-cf4ec9fa253a\") " pod="openstack/openstack-galera-0" Feb 20 10:12:26 crc kubenswrapper[4962]: I0220 10:12:26.638929 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: \"6e766bfd-869d-43ca-bf11-cf4ec9fa253a\") " pod="openstack/openstack-galera-0" Feb 20 10:12:26 crc kubenswrapper[4962]: I0220 10:12:26.638965 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e766bfd-869d-43ca-bf11-cf4ec9fa253a-galera-tls-certs\") pod \"openstack-galera-0\" (UID: 
\"6e766bfd-869d-43ca-bf11-cf4ec9fa253a\") " pod="openstack/openstack-galera-0" Feb 20 10:12:26 crc kubenswrapper[4962]: I0220 10:12:26.642253 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6e766bfd-869d-43ca-bf11-cf4ec9fa253a-operator-scripts\") pod \"openstack-galera-0\" (UID: \"6e766bfd-869d-43ca-bf11-cf4ec9fa253a\") " pod="openstack/openstack-galera-0" Feb 20 10:12:26 crc kubenswrapper[4962]: I0220 10:12:26.642956 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6e766bfd-869d-43ca-bf11-cf4ec9fa253a-config-data-default\") pod \"openstack-galera-0\" (UID: \"6e766bfd-869d-43ca-bf11-cf4ec9fa253a\") " pod="openstack/openstack-galera-0" Feb 20 10:12:26 crc kubenswrapper[4962]: I0220 10:12:26.643435 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6e766bfd-869d-43ca-bf11-cf4ec9fa253a-config-data-generated\") pod \"openstack-galera-0\" (UID: \"6e766bfd-869d-43ca-bf11-cf4ec9fa253a\") " pod="openstack/openstack-galera-0" Feb 20 10:12:26 crc kubenswrapper[4962]: I0220 10:12:26.643450 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6e766bfd-869d-43ca-bf11-cf4ec9fa253a-kolla-config\") pod \"openstack-galera-0\" (UID: \"6e766bfd-869d-43ca-bf11-cf4ec9fa253a\") " pod="openstack/openstack-galera-0" Feb 20 10:12:26 crc kubenswrapper[4962]: I0220 10:12:26.643758 4962 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: \"6e766bfd-869d-43ca-bf11-cf4ec9fa253a\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/openstack-galera-0" Feb 20 10:12:26 crc kubenswrapper[4962]: I0220 10:12:26.671802 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e766bfd-869d-43ca-bf11-cf4ec9fa253a-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"6e766bfd-869d-43ca-bf11-cf4ec9fa253a\") " pod="openstack/openstack-galera-0" Feb 20 10:12:26 crc kubenswrapper[4962]: I0220 10:12:26.685692 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e766bfd-869d-43ca-bf11-cf4ec9fa253a-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"6e766bfd-869d-43ca-bf11-cf4ec9fa253a\") " pod="openstack/openstack-galera-0" Feb 20 10:12:26 crc kubenswrapper[4962]: I0220 10:12:26.706928 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zkm4\" (UniqueName: \"kubernetes.io/projected/6e766bfd-869d-43ca-bf11-cf4ec9fa253a-kube-api-access-4zkm4\") pod \"openstack-galera-0\" (UID: \"6e766bfd-869d-43ca-bf11-cf4ec9fa253a\") " pod="openstack/openstack-galera-0" Feb 20 10:12:26 crc kubenswrapper[4962]: I0220 10:12:26.712821 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: \"6e766bfd-869d-43ca-bf11-cf4ec9fa253a\") " pod="openstack/openstack-galera-0" Feb 20 10:12:26 crc kubenswrapper[4962]: I0220 10:12:26.729909 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Feb 20 10:12:27 crc kubenswrapper[4962]: I0220 10:12:27.007199 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"56a77dd3-ef10-46a6-a00d-ab38af0d4338","Type":"ContainerStarted","Data":"1af76abb62f306cbfdd579814518b0ea666f529247ac9e64fd984f69498132b5"} Feb 20 10:12:27 crc kubenswrapper[4962]: I0220 10:12:27.477962 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 20 10:12:27 crc kubenswrapper[4962]: I0220 10:12:27.746888 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 20 10:12:27 crc kubenswrapper[4962]: I0220 10:12:27.749838 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 20 10:12:27 crc kubenswrapper[4962]: I0220 10:12:27.754298 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Feb 20 10:12:27 crc kubenswrapper[4962]: I0220 10:12:27.756356 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Feb 20 10:12:27 crc kubenswrapper[4962]: I0220 10:12:27.757156 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-4svgj" Feb 20 10:12:27 crc kubenswrapper[4962]: I0220 10:12:27.759193 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 20 10:12:27 crc kubenswrapper[4962]: I0220 10:12:27.759296 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Feb 20 10:12:27 crc kubenswrapper[4962]: I0220 10:12:27.868197 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8dd889b7-1b72-4e57-ad0f-85facbad8da4-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"8dd889b7-1b72-4e57-ad0f-85facbad8da4\") " pod="openstack/openstack-cell1-galera-0" Feb 20 10:12:27 crc kubenswrapper[4962]: I0220 10:12:27.868262 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ps9mx\" (UniqueName: \"kubernetes.io/projected/8dd889b7-1b72-4e57-ad0f-85facbad8da4-kube-api-access-ps9mx\") pod \"openstack-cell1-galera-0\" (UID: \"8dd889b7-1b72-4e57-ad0f-85facbad8da4\") " pod="openstack/openstack-cell1-galera-0" Feb 20 10:12:27 crc kubenswrapper[4962]: I0220 10:12:27.868318 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8dd889b7-1b72-4e57-ad0f-85facbad8da4-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"8dd889b7-1b72-4e57-ad0f-85facbad8da4\") " pod="openstack/openstack-cell1-galera-0" Feb 20 10:12:27 crc kubenswrapper[4962]: I0220 10:12:27.868345 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8dd889b7-1b72-4e57-ad0f-85facbad8da4-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"8dd889b7-1b72-4e57-ad0f-85facbad8da4\") " pod="openstack/openstack-cell1-galera-0" Feb 20 10:12:27 crc kubenswrapper[4962]: I0220 10:12:27.868435 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8dd889b7-1b72-4e57-ad0f-85facbad8da4-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"8dd889b7-1b72-4e57-ad0f-85facbad8da4\") " pod="openstack/openstack-cell1-galera-0" Feb 20 10:12:27 crc kubenswrapper[4962]: I0220 10:12:27.868467 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8dd889b7-1b72-4e57-ad0f-85facbad8da4-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"8dd889b7-1b72-4e57-ad0f-85facbad8da4\") " pod="openstack/openstack-cell1-galera-0" Feb 20 10:12:27 crc kubenswrapper[4962]: I0220 10:12:27.870560 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"8dd889b7-1b72-4e57-ad0f-85facbad8da4\") " pod="openstack/openstack-cell1-galera-0" Feb 20 10:12:27 crc kubenswrapper[4962]: I0220 10:12:27.870625 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8dd889b7-1b72-4e57-ad0f-85facbad8da4-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"8dd889b7-1b72-4e57-ad0f-85facbad8da4\") " pod="openstack/openstack-cell1-galera-0" Feb 20 10:12:27 crc kubenswrapper[4962]: I0220 10:12:27.972970 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dd889b7-1b72-4e57-ad0f-85facbad8da4-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"8dd889b7-1b72-4e57-ad0f-85facbad8da4\") " pod="openstack/openstack-cell1-galera-0" Feb 20 10:12:27 crc kubenswrapper[4962]: I0220 10:12:27.973042 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8dd889b7-1b72-4e57-ad0f-85facbad8da4-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"8dd889b7-1b72-4e57-ad0f-85facbad8da4\") " pod="openstack/openstack-cell1-galera-0" Feb 20 10:12:27 crc kubenswrapper[4962]: I0220 10:12:27.973074 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"8dd889b7-1b72-4e57-ad0f-85facbad8da4\") " pod="openstack/openstack-cell1-galera-0" Feb 20 10:12:27 crc kubenswrapper[4962]: I0220 10:12:27.973103 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8dd889b7-1b72-4e57-ad0f-85facbad8da4-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"8dd889b7-1b72-4e57-ad0f-85facbad8da4\") " pod="openstack/openstack-cell1-galera-0" Feb 20 10:12:27 crc kubenswrapper[4962]: I0220 10:12:27.973127 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8dd889b7-1b72-4e57-ad0f-85facbad8da4-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"8dd889b7-1b72-4e57-ad0f-85facbad8da4\") " pod="openstack/openstack-cell1-galera-0" Feb 20 10:12:27 crc kubenswrapper[4962]: I0220 10:12:27.973153 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ps9mx\" (UniqueName: 
\"kubernetes.io/projected/8dd889b7-1b72-4e57-ad0f-85facbad8da4-kube-api-access-ps9mx\") pod \"openstack-cell1-galera-0\" (UID: \"8dd889b7-1b72-4e57-ad0f-85facbad8da4\") " pod="openstack/openstack-cell1-galera-0" Feb 20 10:12:27 crc kubenswrapper[4962]: I0220 10:12:27.973177 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8dd889b7-1b72-4e57-ad0f-85facbad8da4-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"8dd889b7-1b72-4e57-ad0f-85facbad8da4\") " pod="openstack/openstack-cell1-galera-0" Feb 20 10:12:27 crc kubenswrapper[4962]: I0220 10:12:27.973193 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8dd889b7-1b72-4e57-ad0f-85facbad8da4-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"8dd889b7-1b72-4e57-ad0f-85facbad8da4\") " pod="openstack/openstack-cell1-galera-0" Feb 20 10:12:27 crc kubenswrapper[4962]: I0220 10:12:27.974560 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8dd889b7-1b72-4e57-ad0f-85facbad8da4-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"8dd889b7-1b72-4e57-ad0f-85facbad8da4\") " pod="openstack/openstack-cell1-galera-0" Feb 20 10:12:27 crc kubenswrapper[4962]: I0220 10:12:27.975028 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8dd889b7-1b72-4e57-ad0f-85facbad8da4-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"8dd889b7-1b72-4e57-ad0f-85facbad8da4\") " pod="openstack/openstack-cell1-galera-0" Feb 20 10:12:27 crc kubenswrapper[4962]: I0220 10:12:27.975045 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8dd889b7-1b72-4e57-ad0f-85facbad8da4-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"8dd889b7-1b72-4e57-ad0f-85facbad8da4\") " pod="openstack/openstack-cell1-galera-0" Feb 20 10:12:27 crc kubenswrapper[4962]: I0220 10:12:27.975661 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8dd889b7-1b72-4e57-ad0f-85facbad8da4-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"8dd889b7-1b72-4e57-ad0f-85facbad8da4\") " pod="openstack/openstack-cell1-galera-0" Feb 20 10:12:27 crc kubenswrapper[4962]: I0220 10:12:27.976361 4962 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"8dd889b7-1b72-4e57-ad0f-85facbad8da4\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/openstack-cell1-galera-0" Feb 20 10:12:27 crc kubenswrapper[4962]: I0220 10:12:27.983776 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8dd889b7-1b72-4e57-ad0f-85facbad8da4-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"8dd889b7-1b72-4e57-ad0f-85facbad8da4\") " pod="openstack/openstack-cell1-galera-0" Feb 20 10:12:27 crc kubenswrapper[4962]: I0220 10:12:27.991424 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dd889b7-1b72-4e57-ad0f-85facbad8da4-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: 
\"8dd889b7-1b72-4e57-ad0f-85facbad8da4\") " pod="openstack/openstack-cell1-galera-0" Feb 20 10:12:27 crc kubenswrapper[4962]: I0220 10:12:27.993837 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ps9mx\" (UniqueName: \"kubernetes.io/projected/8dd889b7-1b72-4e57-ad0f-85facbad8da4-kube-api-access-ps9mx\") pod \"openstack-cell1-galera-0\" (UID: \"8dd889b7-1b72-4e57-ad0f-85facbad8da4\") " pod="openstack/openstack-cell1-galera-0" Feb 20 10:12:28 crc kubenswrapper[4962]: I0220 10:12:28.026953 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"8dd889b7-1b72-4e57-ad0f-85facbad8da4\") " pod="openstack/openstack-cell1-galera-0" Feb 20 10:12:28 crc kubenswrapper[4962]: I0220 10:12:28.075321 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 20 10:12:28 crc kubenswrapper[4962]: I0220 10:12:28.083689 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Feb 20 10:12:28 crc kubenswrapper[4962]: I0220 10:12:28.088046 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 20 10:12:28 crc kubenswrapper[4962]: I0220 10:12:28.090842 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Feb 20 10:12:28 crc kubenswrapper[4962]: I0220 10:12:28.091141 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-njzvl" Feb 20 10:12:28 crc kubenswrapper[4962]: I0220 10:12:28.091512 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Feb 20 10:12:28 crc kubenswrapper[4962]: I0220 10:12:28.094285 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 20 10:12:28 crc kubenswrapper[4962]: I0220 10:12:28.176622 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65mll\" (UniqueName: \"kubernetes.io/projected/b22a9e86-ccdf-4505-8116-21b0230943fc-kube-api-access-65mll\") pod \"memcached-0\" (UID: \"b22a9e86-ccdf-4505-8116-21b0230943fc\") " pod="openstack/memcached-0" Feb 20 10:12:28 crc kubenswrapper[4962]: I0220 10:12:28.176690 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b22a9e86-ccdf-4505-8116-21b0230943fc-config-data\") pod \"memcached-0\" (UID: \"b22a9e86-ccdf-4505-8116-21b0230943fc\") " pod="openstack/memcached-0" Feb 20 10:12:28 crc kubenswrapper[4962]: I0220 10:12:28.176726 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/b22a9e86-ccdf-4505-8116-21b0230943fc-memcached-tls-certs\") pod \"memcached-0\" (UID: \"b22a9e86-ccdf-4505-8116-21b0230943fc\") " pod="openstack/memcached-0" Feb 20 10:12:28 crc kubenswrapper[4962]: I0220 10:12:28.176755 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b22a9e86-ccdf-4505-8116-21b0230943fc-kolla-config\") pod \"memcached-0\" (UID: \"b22a9e86-ccdf-4505-8116-21b0230943fc\") " pod="openstack/memcached-0" Feb 20 10:12:28 crc kubenswrapper[4962]: I0220 10:12:28.176790 4962 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b22a9e86-ccdf-4505-8116-21b0230943fc-combined-ca-bundle\") pod \"memcached-0\" (UID: \"b22a9e86-ccdf-4505-8116-21b0230943fc\") " pod="openstack/memcached-0" Feb 20 10:12:28 crc kubenswrapper[4962]: I0220 10:12:28.282190 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b22a9e86-ccdf-4505-8116-21b0230943fc-combined-ca-bundle\") pod \"memcached-0\" (UID: \"b22a9e86-ccdf-4505-8116-21b0230943fc\") " pod="openstack/memcached-0" Feb 20 10:12:28 crc kubenswrapper[4962]: I0220 10:12:28.282348 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65mll\" (UniqueName: \"kubernetes.io/projected/b22a9e86-ccdf-4505-8116-21b0230943fc-kube-api-access-65mll\") pod \"memcached-0\" (UID: \"b22a9e86-ccdf-4505-8116-21b0230943fc\") " pod="openstack/memcached-0" Feb 20 10:12:28 crc kubenswrapper[4962]: I0220 10:12:28.282404 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b22a9e86-ccdf-4505-8116-21b0230943fc-config-data\") pod \"memcached-0\" (UID: \"b22a9e86-ccdf-4505-8116-21b0230943fc\") " pod="openstack/memcached-0" Feb 20 10:12:28 crc kubenswrapper[4962]: I0220 10:12:28.282491 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/b22a9e86-ccdf-4505-8116-21b0230943fc-memcached-tls-certs\") pod \"memcached-0\" (UID: \"b22a9e86-ccdf-4505-8116-21b0230943fc\") " pod="openstack/memcached-0" Feb 20 10:12:28 crc kubenswrapper[4962]: I0220 10:12:28.282527 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b22a9e86-ccdf-4505-8116-21b0230943fc-kolla-config\") pod \"memcached-0\" (UID: \"b22a9e86-ccdf-4505-8116-21b0230943fc\") " pod="openstack/memcached-0" Feb 20 10:12:28 crc kubenswrapper[4962]: I0220 10:12:28.283387 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b22a9e86-ccdf-4505-8116-21b0230943fc-kolla-config\") pod \"memcached-0\" (UID: \"b22a9e86-ccdf-4505-8116-21b0230943fc\") " pod="openstack/memcached-0" Feb 20 10:12:28 crc kubenswrapper[4962]: I0220 10:12:28.283923 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b22a9e86-ccdf-4505-8116-21b0230943fc-config-data\") pod \"memcached-0\" (UID: \"b22a9e86-ccdf-4505-8116-21b0230943fc\") " pod="openstack/memcached-0" Feb 20 10:12:28 crc kubenswrapper[4962]: I0220 10:12:28.300669 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/b22a9e86-ccdf-4505-8116-21b0230943fc-memcached-tls-certs\") pod \"memcached-0\" (UID: \"b22a9e86-ccdf-4505-8116-21b0230943fc\") " pod="openstack/memcached-0" Feb 20 10:12:28 crc kubenswrapper[4962]: I0220 10:12:28.303654 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65mll\" (UniqueName: \"kubernetes.io/projected/b22a9e86-ccdf-4505-8116-21b0230943fc-kube-api-access-65mll\") pod \"memcached-0\" (UID: \"b22a9e86-ccdf-4505-8116-21b0230943fc\") " pod="openstack/memcached-0" Feb 20 10:12:28 crc kubenswrapper[4962]: I0220 
10:12:28.307709 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b22a9e86-ccdf-4505-8116-21b0230943fc-combined-ca-bundle\") pod \"memcached-0\" (UID: \"b22a9e86-ccdf-4505-8116-21b0230943fc\") " pod="openstack/memcached-0" Feb 20 10:12:28 crc kubenswrapper[4962]: I0220 10:12:28.416053 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 20 10:12:30 crc kubenswrapper[4962]: I0220 10:12:30.245897 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 20 10:12:30 crc kubenswrapper[4962]: I0220 10:12:30.247176 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 20 10:12:30 crc kubenswrapper[4962]: I0220 10:12:30.253092 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-5f72b" Feb 20 10:12:30 crc kubenswrapper[4962]: I0220 10:12:30.256143 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 20 10:12:30 crc kubenswrapper[4962]: I0220 10:12:30.336498 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqfr2\" (UniqueName: \"kubernetes.io/projected/b7df7b95-a5ed-4e4e-81f0-9f718bab0bcc-kube-api-access-kqfr2\") pod \"kube-state-metrics-0\" (UID: \"b7df7b95-a5ed-4e4e-81f0-9f718bab0bcc\") " pod="openstack/kube-state-metrics-0" Feb 20 10:12:30 crc kubenswrapper[4962]: I0220 10:12:30.438811 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqfr2\" (UniqueName: \"kubernetes.io/projected/b7df7b95-a5ed-4e4e-81f0-9f718bab0bcc-kube-api-access-kqfr2\") pod \"kube-state-metrics-0\" (UID: \"b7df7b95-a5ed-4e4e-81f0-9f718bab0bcc\") " pod="openstack/kube-state-metrics-0" Feb 20 10:12:30 crc kubenswrapper[4962]: I0220 10:12:30.461654 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqfr2\" (UniqueName: \"kubernetes.io/projected/b7df7b95-a5ed-4e4e-81f0-9f718bab0bcc-kube-api-access-kqfr2\") pod \"kube-state-metrics-0\" (UID: \"b7df7b95-a5ed-4e4e-81f0-9f718bab0bcc\") " pod="openstack/kube-state-metrics-0" Feb 20 10:12:30 crc kubenswrapper[4962]: I0220 10:12:30.575300 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 20 10:12:33 crc kubenswrapper[4962]: I0220 10:12:33.801760 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-wj9f6"] Feb 20 10:12:33 crc kubenswrapper[4962]: I0220 10:12:33.803302 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-wj9f6" Feb 20 10:12:33 crc kubenswrapper[4962]: I0220 10:12:33.805906 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-5c7mx" Feb 20 10:12:33 crc kubenswrapper[4962]: I0220 10:12:33.806076 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Feb 20 10:12:33 crc kubenswrapper[4962]: I0220 10:12:33.806768 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Feb 20 10:12:33 crc kubenswrapper[4962]: I0220 10:12:33.816680 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-wj9f6"] Feb 20 10:12:33 crc kubenswrapper[4962]: I0220 10:12:33.847905 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/383d4f1e-72b3-48ce-9427-0361c19e41fc-var-run\") pod \"ovn-controller-wj9f6\" (UID: \"383d4f1e-72b3-48ce-9427-0361c19e41fc\") " pod="openstack/ovn-controller-wj9f6" Feb 20 10:12:33 crc kubenswrapper[4962]: I0220 10:12:33.847975 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/383d4f1e-72b3-48ce-9427-0361c19e41fc-combined-ca-bundle\") pod \"ovn-controller-wj9f6\" (UID: \"383d4f1e-72b3-48ce-9427-0361c19e41fc\") " pod="openstack/ovn-controller-wj9f6" Feb 20 10:12:33 crc kubenswrapper[4962]: I0220 10:12:33.848006 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/383d4f1e-72b3-48ce-9427-0361c19e41fc-var-log-ovn\") pod \"ovn-controller-wj9f6\" (UID: \"383d4f1e-72b3-48ce-9427-0361c19e41fc\") " pod="openstack/ovn-controller-wj9f6" Feb 20 10:12:33 crc kubenswrapper[4962]: I0220 10:12:33.848046 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/383d4f1e-72b3-48ce-9427-0361c19e41fc-var-run-ovn\") pod \"ovn-controller-wj9f6\" (UID: \"383d4f1e-72b3-48ce-9427-0361c19e41fc\") " pod="openstack/ovn-controller-wj9f6" Feb 20 10:12:33 crc kubenswrapper[4962]: I0220 10:12:33.848124 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/383d4f1e-72b3-48ce-9427-0361c19e41fc-ovn-controller-tls-certs\") pod \"ovn-controller-wj9f6\" (UID: \"383d4f1e-72b3-48ce-9427-0361c19e41fc\") " pod="openstack/ovn-controller-wj9f6" Feb 20 10:12:33 crc kubenswrapper[4962]: I0220 10:12:33.848180 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l52h9\" (UniqueName: \"kubernetes.io/projected/383d4f1e-72b3-48ce-9427-0361c19e41fc-kube-api-access-l52h9\") pod \"ovn-controller-wj9f6\" (UID: \"383d4f1e-72b3-48ce-9427-0361c19e41fc\") " pod="openstack/ovn-controller-wj9f6" Feb 20 10:12:33 crc kubenswrapper[4962]: I0220 10:12:33.848208 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/383d4f1e-72b3-48ce-9427-0361c19e41fc-scripts\") pod \"ovn-controller-wj9f6\" (UID: \"383d4f1e-72b3-48ce-9427-0361c19e41fc\") " pod="openstack/ovn-controller-wj9f6" Feb 20 10:12:33 crc kubenswrapper[4962]: I0220 10:12:33.920876 4962 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-r7g9h"] Feb 20 10:12:33 crc kubenswrapper[4962]: I0220 10:12:33.926540 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-r7g9h" Feb 20 10:12:33 crc kubenswrapper[4962]: I0220 10:12:33.944136 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-r7g9h"] Feb 20 10:12:33 crc kubenswrapper[4962]: I0220 10:12:33.949839 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/383d4f1e-72b3-48ce-9427-0361c19e41fc-var-run\") pod \"ovn-controller-wj9f6\" (UID: \"383d4f1e-72b3-48ce-9427-0361c19e41fc\") " pod="openstack/ovn-controller-wj9f6" Feb 20 10:12:33 crc kubenswrapper[4962]: I0220 10:12:33.949886 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/383d4f1e-72b3-48ce-9427-0361c19e41fc-combined-ca-bundle\") pod \"ovn-controller-wj9f6\" (UID: \"383d4f1e-72b3-48ce-9427-0361c19e41fc\") " pod="openstack/ovn-controller-wj9f6" Feb 20 10:12:33 crc kubenswrapper[4962]: I0220 10:12:33.949909 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/383d4f1e-72b3-48ce-9427-0361c19e41fc-var-log-ovn\") pod \"ovn-controller-wj9f6\" (UID: \"383d4f1e-72b3-48ce-9427-0361c19e41fc\") " pod="openstack/ovn-controller-wj9f6" Feb 20 10:12:33 crc kubenswrapper[4962]: I0220 10:12:33.949931 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/383d4f1e-72b3-48ce-9427-0361c19e41fc-var-run-ovn\") pod \"ovn-controller-wj9f6\" (UID: \"383d4f1e-72b3-48ce-9427-0361c19e41fc\") " pod="openstack/ovn-controller-wj9f6" Feb 20 10:12:33 crc kubenswrapper[4962]: I0220 10:12:33.949953 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/383d4f1e-72b3-48ce-9427-0361c19e41fc-ovn-controller-tls-certs\") pod \"ovn-controller-wj9f6\" (UID: \"383d4f1e-72b3-48ce-9427-0361c19e41fc\") " pod="openstack/ovn-controller-wj9f6" Feb 20 10:12:33 crc kubenswrapper[4962]: I0220 10:12:33.949986 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l52h9\" (UniqueName: \"kubernetes.io/projected/383d4f1e-72b3-48ce-9427-0361c19e41fc-kube-api-access-l52h9\") pod \"ovn-controller-wj9f6\" (UID: \"383d4f1e-72b3-48ce-9427-0361c19e41fc\") " pod="openstack/ovn-controller-wj9f6" Feb 20 10:12:33 crc kubenswrapper[4962]: I0220 10:12:33.950002 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/383d4f1e-72b3-48ce-9427-0361c19e41fc-scripts\") pod \"ovn-controller-wj9f6\" (UID: \"383d4f1e-72b3-48ce-9427-0361c19e41fc\") " pod="openstack/ovn-controller-wj9f6" Feb 20 10:12:33 crc kubenswrapper[4962]: I0220 10:12:33.952036 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/383d4f1e-72b3-48ce-9427-0361c19e41fc-scripts\") pod \"ovn-controller-wj9f6\" (UID: \"383d4f1e-72b3-48ce-9427-0361c19e41fc\") " pod="openstack/ovn-controller-wj9f6" Feb 20 10:12:33 crc kubenswrapper[4962]: I0220 10:12:33.952044 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: 
\"kubernetes.io/host-path/383d4f1e-72b3-48ce-9427-0361c19e41fc-var-log-ovn\") pod \"ovn-controller-wj9f6\" (UID: \"383d4f1e-72b3-48ce-9427-0361c19e41fc\") " pod="openstack/ovn-controller-wj9f6" Feb 20 10:12:33 crc kubenswrapper[4962]: I0220 10:12:33.952219 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/383d4f1e-72b3-48ce-9427-0361c19e41fc-var-run-ovn\") pod \"ovn-controller-wj9f6\" (UID: \"383d4f1e-72b3-48ce-9427-0361c19e41fc\") " pod="openstack/ovn-controller-wj9f6" Feb 20 10:12:33 crc kubenswrapper[4962]: I0220 10:12:33.952333 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/383d4f1e-72b3-48ce-9427-0361c19e41fc-var-run\") pod \"ovn-controller-wj9f6\" (UID: \"383d4f1e-72b3-48ce-9427-0361c19e41fc\") " pod="openstack/ovn-controller-wj9f6" Feb 20 10:12:33 crc kubenswrapper[4962]: I0220 10:12:33.960162 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 20 10:12:33 crc kubenswrapper[4962]: I0220 10:12:33.962035 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 20 10:12:33 crc kubenswrapper[4962]: I0220 10:12:33.964898 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Feb 20 10:12:33 crc kubenswrapper[4962]: I0220 10:12:33.965645 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/383d4f1e-72b3-48ce-9427-0361c19e41fc-combined-ca-bundle\") pod \"ovn-controller-wj9f6\" (UID: \"383d4f1e-72b3-48ce-9427-0361c19e41fc\") " pod="openstack/ovn-controller-wj9f6" Feb 20 10:12:33 crc kubenswrapper[4962]: I0220 10:12:33.966224 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Feb 20 10:12:33 crc kubenswrapper[4962]: I0220 10:12:33.966458 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-gw5b6" Feb 20 10:12:33 crc kubenswrapper[4962]: I0220 10:12:33.966608 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Feb 20 10:12:33 crc kubenswrapper[4962]: I0220 10:12:33.966751 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Feb 20 10:12:33 crc kubenswrapper[4962]: I0220 10:12:33.968848 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 20 10:12:33 crc kubenswrapper[4962]: I0220 10:12:33.973219 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l52h9\" (UniqueName: \"kubernetes.io/projected/383d4f1e-72b3-48ce-9427-0361c19e41fc-kube-api-access-l52h9\") pod \"ovn-controller-wj9f6\" (UID: \"383d4f1e-72b3-48ce-9427-0361c19e41fc\") " pod="openstack/ovn-controller-wj9f6" Feb 20 10:12:33 crc kubenswrapper[4962]: I0220 10:12:33.986455 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/383d4f1e-72b3-48ce-9427-0361c19e41fc-ovn-controller-tls-certs\") pod \"ovn-controller-wj9f6\" (UID: \"383d4f1e-72b3-48ce-9427-0361c19e41fc\") " pod="openstack/ovn-controller-wj9f6" Feb 20 10:12:34 crc kubenswrapper[4962]: I0220 10:12:34.051489 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/801fa82d-0f57-4af2-9eec-b6cddac658ab-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"801fa82d-0f57-4af2-9eec-b6cddac658ab\") " pod="openstack/ovsdbserver-nb-0" Feb 20 10:12:34 crc kubenswrapper[4962]: I0220 10:12:34.051548 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/801fa82d-0f57-4af2-9eec-b6cddac658ab-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"801fa82d-0f57-4af2-9eec-b6cddac658ab\") " pod="openstack/ovsdbserver-nb-0" Feb 20 10:12:34 crc kubenswrapper[4962]: I0220 10:12:34.051578 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"801fa82d-0f57-4af2-9eec-b6cddac658ab\") " pod="openstack/ovsdbserver-nb-0" Feb 20 10:12:34 crc kubenswrapper[4962]: I0220 10:12:34.051627 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/8e8425d5-32be-4726-915a-3de5c70f0f62-var-lib\") pod \"ovn-controller-ovs-r7g9h\" (UID: \"8e8425d5-32be-4726-915a-3de5c70f0f62\") " pod="openstack/ovn-controller-ovs-r7g9h" Feb 20 10:12:34 crc kubenswrapper[4962]: I0220 10:12:34.051666 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/801fa82d-0f57-4af2-9eec-b6cddac658ab-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"801fa82d-0f57-4af2-9eec-b6cddac658ab\") " pod="openstack/ovsdbserver-nb-0" Feb 20 10:12:34 crc kubenswrapper[4962]: I0220 10:12:34.051685 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8e8425d5-32be-4726-915a-3de5c70f0f62-var-run\") pod \"ovn-controller-ovs-r7g9h\" (UID: \"8e8425d5-32be-4726-915a-3de5c70f0f62\") " pod="openstack/ovn-controller-ovs-r7g9h" Feb 20 10:12:34 crc kubenswrapper[4962]: I0220 10:12:34.051704 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/8e8425d5-32be-4726-915a-3de5c70f0f62-var-log\") pod \"ovn-controller-ovs-r7g9h\" (UID: \"8e8425d5-32be-4726-915a-3de5c70f0f62\") " pod="openstack/ovn-controller-ovs-r7g9h" Feb 20 10:12:34 crc kubenswrapper[4962]: I0220 10:12:34.051734 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z44jc\" (UniqueName: \"kubernetes.io/projected/8e8425d5-32be-4726-915a-3de5c70f0f62-kube-api-access-z44jc\") pod \"ovn-controller-ovs-r7g9h\" (UID: \"8e8425d5-32be-4726-915a-3de5c70f0f62\") " pod="openstack/ovn-controller-ovs-r7g9h" Feb 20 10:12:34 crc kubenswrapper[4962]: I0220 10:12:34.051752 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8e8425d5-32be-4726-915a-3de5c70f0f62-scripts\") pod \"ovn-controller-ovs-r7g9h\" (UID: \"8e8425d5-32be-4726-915a-3de5c70f0f62\") " pod="openstack/ovn-controller-ovs-r7g9h" Feb 20 10:12:34 crc kubenswrapper[4962]: I0220 10:12:34.051775 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/801fa82d-0f57-4af2-9eec-b6cddac658ab-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"801fa82d-0f57-4af2-9eec-b6cddac658ab\") " pod="openstack/ovsdbserver-nb-0" Feb 20 10:12:34 crc kubenswrapper[4962]: I0220 10:12:34.051797 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/8e8425d5-32be-4726-915a-3de5c70f0f62-etc-ovs\") pod \"ovn-controller-ovs-r7g9h\" (UID: \"8e8425d5-32be-4726-915a-3de5c70f0f62\") " pod="openstack/ovn-controller-ovs-r7g9h" Feb 20 10:12:34 crc kubenswrapper[4962]: I0220 10:12:34.051820 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/801fa82d-0f57-4af2-9eec-b6cddac658ab-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"801fa82d-0f57-4af2-9eec-b6cddac658ab\") " pod="openstack/ovsdbserver-nb-0" Feb 20 10:12:34 crc kubenswrapper[4962]: I0220 10:12:34.051841 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bd882\" (UniqueName: \"kubernetes.io/projected/801fa82d-0f57-4af2-9eec-b6cddac658ab-kube-api-access-bd882\") pod \"ovsdbserver-nb-0\" (UID: \"801fa82d-0f57-4af2-9eec-b6cddac658ab\") " pod="openstack/ovsdbserver-nb-0" Feb 20 10:12:34 crc kubenswrapper[4962]: I0220 10:12:34.051882 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/801fa82d-0f57-4af2-9eec-b6cddac658ab-config\") pod \"ovsdbserver-nb-0\" (UID: \"801fa82d-0f57-4af2-9eec-b6cddac658ab\") " pod="openstack/ovsdbserver-nb-0" Feb 20 10:12:34 crc kubenswrapper[4962]: I0220 10:12:34.083233 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"6e766bfd-869d-43ca-bf11-cf4ec9fa253a","Type":"ContainerStarted","Data":"9629cf6fabd95f146380c31c7bc910c7de73918acc62bb7e7fbe72c4774cfa18"} Feb 20 10:12:34 crc kubenswrapper[4962]: I0220 10:12:34.154082 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/8e8425d5-32be-4726-915a-3de5c70f0f62-var-lib\") pod \"ovn-controller-ovs-r7g9h\" (UID: \"8e8425d5-32be-4726-915a-3de5c70f0f62\") " pod="openstack/ovn-controller-ovs-r7g9h" Feb 20 10:12:34 crc kubenswrapper[4962]: I0220 10:12:34.154657 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/801fa82d-0f57-4af2-9eec-b6cddac658ab-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"801fa82d-0f57-4af2-9eec-b6cddac658ab\") " pod="openstack/ovsdbserver-nb-0" Feb 20 10:12:34 crc kubenswrapper[4962]: I0220 10:12:34.154682 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8e8425d5-32be-4726-915a-3de5c70f0f62-var-run\") pod \"ovn-controller-ovs-r7g9h\" (UID: \"8e8425d5-32be-4726-915a-3de5c70f0f62\") " pod="openstack/ovn-controller-ovs-r7g9h" Feb 20 10:12:34 crc kubenswrapper[4962]: I0220 10:12:34.154706 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/8e8425d5-32be-4726-915a-3de5c70f0f62-var-log\") pod \"ovn-controller-ovs-r7g9h\" (UID: \"8e8425d5-32be-4726-915a-3de5c70f0f62\") " pod="openstack/ovn-controller-ovs-r7g9h" Feb 20 10:12:34 crc kubenswrapper[4962]: I0220 10:12:34.154744 4962 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z44jc\" (UniqueName: \"kubernetes.io/projected/8e8425d5-32be-4726-915a-3de5c70f0f62-kube-api-access-z44jc\") pod \"ovn-controller-ovs-r7g9h\" (UID: \"8e8425d5-32be-4726-915a-3de5c70f0f62\") " pod="openstack/ovn-controller-ovs-r7g9h" Feb 20 10:12:34 crc kubenswrapper[4962]: I0220 10:12:34.154769 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8e8425d5-32be-4726-915a-3de5c70f0f62-scripts\") pod \"ovn-controller-ovs-r7g9h\" (UID: \"8e8425d5-32be-4726-915a-3de5c70f0f62\") " pod="openstack/ovn-controller-ovs-r7g9h" Feb 20 10:12:34 crc kubenswrapper[4962]: I0220 10:12:34.154798 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/801fa82d-0f57-4af2-9eec-b6cddac658ab-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"801fa82d-0f57-4af2-9eec-b6cddac658ab\") " pod="openstack/ovsdbserver-nb-0" Feb 20 10:12:34 crc kubenswrapper[4962]: I0220 10:12:34.154830 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/8e8425d5-32be-4726-915a-3de5c70f0f62-etc-ovs\") pod \"ovn-controller-ovs-r7g9h\" (UID: \"8e8425d5-32be-4726-915a-3de5c70f0f62\") " pod="openstack/ovn-controller-ovs-r7g9h" Feb 20 10:12:34 crc kubenswrapper[4962]: I0220 10:12:34.154852 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/801fa82d-0f57-4af2-9eec-b6cddac658ab-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"801fa82d-0f57-4af2-9eec-b6cddac658ab\") " pod="openstack/ovsdbserver-nb-0" Feb 20 10:12:34 crc kubenswrapper[4962]: I0220 10:12:34.154883 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bd882\" (UniqueName: \"kubernetes.io/projected/801fa82d-0f57-4af2-9eec-b6cddac658ab-kube-api-access-bd882\") pod \"ovsdbserver-nb-0\" (UID: \"801fa82d-0f57-4af2-9eec-b6cddac658ab\") " pod="openstack/ovsdbserver-nb-0" Feb 20 10:12:34 crc kubenswrapper[4962]: I0220 10:12:34.154940 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/801fa82d-0f57-4af2-9eec-b6cddac658ab-config\") pod \"ovsdbserver-nb-0\" (UID: \"801fa82d-0f57-4af2-9eec-b6cddac658ab\") " pod="openstack/ovsdbserver-nb-0" Feb 20 10:12:34 crc kubenswrapper[4962]: I0220 10:12:34.154977 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/801fa82d-0f57-4af2-9eec-b6cddac658ab-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"801fa82d-0f57-4af2-9eec-b6cddac658ab\") " pod="openstack/ovsdbserver-nb-0" Feb 20 10:12:34 crc kubenswrapper[4962]: I0220 10:12:34.155020 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/801fa82d-0f57-4af2-9eec-b6cddac658ab-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"801fa82d-0f57-4af2-9eec-b6cddac658ab\") " pod="openstack/ovsdbserver-nb-0" Feb 20 10:12:34 crc kubenswrapper[4962]: I0220 10:12:34.155037 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: 
\"801fa82d-0f57-4af2-9eec-b6cddac658ab\") " pod="openstack/ovsdbserver-nb-0" Feb 20 10:12:34 crc kubenswrapper[4962]: I0220 10:12:34.155491 4962 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"801fa82d-0f57-4af2-9eec-b6cddac658ab\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/ovsdbserver-nb-0" Feb 20 10:12:34 crc kubenswrapper[4962]: I0220 10:12:34.156035 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/8e8425d5-32be-4726-915a-3de5c70f0f62-var-lib\") pod \"ovn-controller-ovs-r7g9h\" (UID: \"8e8425d5-32be-4726-915a-3de5c70f0f62\") " pod="openstack/ovn-controller-ovs-r7g9h" Feb 20 10:12:34 crc kubenswrapper[4962]: I0220 10:12:34.157117 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/801fa82d-0f57-4af2-9eec-b6cddac658ab-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"801fa82d-0f57-4af2-9eec-b6cddac658ab\") " pod="openstack/ovsdbserver-nb-0" Feb 20 10:12:34 crc kubenswrapper[4962]: I0220 10:12:34.157236 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8e8425d5-32be-4726-915a-3de5c70f0f62-var-run\") pod \"ovn-controller-ovs-r7g9h\" (UID: \"8e8425d5-32be-4726-915a-3de5c70f0f62\") " pod="openstack/ovn-controller-ovs-r7g9h" Feb 20 10:12:34 crc kubenswrapper[4962]: I0220 10:12:34.157329 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/8e8425d5-32be-4726-915a-3de5c70f0f62-var-log\") pod \"ovn-controller-ovs-r7g9h\" (UID: \"8e8425d5-32be-4726-915a-3de5c70f0f62\") " pod="openstack/ovn-controller-ovs-r7g9h" Feb 20 10:12:34 crc kubenswrapper[4962]: I0220 10:12:34.160230 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/801fa82d-0f57-4af2-9eec-b6cddac658ab-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"801fa82d-0f57-4af2-9eec-b6cddac658ab\") " pod="openstack/ovsdbserver-nb-0" Feb 20 10:12:34 crc kubenswrapper[4962]: I0220 10:12:34.160635 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/8e8425d5-32be-4726-915a-3de5c70f0f62-etc-ovs\") pod \"ovn-controller-ovs-r7g9h\" (UID: \"8e8425d5-32be-4726-915a-3de5c70f0f62\") " pod="openstack/ovn-controller-ovs-r7g9h" Feb 20 10:12:34 crc kubenswrapper[4962]: I0220 10:12:34.162859 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/801fa82d-0f57-4af2-9eec-b6cddac658ab-config\") pod \"ovsdbserver-nb-0\" (UID: \"801fa82d-0f57-4af2-9eec-b6cddac658ab\") " pod="openstack/ovsdbserver-nb-0" Feb 20 10:12:34 crc kubenswrapper[4962]: I0220 10:12:34.168702 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/801fa82d-0f57-4af2-9eec-b6cddac658ab-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"801fa82d-0f57-4af2-9eec-b6cddac658ab\") " pod="openstack/ovsdbserver-nb-0" Feb 20 10:12:34 crc kubenswrapper[4962]: I0220 10:12:34.168981 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8e8425d5-32be-4726-915a-3de5c70f0f62-scripts\") pod 
\"ovn-controller-ovs-r7g9h\" (UID: \"8e8425d5-32be-4726-915a-3de5c70f0f62\") " pod="openstack/ovn-controller-ovs-r7g9h" Feb 20 10:12:34 crc kubenswrapper[4962]: I0220 10:12:34.171187 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/801fa82d-0f57-4af2-9eec-b6cddac658ab-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"801fa82d-0f57-4af2-9eec-b6cddac658ab\") " pod="openstack/ovsdbserver-nb-0" Feb 20 10:12:34 crc kubenswrapper[4962]: I0220 10:12:34.171277 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-wj9f6" Feb 20 10:12:34 crc kubenswrapper[4962]: I0220 10:12:34.171709 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/801fa82d-0f57-4af2-9eec-b6cddac658ab-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"801fa82d-0f57-4af2-9eec-b6cddac658ab\") " pod="openstack/ovsdbserver-nb-0" Feb 20 10:12:34 crc kubenswrapper[4962]: I0220 10:12:34.176228 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z44jc\" (UniqueName: \"kubernetes.io/projected/8e8425d5-32be-4726-915a-3de5c70f0f62-kube-api-access-z44jc\") pod \"ovn-controller-ovs-r7g9h\" (UID: \"8e8425d5-32be-4726-915a-3de5c70f0f62\") " pod="openstack/ovn-controller-ovs-r7g9h" Feb 20 10:12:34 crc kubenswrapper[4962]: I0220 10:12:34.180651 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bd882\" (UniqueName: \"kubernetes.io/projected/801fa82d-0f57-4af2-9eec-b6cddac658ab-kube-api-access-bd882\") pod \"ovsdbserver-nb-0\" (UID: \"801fa82d-0f57-4af2-9eec-b6cddac658ab\") " pod="openstack/ovsdbserver-nb-0" Feb 20 10:12:34 crc kubenswrapper[4962]: I0220 10:12:34.183928 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"801fa82d-0f57-4af2-9eec-b6cddac658ab\") " pod="openstack/ovsdbserver-nb-0" Feb 20 10:12:34 crc kubenswrapper[4962]: I0220 10:12:34.255982 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-r7g9h" Feb 20 10:12:34 crc kubenswrapper[4962]: I0220 10:12:34.323548 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 20 10:12:37 crc kubenswrapper[4962]: I0220 10:12:37.926919 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 20 10:12:37 crc kubenswrapper[4962]: I0220 10:12:37.930492 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 20 10:12:37 crc kubenswrapper[4962]: I0220 10:12:37.936315 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Feb 20 10:12:37 crc kubenswrapper[4962]: I0220 10:12:37.936454 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-58l5v" Feb 20 10:12:37 crc kubenswrapper[4962]: I0220 10:12:37.937288 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Feb 20 10:12:37 crc kubenswrapper[4962]: I0220 10:12:37.946140 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Feb 20 10:12:37 crc kubenswrapper[4962]: I0220 10:12:37.961214 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 20 10:12:38 crc kubenswrapper[4962]: I0220 10:12:38.130774 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/719faf26-7700-4eff-9dca-0a4ec3c51344-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"719faf26-7700-4eff-9dca-0a4ec3c51344\") " pod="openstack/ovsdbserver-sb-0" Feb 20 10:12:38 crc kubenswrapper[4962]: I0220 10:12:38.130887 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/719faf26-7700-4eff-9dca-0a4ec3c51344-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"719faf26-7700-4eff-9dca-0a4ec3c51344\") " pod="openstack/ovsdbserver-sb-0" Feb 20 10:12:38 crc kubenswrapper[4962]: I0220 10:12:38.131286 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/719faf26-7700-4eff-9dca-0a4ec3c51344-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"719faf26-7700-4eff-9dca-0a4ec3c51344\") " pod="openstack/ovsdbserver-sb-0" Feb 20 10:12:38 crc kubenswrapper[4962]: I0220 10:12:38.131733 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsrg9\" (UniqueName: \"kubernetes.io/projected/719faf26-7700-4eff-9dca-0a4ec3c51344-kube-api-access-qsrg9\") pod \"ovsdbserver-sb-0\" (UID: \"719faf26-7700-4eff-9dca-0a4ec3c51344\") " pod="openstack/ovsdbserver-sb-0" Feb 20 10:12:38 crc kubenswrapper[4962]: I0220 10:12:38.132043 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/719faf26-7700-4eff-9dca-0a4ec3c51344-config\") pod \"ovsdbserver-sb-0\" (UID: \"719faf26-7700-4eff-9dca-0a4ec3c51344\") " pod="openstack/ovsdbserver-sb-0" Feb 20 10:12:38 crc kubenswrapper[4962]: I0220 10:12:38.132173 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"719faf26-7700-4eff-9dca-0a4ec3c51344\") " pod="openstack/ovsdbserver-sb-0" Feb 20 10:12:38 crc kubenswrapper[4962]: I0220 10:12:38.132202 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/719faf26-7700-4eff-9dca-0a4ec3c51344-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"719faf26-7700-4eff-9dca-0a4ec3c51344\") " 
pod="openstack/ovsdbserver-sb-0" Feb 20 10:12:38 crc kubenswrapper[4962]: I0220 10:12:38.132377 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/719faf26-7700-4eff-9dca-0a4ec3c51344-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"719faf26-7700-4eff-9dca-0a4ec3c51344\") " pod="openstack/ovsdbserver-sb-0" Feb 20 10:12:38 crc kubenswrapper[4962]: I0220 10:12:38.234729 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/719faf26-7700-4eff-9dca-0a4ec3c51344-config\") pod \"ovsdbserver-sb-0\" (UID: \"719faf26-7700-4eff-9dca-0a4ec3c51344\") " pod="openstack/ovsdbserver-sb-0" Feb 20 10:12:38 crc kubenswrapper[4962]: I0220 10:12:38.234793 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"719faf26-7700-4eff-9dca-0a4ec3c51344\") " pod="openstack/ovsdbserver-sb-0" Feb 20 10:12:38 crc kubenswrapper[4962]: I0220 10:12:38.234817 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/719faf26-7700-4eff-9dca-0a4ec3c51344-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"719faf26-7700-4eff-9dca-0a4ec3c51344\") " pod="openstack/ovsdbserver-sb-0" Feb 20 10:12:38 crc kubenswrapper[4962]: I0220 10:12:38.234873 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/719faf26-7700-4eff-9dca-0a4ec3c51344-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"719faf26-7700-4eff-9dca-0a4ec3c51344\") " pod="openstack/ovsdbserver-sb-0" Feb 20 10:12:38 crc kubenswrapper[4962]: I0220 10:12:38.234916 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/719faf26-7700-4eff-9dca-0a4ec3c51344-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"719faf26-7700-4eff-9dca-0a4ec3c51344\") " pod="openstack/ovsdbserver-sb-0" Feb 20 10:12:38 crc kubenswrapper[4962]: I0220 10:12:38.234948 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/719faf26-7700-4eff-9dca-0a4ec3c51344-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"719faf26-7700-4eff-9dca-0a4ec3c51344\") " pod="openstack/ovsdbserver-sb-0" Feb 20 10:12:38 crc kubenswrapper[4962]: I0220 10:12:38.234986 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/719faf26-7700-4eff-9dca-0a4ec3c51344-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"719faf26-7700-4eff-9dca-0a4ec3c51344\") " pod="openstack/ovsdbserver-sb-0" Feb 20 10:12:38 crc kubenswrapper[4962]: I0220 10:12:38.235012 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsrg9\" (UniqueName: \"kubernetes.io/projected/719faf26-7700-4eff-9dca-0a4ec3c51344-kube-api-access-qsrg9\") pod \"ovsdbserver-sb-0\" (UID: \"719faf26-7700-4eff-9dca-0a4ec3c51344\") " pod="openstack/ovsdbserver-sb-0" Feb 20 10:12:38 crc kubenswrapper[4962]: I0220 10:12:38.235515 4962 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"719faf26-7700-4eff-9dca-0a4ec3c51344\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/ovsdbserver-sb-0" Feb 20 10:12:38 crc kubenswrapper[4962]: I0220 10:12:38.235907 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/719faf26-7700-4eff-9dca-0a4ec3c51344-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"719faf26-7700-4eff-9dca-0a4ec3c51344\") " pod="openstack/ovsdbserver-sb-0" Feb 20 10:12:38 crc kubenswrapper[4962]: I0220 10:12:38.238469 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/719faf26-7700-4eff-9dca-0a4ec3c51344-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"719faf26-7700-4eff-9dca-0a4ec3c51344\") " pod="openstack/ovsdbserver-sb-0" Feb 20 10:12:38 crc kubenswrapper[4962]: I0220 10:12:38.239817 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/719faf26-7700-4eff-9dca-0a4ec3c51344-config\") pod \"ovsdbserver-sb-0\" (UID: \"719faf26-7700-4eff-9dca-0a4ec3c51344\") " pod="openstack/ovsdbserver-sb-0" Feb 20 10:12:38 crc kubenswrapper[4962]: I0220 10:12:38.249740 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/719faf26-7700-4eff-9dca-0a4ec3c51344-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"719faf26-7700-4eff-9dca-0a4ec3c51344\") " pod="openstack/ovsdbserver-sb-0" Feb 20 10:12:38 crc kubenswrapper[4962]: I0220 10:12:38.250061 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/719faf26-7700-4eff-9dca-0a4ec3c51344-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"719faf26-7700-4eff-9dca-0a4ec3c51344\") " pod="openstack/ovsdbserver-sb-0" Feb 20 10:12:38 crc kubenswrapper[4962]: I0220 10:12:38.254403 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/719faf26-7700-4eff-9dca-0a4ec3c51344-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"719faf26-7700-4eff-9dca-0a4ec3c51344\") " pod="openstack/ovsdbserver-sb-0" Feb 20 10:12:38 crc kubenswrapper[4962]: I0220 10:12:38.257207 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsrg9\" (UniqueName: \"kubernetes.io/projected/719faf26-7700-4eff-9dca-0a4ec3c51344-kube-api-access-qsrg9\") pod \"ovsdbserver-sb-0\" (UID: \"719faf26-7700-4eff-9dca-0a4ec3c51344\") " pod="openstack/ovsdbserver-sb-0" Feb 20 10:12:38 crc kubenswrapper[4962]: I0220 10:12:38.268995 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"719faf26-7700-4eff-9dca-0a4ec3c51344\") " pod="openstack/ovsdbserver-sb-0" Feb 20 10:12:38 crc kubenswrapper[4962]: I0220 10:12:38.277298 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 20 10:12:41 crc kubenswrapper[4962]: I0220 10:12:41.507843 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 10:12:41 crc kubenswrapper[4962]: I0220 10:12:41.508138 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 10:12:46 crc kubenswrapper[4962]: E0220 10:12:46.900721 4962 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:221c84e162c46ac7454de6fb84343d0a605f2ea1d7d5647a34a66569e0a8fd76" Feb 20 10:12:46 crc kubenswrapper[4962]: E0220 10:12:46.900721 4962 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:221c84e162c46ac7454de6fb84343d0a605f2ea1d7d5647a34a66569e0a8fd76" Feb 20 10:12:46 crc kubenswrapper[4962]: E0220 10:12:46.901370 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:221c84e162c46ac7454de6fb84343d0a605f2ea1d7d5647a34a66569e0a8fd76,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2hckp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cell1-server-0_openstack(56a77dd3-ef10-46a6-a00d-ab38af0d4338): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 20 10:12:46 crc kubenswrapper[4962]: E0220 10:12:46.901508 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:221c84e162c46ac7454de6fb84343d0a605f2ea1d7d5647a34a66569e0a8fd76,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rcvhk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_openstack(2a8d652d-aea8-4a83-b33e-0d2522af0be8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 20 10:12:46 crc kubenswrapper[4962]: E0220 10:12:46.903916 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-0" podUID="2a8d652d-aea8-4a83-b33e-0d2522af0be8" Feb 20 10:12:46 crc kubenswrapper[4962]: E0220 10:12:46.903921 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-cell1-server-0" podUID="56a77dd3-ef10-46a6-a00d-ab38af0d4338" Feb 20 10:12:47 crc kubenswrapper[4962]: E0220 10:12:47.216548 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:221c84e162c46ac7454de6fb84343d0a605f2ea1d7d5647a34a66569e0a8fd76\\\"\"" pod="openstack/rabbitmq-server-0" podUID="2a8d652d-aea8-4a83-b33e-0d2522af0be8" Feb 20 10:12:47 crc kubenswrapper[4962]: E0220 10:12:47.216646 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:221c84e162c46ac7454de6fb84343d0a605f2ea1d7d5647a34a66569e0a8fd76\\\"\"" pod="openstack/rabbitmq-cell1-server-0" podUID="56a77dd3-ef10-46a6-a00d-ab38af0d4338" 
Feb 20 10:12:47 crc kubenswrapper[4962]: E0220 10:12:47.843189 4962 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2" Feb 20 10:12:47 crc kubenswrapper[4962]: E0220 10:12:47.843471 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4m85l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-855cbc58c5-jg2pc_openstack(35f03c4f-de3b-4981-9e78-b8d1a1d171b5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 20 10:12:47 crc kubenswrapper[4962]: E0220 10:12:47.844769 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-855cbc58c5-jg2pc" podUID="35f03c4f-de3b-4981-9e78-b8d1a1d171b5" Feb 20 10:12:49 crc kubenswrapper[4962]: E0220 10:12:49.975458 4962 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2" Feb 20 10:12:49 crc kubenswrapper[4962]: E0220 10:12:49.976468 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4qd6h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-6fcf94d689-stkxf_openstack(d99be0c7-0310-4fa4-9426-63be765a9e85): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 20 10:12:49 crc kubenswrapper[4962]: E0220 10:12:49.978038 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-6fcf94d689-stkxf" podUID="d99be0c7-0310-4fa4-9426-63be765a9e85" Feb 20 10:12:49 crc kubenswrapper[4962]: E0220 10:12:49.999367 4962 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2" Feb 20 10:12:50 crc kubenswrapper[4962]: E0220 10:12:49.999819 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed 
--no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-prlr9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-67ff45466c-svjlj_openstack(b061854a-f0c6-4754-a947-a7d5408f25db): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 20 10:12:50 crc kubenswrapper[4962]: E0220 10:12:50.003798 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-67ff45466c-svjlj" podUID="b061854a-f0c6-4754-a947-a7d5408f25db" Feb 20 10:12:50 crc kubenswrapper[4962]: E0220 10:12:50.052433 4962 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2" Feb 20 10:12:50 crc kubenswrapper[4962]: E0220 10:12:50.052699 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5dbrr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-f54874ffc-2drvh_openstack(01d0cdce-fd47-471a-94af-ee68fed6a2aa): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 20 10:12:50 crc kubenswrapper[4962]: E0220 10:12:50.053994 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-f54874ffc-2drvh" podUID="01d0cdce-fd47-471a-94af-ee68fed6a2aa" Feb 20 10:12:50 crc kubenswrapper[4962]: I0220 10:12:50.148051 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-855cbc58c5-jg2pc" Feb 20 10:12:50 crc kubenswrapper[4962]: I0220 10:12:50.241180 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-855cbc58c5-jg2pc" event={"ID":"35f03c4f-de3b-4981-9e78-b8d1a1d171b5","Type":"ContainerDied","Data":"153e2efb6d99e57bdd8c71d555149537a37f6ac8ec26c3492f416c36ef39e106"} Feb 20 10:12:50 crc kubenswrapper[4962]: I0220 10:12:50.241248 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-855cbc58c5-jg2pc" Feb 20 10:12:50 crc kubenswrapper[4962]: E0220 10:12:50.242281 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2\\\"\"" pod="openstack/dnsmasq-dns-f54874ffc-2drvh" podUID="01d0cdce-fd47-471a-94af-ee68fed6a2aa" Feb 20 10:12:50 crc kubenswrapper[4962]: E0220 10:12:50.242764 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2\\\"\"" pod="openstack/dnsmasq-dns-67ff45466c-svjlj" podUID="b061854a-f0c6-4754-a947-a7d5408f25db" Feb 20 10:12:50 crc kubenswrapper[4962]: I0220 10:12:50.311135 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35f03c4f-de3b-4981-9e78-b8d1a1d171b5-config\") pod \"35f03c4f-de3b-4981-9e78-b8d1a1d171b5\" (UID: \"35f03c4f-de3b-4981-9e78-b8d1a1d171b5\") " Feb 20 10:12:50 crc kubenswrapper[4962]: I0220 10:12:50.311656 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4m85l\" (UniqueName: \"kubernetes.io/projected/35f03c4f-de3b-4981-9e78-b8d1a1d171b5-kube-api-access-4m85l\") pod \"35f03c4f-de3b-4981-9e78-b8d1a1d171b5\" (UID: \"35f03c4f-de3b-4981-9e78-b8d1a1d171b5\") " Feb 20 10:12:50 crc kubenswrapper[4962]: I0220 10:12:50.313126 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35f03c4f-de3b-4981-9e78-b8d1a1d171b5-config" (OuterVolumeSpecName: "config") pod "35f03c4f-de3b-4981-9e78-b8d1a1d171b5" (UID: "35f03c4f-de3b-4981-9e78-b8d1a1d171b5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:12:50 crc kubenswrapper[4962]: I0220 10:12:50.322257 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35f03c4f-de3b-4981-9e78-b8d1a1d171b5-kube-api-access-4m85l" (OuterVolumeSpecName: "kube-api-access-4m85l") pod "35f03c4f-de3b-4981-9e78-b8d1a1d171b5" (UID: "35f03c4f-de3b-4981-9e78-b8d1a1d171b5"). InnerVolumeSpecName "kube-api-access-4m85l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:12:50 crc kubenswrapper[4962]: I0220 10:12:50.413310 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35f03c4f-de3b-4981-9e78-b8d1a1d171b5-config\") on node \"crc\" DevicePath \"\"" Feb 20 10:12:50 crc kubenswrapper[4962]: I0220 10:12:50.413345 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4m85l\" (UniqueName: \"kubernetes.io/projected/35f03c4f-de3b-4981-9e78-b8d1a1d171b5-kube-api-access-4m85l\") on node \"crc\" DevicePath \"\"" Feb 20 10:12:50 crc kubenswrapper[4962]: I0220 10:12:50.453429 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 20 10:12:50 crc kubenswrapper[4962]: I0220 10:12:50.588832 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-wj9f6"] Feb 20 10:12:50 crc kubenswrapper[4962]: I0220 10:12:50.608624 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 20 10:12:50 crc kubenswrapper[4962]: I0220 10:12:50.628825 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-855cbc58c5-jg2pc"] Feb 20 10:12:50 crc kubenswrapper[4962]: I0220 10:12:50.635425 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-855cbc58c5-jg2pc"] Feb 20 10:12:50 crc kubenswrapper[4962]: I0220 10:12:50.748320 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 20 10:12:50 crc kubenswrapper[4962]: W0220 10:12:50.773796 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb7df7b95_a5ed_4e4e_81f0_9f718bab0bcc.slice/crio-13a063091fc03c82fcdfd2aec7fd6dc34c81370ecc1a377f3c146895e371bf9d WatchSource:0}: Error finding container 13a063091fc03c82fcdfd2aec7fd6dc34c81370ecc1a377f3c146895e371bf9d: Status 404 returned error can't find the container with id 13a063091fc03c82fcdfd2aec7fd6dc34c81370ecc1a377f3c146895e371bf9d Feb 20 10:12:50 crc kubenswrapper[4962]: I0220 10:12:50.803124 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6fcf94d689-stkxf" Feb 20 10:12:50 crc kubenswrapper[4962]: I0220 10:12:50.923869 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d99be0c7-0310-4fa4-9426-63be765a9e85-dns-svc\") pod \"d99be0c7-0310-4fa4-9426-63be765a9e85\" (UID: \"d99be0c7-0310-4fa4-9426-63be765a9e85\") " Feb 20 10:12:50 crc kubenswrapper[4962]: I0220 10:12:50.924410 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4qd6h\" (UniqueName: \"kubernetes.io/projected/d99be0c7-0310-4fa4-9426-63be765a9e85-kube-api-access-4qd6h\") pod \"d99be0c7-0310-4fa4-9426-63be765a9e85\" (UID: \"d99be0c7-0310-4fa4-9426-63be765a9e85\") " Feb 20 10:12:50 crc kubenswrapper[4962]: I0220 10:12:50.924462 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d99be0c7-0310-4fa4-9426-63be765a9e85-config\") pod \"d99be0c7-0310-4fa4-9426-63be765a9e85\" (UID: \"d99be0c7-0310-4fa4-9426-63be765a9e85\") " Feb 20 10:12:50 crc kubenswrapper[4962]: I0220 10:12:50.924473 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d99be0c7-0310-4fa4-9426-63be765a9e85-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d99be0c7-0310-4fa4-9426-63be765a9e85" (UID: "d99be0c7-0310-4fa4-9426-63be765a9e85"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:12:50 crc kubenswrapper[4962]: I0220 10:12:50.924893 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d99be0c7-0310-4fa4-9426-63be765a9e85-config" (OuterVolumeSpecName: "config") pod "d99be0c7-0310-4fa4-9426-63be765a9e85" (UID: "d99be0c7-0310-4fa4-9426-63be765a9e85"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:12:50 crc kubenswrapper[4962]: I0220 10:12:50.924942 4962 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d99be0c7-0310-4fa4-9426-63be765a9e85-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 20 10:12:51 crc kubenswrapper[4962]: I0220 10:12:51.739400 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d99be0c7-0310-4fa4-9426-63be765a9e85-config\") on node \"crc\" DevicePath \"\"" Feb 20 10:12:51 crc kubenswrapper[4962]: I0220 10:12:51.756055 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35f03c4f-de3b-4981-9e78-b8d1a1d171b5" path="/var/lib/kubelet/pods/35f03c4f-de3b-4981-9e78-b8d1a1d171b5/volumes" Feb 20 10:12:51 crc kubenswrapper[4962]: I0220 10:12:51.778660 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d99be0c7-0310-4fa4-9426-63be765a9e85-kube-api-access-4qd6h" (OuterVolumeSpecName: "kube-api-access-4qd6h") pod "d99be0c7-0310-4fa4-9426-63be765a9e85" (UID: "d99be0c7-0310-4fa4-9426-63be765a9e85"). InnerVolumeSpecName "kube-api-access-4qd6h". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:12:51 crc kubenswrapper[4962]: I0220 10:12:51.786703 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6fcf94d689-stkxf" Feb 20 10:12:51 crc kubenswrapper[4962]: I0220 10:12:51.823962 4962 kubelet_pods.go:2476] "Failed to reduce cpu time for pod pending volume cleanup" podUID="d99be0c7-0310-4fa4-9426-63be765a9e85" err="openat2 /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd99be0c7_0310_4fa4_9426_63be765a9e85.slice/cgroup.controllers: no such file or directory" Feb 20 10:12:51 crc kubenswrapper[4962]: I0220 10:12:51.824049 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"8dd889b7-1b72-4e57-ad0f-85facbad8da4","Type":"ContainerStarted","Data":"f6e6a97dcf3e2888aaf774e41bb7caae5d9537602046e6592fd534041d6392a2"} Feb 20 10:12:51 crc kubenswrapper[4962]: I0220 10:12:51.824076 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"8dd889b7-1b72-4e57-ad0f-85facbad8da4","Type":"ContainerStarted","Data":"d0a92b505f163c98c2579b38133407e2587dcd82e4a7d6302d1e3ca2e2112d68"} Feb 20 10:12:51 crc kubenswrapper[4962]: I0220 10:12:51.824086 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fcf94d689-stkxf" event={"ID":"d99be0c7-0310-4fa4-9426-63be765a9e85","Type":"ContainerDied","Data":"c8edb9d91df0b22ff9505de5c7b80ee2daed2d5e1bd99677fbcf778baf5be2bc"} Feb 20 10:12:51 crc kubenswrapper[4962]: I0220 10:12:51.824102 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-wj9f6" event={"ID":"383d4f1e-72b3-48ce-9427-0361c19e41fc","Type":"ContainerStarted","Data":"1ed5bd754fe42b78759f03224b6a39f1b92d8d484574e9a6557ab622debe2a23"} Feb 20 10:12:51 crc kubenswrapper[4962]: I0220 10:12:51.824115 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 20 10:12:51 crc kubenswrapper[4962]: I0220 10:12:51.844348 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4qd6h\" (UniqueName: \"kubernetes.io/projected/d99be0c7-0310-4fa4-9426-63be765a9e85-kube-api-access-4qd6h\") on node \"crc\" DevicePath \"\"" Feb 20 10:12:51 crc kubenswrapper[4962]: I0220 10:12:51.868678 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"6e766bfd-869d-43ca-bf11-cf4ec9fa253a","Type":"ContainerStarted","Data":"5738934c1190f3f4ebf6be3609b1f56189c1c53ad8ccc9348121e92913c3ec72"} Feb 20 10:12:51 crc kubenswrapper[4962]: I0220 10:12:51.874662 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"b22a9e86-ccdf-4505-8116-21b0230943fc","Type":"ContainerStarted","Data":"527bc0b9350edbbd23edfe05a933e12b44f8d4ad0c70495feffaffb9052c4070"} Feb 20 10:12:51 crc kubenswrapper[4962]: I0220 10:12:51.875772 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"b7df7b95-a5ed-4e4e-81f0-9f718bab0bcc","Type":"ContainerStarted","Data":"13a063091fc03c82fcdfd2aec7fd6dc34c81370ecc1a377f3c146895e371bf9d"} Feb 20 10:12:51 crc kubenswrapper[4962]: I0220 10:12:51.919672 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 20 10:12:51 crc kubenswrapper[4962]: I0220 10:12:51.993503 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6fcf94d689-stkxf"] Feb 20 10:12:52 crc kubenswrapper[4962]: I0220 10:12:52.007335 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6fcf94d689-stkxf"] Feb 20 10:12:52 crc kubenswrapper[4962]: 
I0220 10:12:52.810188 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-r7g9h"] Feb 20 10:12:52 crc kubenswrapper[4962]: I0220 10:12:52.887325 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"719faf26-7700-4eff-9dca-0a4ec3c51344","Type":"ContainerStarted","Data":"a2d2a8a63bf5c9ebd610b16b09ca46a05d03ae717f57b9ce876334d685870041"} Feb 20 10:12:52 crc kubenswrapper[4962]: I0220 10:12:52.889954 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"801fa82d-0f57-4af2-9eec-b6cddac658ab","Type":"ContainerStarted","Data":"2226a3425cb913ac33dc3114a16db2100facfc7423dff93548d53775b718e6e2"} Feb 20 10:12:53 crc kubenswrapper[4962]: I0220 10:12:53.152872 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d99be0c7-0310-4fa4-9426-63be765a9e85" path="/var/lib/kubelet/pods/d99be0c7-0310-4fa4-9426-63be765a9e85/volumes" Feb 20 10:12:53 crc kubenswrapper[4962]: I0220 10:12:53.902835 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-r7g9h" event={"ID":"8e8425d5-32be-4726-915a-3de5c70f0f62","Type":"ContainerStarted","Data":"b4d03ac8272f687d64246b8c3c40efcac57552a3657ef2ee1db4c3625f47035c"} Feb 20 10:12:54 crc kubenswrapper[4962]: I0220 10:12:54.914308 4962 generic.go:334] "Generic (PLEG): container finished" podID="6e766bfd-869d-43ca-bf11-cf4ec9fa253a" containerID="5738934c1190f3f4ebf6be3609b1f56189c1c53ad8ccc9348121e92913c3ec72" exitCode=0 Feb 20 10:12:54 crc kubenswrapper[4962]: I0220 10:12:54.914398 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"6e766bfd-869d-43ca-bf11-cf4ec9fa253a","Type":"ContainerDied","Data":"5738934c1190f3f4ebf6be3609b1f56189c1c53ad8ccc9348121e92913c3ec72"} Feb 20 10:12:55 crc kubenswrapper[4962]: I0220 10:12:55.924817 4962 generic.go:334] "Generic (PLEG): container finished" podID="8dd889b7-1b72-4e57-ad0f-85facbad8da4" containerID="f6e6a97dcf3e2888aaf774e41bb7caae5d9537602046e6592fd534041d6392a2" exitCode=0 Feb 20 10:12:55 crc kubenswrapper[4962]: I0220 10:12:55.925059 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"8dd889b7-1b72-4e57-ad0f-85facbad8da4","Type":"ContainerDied","Data":"f6e6a97dcf3e2888aaf774e41bb7caae5d9537602046e6592fd534041d6392a2"} Feb 20 10:12:57 crc kubenswrapper[4962]: I0220 10:12:57.949712 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"801fa82d-0f57-4af2-9eec-b6cddac658ab","Type":"ContainerStarted","Data":"b7ff4938197d4ffeb1d0dead4cb76392b4c2fbfcd796b8766f3dbd1e8efbaf48"} Feb 20 10:12:57 crc kubenswrapper[4962]: I0220 10:12:57.952575 4962 generic.go:334] "Generic (PLEG): container finished" podID="8e8425d5-32be-4726-915a-3de5c70f0f62" containerID="b6626b3616a8427737e8c790adcc57ad3f4d0385df8b472ffc49fd4bd021b003" exitCode=0 Feb 20 10:12:57 crc kubenswrapper[4962]: I0220 10:12:57.952724 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-r7g9h" event={"ID":"8e8425d5-32be-4726-915a-3de5c70f0f62","Type":"ContainerDied","Data":"b6626b3616a8427737e8c790adcc57ad3f4d0385df8b472ffc49fd4bd021b003"} Feb 20 10:12:57 crc kubenswrapper[4962]: I0220 10:12:57.956826 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"8dd889b7-1b72-4e57-ad0f-85facbad8da4","Type":"ContainerStarted","Data":"a0c7e79c3d9e295ee82e5ea9e8238010da77018553646accab9b41ab9dfe22b6"} Feb 20 10:12:57 crc kubenswrapper[4962]: I0220 10:12:57.958955 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-wj9f6" event={"ID":"383d4f1e-72b3-48ce-9427-0361c19e41fc","Type":"ContainerStarted","Data":"d6952143bea0c9abcddc4768b2bd10fcf02f0a555e5cd8d1c565a371744060b8"} Feb 20 10:12:57 crc kubenswrapper[4962]: I0220 10:12:57.959472 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-wj9f6" Feb 20 10:12:57 crc kubenswrapper[4962]: I0220 10:12:57.963152 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"719faf26-7700-4eff-9dca-0a4ec3c51344","Type":"ContainerStarted","Data":"9a823554a8f72450a8956f74b11a494798fb5f7fc99300ed38421760066cc712"} Feb 20 10:12:57 crc kubenswrapper[4962]: I0220 10:12:57.968167 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"6e766bfd-869d-43ca-bf11-cf4ec9fa253a","Type":"ContainerStarted","Data":"0ee4c6895eaf367e01ee1ab962d5fa0868b6b165760c399d39cc5c1615f1960b"} Feb 20 10:12:57 crc kubenswrapper[4962]: I0220 10:12:57.981315 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"b22a9e86-ccdf-4505-8116-21b0230943fc","Type":"ContainerStarted","Data":"2a752c83576acea3c58ce68803e2686311938e06421f4eea4dda081f9f3b8c54"} Feb 20 10:12:57 crc kubenswrapper[4962]: I0220 10:12:57.981417 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Feb 20 10:12:57 crc kubenswrapper[4962]: I0220 10:12:57.984433 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"b7df7b95-a5ed-4e4e-81f0-9f718bab0bcc","Type":"ContainerStarted","Data":"fb6cfde9cbec99e03a3f009355d709416019b4ef6eb2150c6b7e98f530e8b57a"} Feb 20 10:12:57 crc kubenswrapper[4962]: I0220 10:12:57.984584 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Feb 20 10:12:58 crc kubenswrapper[4962]: I0220 10:12:58.036888 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=16.679746766 podStartE2EDuration="33.036864101s" podCreationTimestamp="2026-02-20 10:12:25 +0000 UTC" firstStartedPulling="2026-02-20 10:12:33.695694656 +0000 UTC m=+1045.278166502" lastFinishedPulling="2026-02-20 10:12:50.052811981 +0000 UTC m=+1061.635283837" observedRunningTime="2026-02-20 10:12:58.034323042 +0000 UTC m=+1069.616794888" watchObservedRunningTime="2026-02-20 10:12:58.036864101 +0000 UTC m=+1069.619335947" Feb 20 10:12:58 crc kubenswrapper[4962]: I0220 10:12:58.038024 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=32.038017247 podStartE2EDuration="32.038017247s" podCreationTimestamp="2026-02-20 10:12:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:12:58.003311786 +0000 UTC m=+1069.585783672" watchObservedRunningTime="2026-02-20 10:12:58.038017247 +0000 UTC m=+1069.620489093" Feb 20 10:12:58 crc kubenswrapper[4962]: I0220 10:12:58.063979 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-wj9f6" 
podStartSLOduration=19.280038419 podStartE2EDuration="25.063944944s" podCreationTimestamp="2026-02-20 10:12:33 +0000 UTC" firstStartedPulling="2026-02-20 10:12:50.592826298 +0000 UTC m=+1062.175298164" lastFinishedPulling="2026-02-20 10:12:56.376732843 +0000 UTC m=+1067.959204689" observedRunningTime="2026-02-20 10:12:58.059106974 +0000 UTC m=+1069.641578830" watchObservedRunningTime="2026-02-20 10:12:58.063944944 +0000 UTC m=+1069.646416790" Feb 20 10:12:58 crc kubenswrapper[4962]: I0220 10:12:58.077309 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Feb 20 10:12:58 crc kubenswrapper[4962]: I0220 10:12:58.077350 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Feb 20 10:12:58 crc kubenswrapper[4962]: I0220 10:12:58.083755 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=21.721689741 podStartE2EDuration="28.083741521s" podCreationTimestamp="2026-02-20 10:12:30 +0000 UTC" firstStartedPulling="2026-02-20 10:12:50.780056258 +0000 UTC m=+1062.362528104" lastFinishedPulling="2026-02-20 10:12:57.142108008 +0000 UTC m=+1068.724579884" observedRunningTime="2026-02-20 10:12:58.081058587 +0000 UTC m=+1069.663530433" watchObservedRunningTime="2026-02-20 10:12:58.083741521 +0000 UTC m=+1069.666213367" Feb 20 10:12:58 crc kubenswrapper[4962]: I0220 10:12:58.111272 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=24.497626665 podStartE2EDuration="30.111238517s" podCreationTimestamp="2026-02-20 10:12:28 +0000 UTC" firstStartedPulling="2026-02-20 10:12:50.470239901 +0000 UTC m=+1062.052711747" lastFinishedPulling="2026-02-20 10:12:56.083851743 +0000 UTC m=+1067.666323599" observedRunningTime="2026-02-20 10:12:58.101938228 +0000 UTC m=+1069.684410094" watchObservedRunningTime="2026-02-20 10:12:58.111238517 +0000 UTC m=+1069.693710403" Feb 20 10:12:59 crc kubenswrapper[4962]: I0220 10:12:59.004524 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-r7g9h" event={"ID":"8e8425d5-32be-4726-915a-3de5c70f0f62","Type":"ContainerStarted","Data":"fbca6026ebd221992e1ebc24844b7bb1692f49e72896c063a823730a2cadaf38"} Feb 20 10:12:59 crc kubenswrapper[4962]: I0220 10:12:59.005686 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-r7g9h" Feb 20 10:12:59 crc kubenswrapper[4962]: I0220 10:12:59.006117 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-r7g9h" event={"ID":"8e8425d5-32be-4726-915a-3de5c70f0f62","Type":"ContainerStarted","Data":"0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2"} Feb 20 10:12:59 crc kubenswrapper[4962]: I0220 10:12:59.038198 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-r7g9h" podStartSLOduration=22.673265496 podStartE2EDuration="26.038169832s" podCreationTimestamp="2026-02-20 10:12:33 +0000 UTC" firstStartedPulling="2026-02-20 10:12:53.652646463 +0000 UTC m=+1065.235118309" lastFinishedPulling="2026-02-20 10:12:57.017550799 +0000 UTC m=+1068.600022645" observedRunningTime="2026-02-20 10:12:59.031587518 +0000 UTC m=+1070.614059374" watchObservedRunningTime="2026-02-20 10:12:59.038169832 +0000 UTC m=+1070.620641688" Feb 20 10:12:59 crc kubenswrapper[4962]: I0220 10:12:59.256844 4962 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/ovn-controller-ovs-r7g9h" Feb 20 10:13:00 crc kubenswrapper[4962]: I0220 10:13:00.018833 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"801fa82d-0f57-4af2-9eec-b6cddac658ab","Type":"ContainerStarted","Data":"e9de55d709a0309b4fcbcb74a44dfc77cc45f95d7066591c4a40dc2b0ceb9eed"} Feb 20 10:13:00 crc kubenswrapper[4962]: I0220 10:13:00.022876 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"719faf26-7700-4eff-9dca-0a4ec3c51344","Type":"ContainerStarted","Data":"a2580fff2ba1ecc29418d1a47b14ce5d8459c470e24eee4d2ebced1a648dc3a8"} Feb 20 10:13:00 crc kubenswrapper[4962]: I0220 10:13:00.095102 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=16.579625857 podStartE2EDuration="24.095065905s" podCreationTimestamp="2026-02-20 10:12:36 +0000 UTC" firstStartedPulling="2026-02-20 10:12:51.868226024 +0000 UTC m=+1063.450697870" lastFinishedPulling="2026-02-20 10:12:59.383666062 +0000 UTC m=+1070.966137918" observedRunningTime="2026-02-20 10:13:00.086304042 +0000 UTC m=+1071.668775918" watchObservedRunningTime="2026-02-20 10:13:00.095065905 +0000 UTC m=+1071.677537791" Feb 20 10:13:00 crc kubenswrapper[4962]: I0220 10:13:00.100891 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=20.68500445 podStartE2EDuration="28.100839345s" podCreationTimestamp="2026-02-20 10:12:32 +0000 UTC" firstStartedPulling="2026-02-20 10:12:51.961241891 +0000 UTC m=+1063.543713737" lastFinishedPulling="2026-02-20 10:12:59.377076786 +0000 UTC m=+1070.959548632" observedRunningTime="2026-02-20 10:13:00.049134665 +0000 UTC m=+1071.631606511" watchObservedRunningTime="2026-02-20 10:13:00.100839345 +0000 UTC m=+1071.683311221" Feb 20 10:13:01 crc kubenswrapper[4962]: I0220 10:13:01.324105 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Feb 20 10:13:01 crc kubenswrapper[4962]: I0220 10:13:01.372380 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.049535 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"2a8d652d-aea8-4a83-b33e-0d2522af0be8","Type":"ContainerStarted","Data":"1dd7b2604194fcf6002518bb647f90f19a0a23390f083313c0f1248bafe3c51e"} Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.053576 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"56a77dd3-ef10-46a6-a00d-ab38af0d4338","Type":"ContainerStarted","Data":"565584a6c8c851ef4d74b724c7d45c8dd9c73a6da0c33f9bfe51852abd444857"} Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.054124 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.122308 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.277652 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.379259 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Feb 20 10:13:02 
crc kubenswrapper[4962]: I0220 10:13:02.453171 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f54874ffc-2drvh"] Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.474677 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-k7csj"] Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.476206 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-k7csj" Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.478425 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.499691 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-k7csj"] Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.532685 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57bdd75c-mqxqd"] Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.534749 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57bdd75c-mqxqd" Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.545021 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.567056 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57bdd75c-mqxqd"] Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.628098 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/736ba007-2c6d-4f91-ae26-16ce53c580c5-dns-svc\") pod \"dnsmasq-dns-57bdd75c-mqxqd\" (UID: \"736ba007-2c6d-4f91-ae26-16ce53c580c5\") " pod="openstack/dnsmasq-dns-57bdd75c-mqxqd" Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.628166 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/88c21489-524e-4ee7-a340-5be2573af161-ovn-rundir\") pod \"ovn-controller-metrics-k7csj\" (UID: \"88c21489-524e-4ee7-a340-5be2573af161\") " pod="openstack/ovn-controller-metrics-k7csj" Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.628211 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88c21489-524e-4ee7-a340-5be2573af161-config\") pod \"ovn-controller-metrics-k7csj\" (UID: \"88c21489-524e-4ee7-a340-5be2573af161\") " pod="openstack/ovn-controller-metrics-k7csj" Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.628288 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/88c21489-524e-4ee7-a340-5be2573af161-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-k7csj\" (UID: \"88c21489-524e-4ee7-a340-5be2573af161\") " pod="openstack/ovn-controller-metrics-k7csj" Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.628327 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxgrm\" (UniqueName: \"kubernetes.io/projected/88c21489-524e-4ee7-a340-5be2573af161-kube-api-access-wxgrm\") pod \"ovn-controller-metrics-k7csj\" (UID: \"88c21489-524e-4ee7-a340-5be2573af161\") " pod="openstack/ovn-controller-metrics-k7csj" Feb 20 10:13:02 crc 
kubenswrapper[4962]: I0220 10:13:02.628374 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/88c21489-524e-4ee7-a340-5be2573af161-ovs-rundir\") pod \"ovn-controller-metrics-k7csj\" (UID: \"88c21489-524e-4ee7-a340-5be2573af161\") " pod="openstack/ovn-controller-metrics-k7csj" Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.628418 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88c21489-524e-4ee7-a340-5be2573af161-combined-ca-bundle\") pod \"ovn-controller-metrics-k7csj\" (UID: \"88c21489-524e-4ee7-a340-5be2573af161\") " pod="openstack/ovn-controller-metrics-k7csj" Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.628439 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cn4cm\" (UniqueName: \"kubernetes.io/projected/736ba007-2c6d-4f91-ae26-16ce53c580c5-kube-api-access-cn4cm\") pod \"dnsmasq-dns-57bdd75c-mqxqd\" (UID: \"736ba007-2c6d-4f91-ae26-16ce53c580c5\") " pod="openstack/dnsmasq-dns-57bdd75c-mqxqd" Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.628456 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/736ba007-2c6d-4f91-ae26-16ce53c580c5-config\") pod \"dnsmasq-dns-57bdd75c-mqxqd\" (UID: \"736ba007-2c6d-4f91-ae26-16ce53c580c5\") " pod="openstack/dnsmasq-dns-57bdd75c-mqxqd" Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.628483 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/736ba007-2c6d-4f91-ae26-16ce53c580c5-ovsdbserver-nb\") pod \"dnsmasq-dns-57bdd75c-mqxqd\" (UID: \"736ba007-2c6d-4f91-ae26-16ce53c580c5\") " pod="openstack/dnsmasq-dns-57bdd75c-mqxqd" Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.730637 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/736ba007-2c6d-4f91-ae26-16ce53c580c5-ovsdbserver-nb\") pod \"dnsmasq-dns-57bdd75c-mqxqd\" (UID: \"736ba007-2c6d-4f91-ae26-16ce53c580c5\") " pod="openstack/dnsmasq-dns-57bdd75c-mqxqd" Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.730717 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/736ba007-2c6d-4f91-ae26-16ce53c580c5-dns-svc\") pod \"dnsmasq-dns-57bdd75c-mqxqd\" (UID: \"736ba007-2c6d-4f91-ae26-16ce53c580c5\") " pod="openstack/dnsmasq-dns-57bdd75c-mqxqd" Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.730766 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/88c21489-524e-4ee7-a340-5be2573af161-ovn-rundir\") pod \"ovn-controller-metrics-k7csj\" (UID: \"88c21489-524e-4ee7-a340-5be2573af161\") " pod="openstack/ovn-controller-metrics-k7csj" Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.730797 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88c21489-524e-4ee7-a340-5be2573af161-config\") pod \"ovn-controller-metrics-k7csj\" (UID: \"88c21489-524e-4ee7-a340-5be2573af161\") " pod="openstack/ovn-controller-metrics-k7csj" Feb 20 10:13:02 crc kubenswrapper[4962]: 
I0220 10:13:02.730849 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/88c21489-524e-4ee7-a340-5be2573af161-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-k7csj\" (UID: \"88c21489-524e-4ee7-a340-5be2573af161\") " pod="openstack/ovn-controller-metrics-k7csj" Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.730891 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxgrm\" (UniqueName: \"kubernetes.io/projected/88c21489-524e-4ee7-a340-5be2573af161-kube-api-access-wxgrm\") pod \"ovn-controller-metrics-k7csj\" (UID: \"88c21489-524e-4ee7-a340-5be2573af161\") " pod="openstack/ovn-controller-metrics-k7csj" Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.730928 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/88c21489-524e-4ee7-a340-5be2573af161-ovs-rundir\") pod \"ovn-controller-metrics-k7csj\" (UID: \"88c21489-524e-4ee7-a340-5be2573af161\") " pod="openstack/ovn-controller-metrics-k7csj" Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.730992 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88c21489-524e-4ee7-a340-5be2573af161-combined-ca-bundle\") pod \"ovn-controller-metrics-k7csj\" (UID: \"88c21489-524e-4ee7-a340-5be2573af161\") " pod="openstack/ovn-controller-metrics-k7csj" Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.731015 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cn4cm\" (UniqueName: \"kubernetes.io/projected/736ba007-2c6d-4f91-ae26-16ce53c580c5-kube-api-access-cn4cm\") pod \"dnsmasq-dns-57bdd75c-mqxqd\" (UID: \"736ba007-2c6d-4f91-ae26-16ce53c580c5\") " pod="openstack/dnsmasq-dns-57bdd75c-mqxqd" Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.731043 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/736ba007-2c6d-4f91-ae26-16ce53c580c5-config\") pod \"dnsmasq-dns-57bdd75c-mqxqd\" (UID: \"736ba007-2c6d-4f91-ae26-16ce53c580c5\") " pod="openstack/dnsmasq-dns-57bdd75c-mqxqd" Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.731338 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/88c21489-524e-4ee7-a340-5be2573af161-ovn-rundir\") pod \"ovn-controller-metrics-k7csj\" (UID: \"88c21489-524e-4ee7-a340-5be2573af161\") " pod="openstack/ovn-controller-metrics-k7csj" Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.731887 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/736ba007-2c6d-4f91-ae26-16ce53c580c5-ovsdbserver-nb\") pod \"dnsmasq-dns-57bdd75c-mqxqd\" (UID: \"736ba007-2c6d-4f91-ae26-16ce53c580c5\") " pod="openstack/dnsmasq-dns-57bdd75c-mqxqd" Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.731915 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/88c21489-524e-4ee7-a340-5be2573af161-ovs-rundir\") pod \"ovn-controller-metrics-k7csj\" (UID: \"88c21489-524e-4ee7-a340-5be2573af161\") " pod="openstack/ovn-controller-metrics-k7csj" Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.732259 4962 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88c21489-524e-4ee7-a340-5be2573af161-config\") pod \"ovn-controller-metrics-k7csj\" (UID: \"88c21489-524e-4ee7-a340-5be2573af161\") " pod="openstack/ovn-controller-metrics-k7csj" Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.738223 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/88c21489-524e-4ee7-a340-5be2573af161-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-k7csj\" (UID: \"88c21489-524e-4ee7-a340-5be2573af161\") " pod="openstack/ovn-controller-metrics-k7csj" Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.739930 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/736ba007-2c6d-4f91-ae26-16ce53c580c5-config\") pod \"dnsmasq-dns-57bdd75c-mqxqd\" (UID: \"736ba007-2c6d-4f91-ae26-16ce53c580c5\") " pod="openstack/dnsmasq-dns-57bdd75c-mqxqd" Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.740206 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88c21489-524e-4ee7-a340-5be2573af161-combined-ca-bundle\") pod \"ovn-controller-metrics-k7csj\" (UID: \"88c21489-524e-4ee7-a340-5be2573af161\") " pod="openstack/ovn-controller-metrics-k7csj" Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.741581 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/736ba007-2c6d-4f91-ae26-16ce53c580c5-dns-svc\") pod \"dnsmasq-dns-57bdd75c-mqxqd\" (UID: \"736ba007-2c6d-4f91-ae26-16ce53c580c5\") " pod="openstack/dnsmasq-dns-57bdd75c-mqxqd" Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.749991 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxgrm\" (UniqueName: \"kubernetes.io/projected/88c21489-524e-4ee7-a340-5be2573af161-kube-api-access-wxgrm\") pod \"ovn-controller-metrics-k7csj\" (UID: \"88c21489-524e-4ee7-a340-5be2573af161\") " pod="openstack/ovn-controller-metrics-k7csj" Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.750717 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cn4cm\" (UniqueName: \"kubernetes.io/projected/736ba007-2c6d-4f91-ae26-16ce53c580c5-kube-api-access-cn4cm\") pod \"dnsmasq-dns-57bdd75c-mqxqd\" (UID: \"736ba007-2c6d-4f91-ae26-16ce53c580c5\") " pod="openstack/dnsmasq-dns-57bdd75c-mqxqd" Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.829931 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-k7csj" Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.844678 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f54874ffc-2drvh" Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.858719 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57bdd75c-mqxqd" Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.939819 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01d0cdce-fd47-471a-94af-ee68fed6a2aa-config\") pod \"01d0cdce-fd47-471a-94af-ee68fed6a2aa\" (UID: \"01d0cdce-fd47-471a-94af-ee68fed6a2aa\") " Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.939947 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5dbrr\" (UniqueName: \"kubernetes.io/projected/01d0cdce-fd47-471a-94af-ee68fed6a2aa-kube-api-access-5dbrr\") pod \"01d0cdce-fd47-471a-94af-ee68fed6a2aa\" (UID: \"01d0cdce-fd47-471a-94af-ee68fed6a2aa\") " Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.939982 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/01d0cdce-fd47-471a-94af-ee68fed6a2aa-dns-svc\") pod \"01d0cdce-fd47-471a-94af-ee68fed6a2aa\" (UID: \"01d0cdce-fd47-471a-94af-ee68fed6a2aa\") " Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.940937 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01d0cdce-fd47-471a-94af-ee68fed6a2aa-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "01d0cdce-fd47-471a-94af-ee68fed6a2aa" (UID: "01d0cdce-fd47-471a-94af-ee68fed6a2aa"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.941350 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01d0cdce-fd47-471a-94af-ee68fed6a2aa-config" (OuterVolumeSpecName: "config") pod "01d0cdce-fd47-471a-94af-ee68fed6a2aa" (UID: "01d0cdce-fd47-471a-94af-ee68fed6a2aa"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.953177 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01d0cdce-fd47-471a-94af-ee68fed6a2aa-kube-api-access-5dbrr" (OuterVolumeSpecName: "kube-api-access-5dbrr") pod "01d0cdce-fd47-471a-94af-ee68fed6a2aa" (UID: "01d0cdce-fd47-471a-94af-ee68fed6a2aa"). InnerVolumeSpecName "kube-api-access-5dbrr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.962967 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67ff45466c-svjlj"] Feb 20 10:13:02 crc kubenswrapper[4962]: I0220 10:13:02.996700 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-75b7bcc64f-ms8hz"] Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.006034 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75b7bcc64f-ms8hz" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.009619 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.049345 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75b7bcc64f-ms8hz"] Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.052051 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01d0cdce-fd47-471a-94af-ee68fed6a2aa-config\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.052097 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5dbrr\" (UniqueName: \"kubernetes.io/projected/01d0cdce-fd47-471a-94af-ee68fed6a2aa-kube-api-access-5dbrr\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.052113 4962 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/01d0cdce-fd47-471a-94af-ee68fed6a2aa-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.101074 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f54874ffc-2drvh" event={"ID":"01d0cdce-fd47-471a-94af-ee68fed6a2aa","Type":"ContainerDied","Data":"6ee62349849ee2a01e9e7674d3fdcbef155f78a8a88598da3702e8fea9005811"} Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.101171 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f54874ffc-2drvh" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.102001 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.153322 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/764e4dd0-33ca-4ee6-88f3-b981dd49a5b5-ovsdbserver-sb\") pod \"dnsmasq-dns-75b7bcc64f-ms8hz\" (UID: \"764e4dd0-33ca-4ee6-88f3-b981dd49a5b5\") " pod="openstack/dnsmasq-dns-75b7bcc64f-ms8hz" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.153421 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/764e4dd0-33ca-4ee6-88f3-b981dd49a5b5-ovsdbserver-nb\") pod \"dnsmasq-dns-75b7bcc64f-ms8hz\" (UID: \"764e4dd0-33ca-4ee6-88f3-b981dd49a5b5\") " pod="openstack/dnsmasq-dns-75b7bcc64f-ms8hz" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.153455 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/764e4dd0-33ca-4ee6-88f3-b981dd49a5b5-config\") pod \"dnsmasq-dns-75b7bcc64f-ms8hz\" (UID: \"764e4dd0-33ca-4ee6-88f3-b981dd49a5b5\") " pod="openstack/dnsmasq-dns-75b7bcc64f-ms8hz" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.153501 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxq9l\" (UniqueName: \"kubernetes.io/projected/764e4dd0-33ca-4ee6-88f3-b981dd49a5b5-kube-api-access-fxq9l\") pod \"dnsmasq-dns-75b7bcc64f-ms8hz\" (UID: \"764e4dd0-33ca-4ee6-88f3-b981dd49a5b5\") " pod="openstack/dnsmasq-dns-75b7bcc64f-ms8hz" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.153532 4962 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/764e4dd0-33ca-4ee6-88f3-b981dd49a5b5-dns-svc\") pod \"dnsmasq-dns-75b7bcc64f-ms8hz\" (UID: \"764e4dd0-33ca-4ee6-88f3-b981dd49a5b5\") " pod="openstack/dnsmasq-dns-75b7bcc64f-ms8hz" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.158822 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.212769 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f54874ffc-2drvh"] Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.219706 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-f54874ffc-2drvh"] Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.254873 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxq9l\" (UniqueName: \"kubernetes.io/projected/764e4dd0-33ca-4ee6-88f3-b981dd49a5b5-kube-api-access-fxq9l\") pod \"dnsmasq-dns-75b7bcc64f-ms8hz\" (UID: \"764e4dd0-33ca-4ee6-88f3-b981dd49a5b5\") " pod="openstack/dnsmasq-dns-75b7bcc64f-ms8hz" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.254968 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/764e4dd0-33ca-4ee6-88f3-b981dd49a5b5-dns-svc\") pod \"dnsmasq-dns-75b7bcc64f-ms8hz\" (UID: \"764e4dd0-33ca-4ee6-88f3-b981dd49a5b5\") " pod="openstack/dnsmasq-dns-75b7bcc64f-ms8hz" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.255076 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/764e4dd0-33ca-4ee6-88f3-b981dd49a5b5-ovsdbserver-sb\") pod \"dnsmasq-dns-75b7bcc64f-ms8hz\" (UID: \"764e4dd0-33ca-4ee6-88f3-b981dd49a5b5\") " pod="openstack/dnsmasq-dns-75b7bcc64f-ms8hz" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.255131 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/764e4dd0-33ca-4ee6-88f3-b981dd49a5b5-ovsdbserver-nb\") pod \"dnsmasq-dns-75b7bcc64f-ms8hz\" (UID: \"764e4dd0-33ca-4ee6-88f3-b981dd49a5b5\") " pod="openstack/dnsmasq-dns-75b7bcc64f-ms8hz" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.255181 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/764e4dd0-33ca-4ee6-88f3-b981dd49a5b5-config\") pod \"dnsmasq-dns-75b7bcc64f-ms8hz\" (UID: \"764e4dd0-33ca-4ee6-88f3-b981dd49a5b5\") " pod="openstack/dnsmasq-dns-75b7bcc64f-ms8hz" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.258326 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/764e4dd0-33ca-4ee6-88f3-b981dd49a5b5-dns-svc\") pod \"dnsmasq-dns-75b7bcc64f-ms8hz\" (UID: \"764e4dd0-33ca-4ee6-88f3-b981dd49a5b5\") " pod="openstack/dnsmasq-dns-75b7bcc64f-ms8hz" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.258876 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/764e4dd0-33ca-4ee6-88f3-b981dd49a5b5-ovsdbserver-nb\") pod \"dnsmasq-dns-75b7bcc64f-ms8hz\" (UID: \"764e4dd0-33ca-4ee6-88f3-b981dd49a5b5\") " pod="openstack/dnsmasq-dns-75b7bcc64f-ms8hz" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 
10:13:03.259123 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/764e4dd0-33ca-4ee6-88f3-b981dd49a5b5-ovsdbserver-sb\") pod \"dnsmasq-dns-75b7bcc64f-ms8hz\" (UID: \"764e4dd0-33ca-4ee6-88f3-b981dd49a5b5\") " pod="openstack/dnsmasq-dns-75b7bcc64f-ms8hz" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.259724 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/764e4dd0-33ca-4ee6-88f3-b981dd49a5b5-config\") pod \"dnsmasq-dns-75b7bcc64f-ms8hz\" (UID: \"764e4dd0-33ca-4ee6-88f3-b981dd49a5b5\") " pod="openstack/dnsmasq-dns-75b7bcc64f-ms8hz" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.280937 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxq9l\" (UniqueName: \"kubernetes.io/projected/764e4dd0-33ca-4ee6-88f3-b981dd49a5b5-kube-api-access-fxq9l\") pod \"dnsmasq-dns-75b7bcc64f-ms8hz\" (UID: \"764e4dd0-33ca-4ee6-88f3-b981dd49a5b5\") " pod="openstack/dnsmasq-dns-75b7bcc64f-ms8hz" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.396782 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75b7bcc64f-ms8hz" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.414300 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67ff45466c-svjlj" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.421544 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.465682 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57bdd75c-mqxqd"] Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.488382 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.497469 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.504742 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.505458 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.505584 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-knfns" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.505711 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.527397 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 20 10:13:03 crc kubenswrapper[4962]: W0220 10:13:03.537930 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod88c21489_524e_4ee7_a340_5be2573af161.slice/crio-f6763fa902e28879cc4359d1b1acc4ff238f733e4bd8236ae411565bdfb3ac57 WatchSource:0}: Error finding container f6763fa902e28879cc4359d1b1acc4ff238f733e4bd8236ae411565bdfb3ac57: Status 404 returned error can't find the container with id f6763fa902e28879cc4359d1b1acc4ff238f733e4bd8236ae411565bdfb3ac57 Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.557346 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-k7csj"] Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.567476 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b061854a-f0c6-4754-a947-a7d5408f25db-dns-svc\") pod \"b061854a-f0c6-4754-a947-a7d5408f25db\" (UID: \"b061854a-f0c6-4754-a947-a7d5408f25db\") " Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.567528 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b061854a-f0c6-4754-a947-a7d5408f25db-config\") pod \"b061854a-f0c6-4754-a947-a7d5408f25db\" (UID: \"b061854a-f0c6-4754-a947-a7d5408f25db\") " Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.567600 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-prlr9\" (UniqueName: \"kubernetes.io/projected/b061854a-f0c6-4754-a947-a7d5408f25db-kube-api-access-prlr9\") pod \"b061854a-f0c6-4754-a947-a7d5408f25db\" (UID: \"b061854a-f0c6-4754-a947-a7d5408f25db\") " Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.567960 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33d73a04-08b2-4944-861f-749a63c2565d-config\") pod \"ovn-northd-0\" (UID: \"33d73a04-08b2-4944-861f-749a63c2565d\") " pod="openstack/ovn-northd-0" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.568045 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33d73a04-08b2-4944-861f-749a63c2565d-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"33d73a04-08b2-4944-861f-749a63c2565d\") " pod="openstack/ovn-northd-0" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.568107 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/33d73a04-08b2-4944-861f-749a63c2565d-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"33d73a04-08b2-4944-861f-749a63c2565d\") " pod="openstack/ovn-northd-0" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.568157 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/33d73a04-08b2-4944-861f-749a63c2565d-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"33d73a04-08b2-4944-861f-749a63c2565d\") " pod="openstack/ovn-northd-0" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.568189 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/33d73a04-08b2-4944-861f-749a63c2565d-scripts\") pod \"ovn-northd-0\" (UID: \"33d73a04-08b2-4944-861f-749a63c2565d\") " pod="openstack/ovn-northd-0" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.568217 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/33d73a04-08b2-4944-861f-749a63c2565d-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"33d73a04-08b2-4944-861f-749a63c2565d\") " pod="openstack/ovn-northd-0" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.568236 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r87b2\" (UniqueName: \"kubernetes.io/projected/33d73a04-08b2-4944-861f-749a63c2565d-kube-api-access-r87b2\") pod \"ovn-northd-0\" (UID: \"33d73a04-08b2-4944-861f-749a63c2565d\") " pod="openstack/ovn-northd-0" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.568288 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b061854a-f0c6-4754-a947-a7d5408f25db-config" (OuterVolumeSpecName: "config") pod "b061854a-f0c6-4754-a947-a7d5408f25db" (UID: "b061854a-f0c6-4754-a947-a7d5408f25db"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.572917 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b061854a-f0c6-4754-a947-a7d5408f25db-kube-api-access-prlr9" (OuterVolumeSpecName: "kube-api-access-prlr9") pod "b061854a-f0c6-4754-a947-a7d5408f25db" (UID: "b061854a-f0c6-4754-a947-a7d5408f25db"). InnerVolumeSpecName "kube-api-access-prlr9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.573176 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b061854a-f0c6-4754-a947-a7d5408f25db-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b061854a-f0c6-4754-a947-a7d5408f25db" (UID: "b061854a-f0c6-4754-a947-a7d5408f25db"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.670481 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/33d73a04-08b2-4944-861f-749a63c2565d-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"33d73a04-08b2-4944-861f-749a63c2565d\") " pod="openstack/ovn-northd-0" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.670564 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/33d73a04-08b2-4944-861f-749a63c2565d-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"33d73a04-08b2-4944-861f-749a63c2565d\") " pod="openstack/ovn-northd-0" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.670635 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/33d73a04-08b2-4944-861f-749a63c2565d-scripts\") pod \"ovn-northd-0\" (UID: \"33d73a04-08b2-4944-861f-749a63c2565d\") " pod="openstack/ovn-northd-0" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.670670 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/33d73a04-08b2-4944-861f-749a63c2565d-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"33d73a04-08b2-4944-861f-749a63c2565d\") " pod="openstack/ovn-northd-0" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.670695 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r87b2\" (UniqueName: \"kubernetes.io/projected/33d73a04-08b2-4944-861f-749a63c2565d-kube-api-access-r87b2\") pod \"ovn-northd-0\" (UID: \"33d73a04-08b2-4944-861f-749a63c2565d\") " pod="openstack/ovn-northd-0" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.670744 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33d73a04-08b2-4944-861f-749a63c2565d-config\") pod \"ovn-northd-0\" (UID: \"33d73a04-08b2-4944-861f-749a63c2565d\") " pod="openstack/ovn-northd-0" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.670795 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33d73a04-08b2-4944-861f-749a63c2565d-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"33d73a04-08b2-4944-861f-749a63c2565d\") " pod="openstack/ovn-northd-0" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.670877 4962 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b061854a-f0c6-4754-a947-a7d5408f25db-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.670893 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b061854a-f0c6-4754-a947-a7d5408f25db-config\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.670905 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-prlr9\" (UniqueName: \"kubernetes.io/projected/b061854a-f0c6-4754-a947-a7d5408f25db-kube-api-access-prlr9\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.673078 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/33d73a04-08b2-4944-861f-749a63c2565d-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"33d73a04-08b2-4944-861f-749a63c2565d\") " pod="openstack/ovn-northd-0" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.673549 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/33d73a04-08b2-4944-861f-749a63c2565d-scripts\") pod \"ovn-northd-0\" (UID: \"33d73a04-08b2-4944-861f-749a63c2565d\") " pod="openstack/ovn-northd-0" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.676500 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33d73a04-08b2-4944-861f-749a63c2565d-config\") pod \"ovn-northd-0\" (UID: \"33d73a04-08b2-4944-861f-749a63c2565d\") " pod="openstack/ovn-northd-0" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.686613 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33d73a04-08b2-4944-861f-749a63c2565d-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"33d73a04-08b2-4944-861f-749a63c2565d\") " pod="openstack/ovn-northd-0" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.686984 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/33d73a04-08b2-4944-861f-749a63c2565d-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"33d73a04-08b2-4944-861f-749a63c2565d\") " pod="openstack/ovn-northd-0" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.690057 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/33d73a04-08b2-4944-861f-749a63c2565d-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"33d73a04-08b2-4944-861f-749a63c2565d\") " pod="openstack/ovn-northd-0" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.693095 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r87b2\" (UniqueName: \"kubernetes.io/projected/33d73a04-08b2-4944-861f-749a63c2565d-kube-api-access-r87b2\") pod \"ovn-northd-0\" (UID: \"33d73a04-08b2-4944-861f-749a63c2565d\") " pod="openstack/ovn-northd-0" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.833005 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Feb 20 10:13:03 crc kubenswrapper[4962]: I0220 10:13:03.959435 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75b7bcc64f-ms8hz"] Feb 20 10:13:04 crc kubenswrapper[4962]: I0220 10:13:04.105434 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75b7bcc64f-ms8hz" event={"ID":"764e4dd0-33ca-4ee6-88f3-b981dd49a5b5","Type":"ContainerStarted","Data":"5a4113e8006a84520a74694b80780b48a9159ec2ba04b9aa6174205d45e900e7"} Feb 20 10:13:04 crc kubenswrapper[4962]: I0220 10:13:04.107393 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57bdd75c-mqxqd" event={"ID":"736ba007-2c6d-4f91-ae26-16ce53c580c5","Type":"ContainerStarted","Data":"bdc0784e8ac6a8e38cc361b433d0c6167f165dee537a2968d10b45106c2fa62c"} Feb 20 10:13:04 crc kubenswrapper[4962]: I0220 10:13:04.108707 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67ff45466c-svjlj" event={"ID":"b061854a-f0c6-4754-a947-a7d5408f25db","Type":"ContainerDied","Data":"2cc17bab5a42964e336fc25e4f9d43353486628812f1bad5c2e3d1b82435adbf"} Feb 20 10:13:04 crc kubenswrapper[4962]: I0220 10:13:04.108760 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67ff45466c-svjlj" Feb 20 10:13:04 crc kubenswrapper[4962]: I0220 10:13:04.110957 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-k7csj" event={"ID":"88c21489-524e-4ee7-a340-5be2573af161","Type":"ContainerStarted","Data":"c9e1c05611f8961e024087e0e04491e46e765acba8a5cc8a2a36a27876de28c3"} Feb 20 10:13:04 crc kubenswrapper[4962]: I0220 10:13:04.110982 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-k7csj" event={"ID":"88c21489-524e-4ee7-a340-5be2573af161","Type":"ContainerStarted","Data":"f6763fa902e28879cc4359d1b1acc4ff238f733e4bd8236ae411565bdfb3ac57"} Feb 20 10:13:04 crc kubenswrapper[4962]: I0220 10:13:04.131629 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-k7csj" podStartSLOduration=2.131563135 podStartE2EDuration="2.131563135s" podCreationTimestamp="2026-02-20 10:13:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:13:04.130498601 +0000 UTC m=+1075.712970447" watchObservedRunningTime="2026-02-20 10:13:04.131563135 +0000 UTC m=+1075.714034981" Feb 20 10:13:04 crc kubenswrapper[4962]: I0220 10:13:04.183946 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67ff45466c-svjlj"] Feb 20 10:13:04 crc kubenswrapper[4962]: I0220 10:13:04.189746 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-67ff45466c-svjlj"] Feb 20 10:13:04 crc kubenswrapper[4962]: I0220 10:13:04.335832 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 20 10:13:04 crc kubenswrapper[4962]: W0220 10:13:04.349613 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod33d73a04_08b2_4944_861f_749a63c2565d.slice/crio-5c36b8026a940c293c08dfca1df88e2b23028519f85595786058a0396a1ade5b WatchSource:0}: Error finding container 5c36b8026a940c293c08dfca1df88e2b23028519f85595786058a0396a1ade5b: Status 404 returned error can't find the container with id 
5c36b8026a940c293c08dfca1df88e2b23028519f85595786058a0396a1ade5b Feb 20 10:13:04 crc kubenswrapper[4962]: I0220 10:13:04.738271 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Feb 20 10:13:04 crc kubenswrapper[4962]: I0220 10:13:04.874877 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Feb 20 10:13:05 crc kubenswrapper[4962]: I0220 10:13:05.125561 4962 generic.go:334] "Generic (PLEG): container finished" podID="736ba007-2c6d-4f91-ae26-16ce53c580c5" containerID="8b1c406d73d48e9a02cd34de0f3b729ea2c65e81565abd3265c085cac257a091" exitCode=0 Feb 20 10:13:05 crc kubenswrapper[4962]: I0220 10:13:05.125747 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57bdd75c-mqxqd" event={"ID":"736ba007-2c6d-4f91-ae26-16ce53c580c5","Type":"ContainerDied","Data":"8b1c406d73d48e9a02cd34de0f3b729ea2c65e81565abd3265c085cac257a091"} Feb 20 10:13:05 crc kubenswrapper[4962]: I0220 10:13:05.128935 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"33d73a04-08b2-4944-861f-749a63c2565d","Type":"ContainerStarted","Data":"5c36b8026a940c293c08dfca1df88e2b23028519f85595786058a0396a1ade5b"} Feb 20 10:13:05 crc kubenswrapper[4962]: I0220 10:13:05.131608 4962 generic.go:334] "Generic (PLEG): container finished" podID="764e4dd0-33ca-4ee6-88f3-b981dd49a5b5" containerID="d8910200dbf0649ee3137635f975af6912b17c314aea8ebff32d3a3379bf2ceb" exitCode=0 Feb 20 10:13:05 crc kubenswrapper[4962]: I0220 10:13:05.131681 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75b7bcc64f-ms8hz" event={"ID":"764e4dd0-33ca-4ee6-88f3-b981dd49a5b5","Type":"ContainerDied","Data":"d8910200dbf0649ee3137635f975af6912b17c314aea8ebff32d3a3379bf2ceb"} Feb 20 10:13:05 crc kubenswrapper[4962]: I0220 10:13:05.175394 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01d0cdce-fd47-471a-94af-ee68fed6a2aa" path="/var/lib/kubelet/pods/01d0cdce-fd47-471a-94af-ee68fed6a2aa/volumes" Feb 20 10:13:05 crc kubenswrapper[4962]: I0220 10:13:05.176539 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b061854a-f0c6-4754-a947-a7d5408f25db" path="/var/lib/kubelet/pods/b061854a-f0c6-4754-a947-a7d5408f25db/volumes" Feb 20 10:13:06 crc kubenswrapper[4962]: I0220 10:13:06.142776 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"33d73a04-08b2-4944-861f-749a63c2565d","Type":"ContainerStarted","Data":"0053432ef3fdc770bbcfaedc758ae1d1941eb3f0d4d0ebcb6d983082d7938453"} Feb 20 10:13:06 crc kubenswrapper[4962]: I0220 10:13:06.143178 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"33d73a04-08b2-4944-861f-749a63c2565d","Type":"ContainerStarted","Data":"095ea16654e1756b3ffb7fcf3eb9dc6ba35b4333c92bf90d3619d8cb9c0062fe"} Feb 20 10:13:06 crc kubenswrapper[4962]: I0220 10:13:06.143201 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Feb 20 10:13:06 crc kubenswrapper[4962]: I0220 10:13:06.145347 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75b7bcc64f-ms8hz" event={"ID":"764e4dd0-33ca-4ee6-88f3-b981dd49a5b5","Type":"ContainerStarted","Data":"5fa35181b36d549c92399e4d9df6d6ad79f748ec903177e753c1eff619aaec80"} Feb 20 10:13:06 crc kubenswrapper[4962]: I0220 10:13:06.145481 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/dnsmasq-dns-75b7bcc64f-ms8hz" Feb 20 10:13:06 crc kubenswrapper[4962]: I0220 10:13:06.147894 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57bdd75c-mqxqd" event={"ID":"736ba007-2c6d-4f91-ae26-16ce53c580c5","Type":"ContainerStarted","Data":"060fcc52874b30c1462aa08598659a751fb930b9fa898ee5d55f34214c6442bd"} Feb 20 10:13:06 crc kubenswrapper[4962]: I0220 10:13:06.148396 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57bdd75c-mqxqd" Feb 20 10:13:06 crc kubenswrapper[4962]: I0220 10:13:06.178814 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=1.97981958 podStartE2EDuration="3.178782077s" podCreationTimestamp="2026-02-20 10:13:03 +0000 UTC" firstStartedPulling="2026-02-20 10:13:04.355904721 +0000 UTC m=+1075.938376587" lastFinishedPulling="2026-02-20 10:13:05.554867228 +0000 UTC m=+1077.137339084" observedRunningTime="2026-02-20 10:13:06.174874025 +0000 UTC m=+1077.757345901" watchObservedRunningTime="2026-02-20 10:13:06.178782077 +0000 UTC m=+1077.761253933" Feb 20 10:13:06 crc kubenswrapper[4962]: I0220 10:13:06.204161 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57bdd75c-mqxqd" podStartSLOduration=3.247353962 podStartE2EDuration="4.204138097s" podCreationTimestamp="2026-02-20 10:13:02 +0000 UTC" firstStartedPulling="2026-02-20 10:13:03.476801295 +0000 UTC m=+1075.059273141" lastFinishedPulling="2026-02-20 10:13:04.43358541 +0000 UTC m=+1076.016057276" observedRunningTime="2026-02-20 10:13:06.202495275 +0000 UTC m=+1077.784967131" watchObservedRunningTime="2026-02-20 10:13:06.204138097 +0000 UTC m=+1077.786609953" Feb 20 10:13:06 crc kubenswrapper[4962]: I0220 10:13:06.228032 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-75b7bcc64f-ms8hz" podStartSLOduration=3.6711040280000002 podStartE2EDuration="4.22800569s" podCreationTimestamp="2026-02-20 10:13:02 +0000 UTC" firstStartedPulling="2026-02-20 10:13:03.978682765 +0000 UTC m=+1075.561154611" lastFinishedPulling="2026-02-20 10:13:04.535584427 +0000 UTC m=+1076.118056273" observedRunningTime="2026-02-20 10:13:06.22255438 +0000 UTC m=+1077.805026236" watchObservedRunningTime="2026-02-20 10:13:06.22800569 +0000 UTC m=+1077.810477536" Feb 20 10:13:06 crc kubenswrapper[4962]: I0220 10:13:06.730317 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Feb 20 10:13:06 crc kubenswrapper[4962]: I0220 10:13:06.730926 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Feb 20 10:13:06 crc kubenswrapper[4962]: I0220 10:13:06.840234 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-48c8f"] Feb 20 10:13:06 crc kubenswrapper[4962]: I0220 10:13:06.842352 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-48c8f" Feb 20 10:13:06 crc kubenswrapper[4962]: I0220 10:13:06.847492 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Feb 20 10:13:06 crc kubenswrapper[4962]: I0220 10:13:06.849210 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-48c8f"] Feb 20 10:13:06 crc kubenswrapper[4962]: I0220 10:13:06.903520 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Feb 20 10:13:07 crc kubenswrapper[4962]: I0220 10:13:07.051710 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cb73a133-7ca1-492e-ac32-fb33d6c335ba-operator-scripts\") pod \"root-account-create-update-48c8f\" (UID: \"cb73a133-7ca1-492e-ac32-fb33d6c335ba\") " pod="openstack/root-account-create-update-48c8f" Feb 20 10:13:07 crc kubenswrapper[4962]: I0220 10:13:07.051875 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6p7fr\" (UniqueName: \"kubernetes.io/projected/cb73a133-7ca1-492e-ac32-fb33d6c335ba-kube-api-access-6p7fr\") pod \"root-account-create-update-48c8f\" (UID: \"cb73a133-7ca1-492e-ac32-fb33d6c335ba\") " pod="openstack/root-account-create-update-48c8f" Feb 20 10:13:07 crc kubenswrapper[4962]: I0220 10:13:07.155253 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cb73a133-7ca1-492e-ac32-fb33d6c335ba-operator-scripts\") pod \"root-account-create-update-48c8f\" (UID: \"cb73a133-7ca1-492e-ac32-fb33d6c335ba\") " pod="openstack/root-account-create-update-48c8f" Feb 20 10:13:07 crc kubenswrapper[4962]: I0220 10:13:07.155540 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6p7fr\" (UniqueName: \"kubernetes.io/projected/cb73a133-7ca1-492e-ac32-fb33d6c335ba-kube-api-access-6p7fr\") pod \"root-account-create-update-48c8f\" (UID: \"cb73a133-7ca1-492e-ac32-fb33d6c335ba\") " pod="openstack/root-account-create-update-48c8f" Feb 20 10:13:07 crc kubenswrapper[4962]: I0220 10:13:07.158802 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cb73a133-7ca1-492e-ac32-fb33d6c335ba-operator-scripts\") pod \"root-account-create-update-48c8f\" (UID: \"cb73a133-7ca1-492e-ac32-fb33d6c335ba\") " pod="openstack/root-account-create-update-48c8f" Feb 20 10:13:07 crc kubenswrapper[4962]: I0220 10:13:07.196445 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6p7fr\" (UniqueName: \"kubernetes.io/projected/cb73a133-7ca1-492e-ac32-fb33d6c335ba-kube-api-access-6p7fr\") pod \"root-account-create-update-48c8f\" (UID: \"cb73a133-7ca1-492e-ac32-fb33d6c335ba\") " pod="openstack/root-account-create-update-48c8f" Feb 20 10:13:07 crc kubenswrapper[4962]: I0220 10:13:07.338319 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Feb 20 10:13:07 crc kubenswrapper[4962]: I0220 10:13:07.475539 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-48c8f" Feb 20 10:13:08 crc kubenswrapper[4962]: I0220 10:13:08.048216 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-48c8f"] Feb 20 10:13:08 crc kubenswrapper[4962]: I0220 10:13:08.184695 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-48c8f" event={"ID":"cb73a133-7ca1-492e-ac32-fb33d6c335ba","Type":"ContainerStarted","Data":"f7f2e02b137d5913310205dbb052550410f12f1f612581277520789b3dc50ed0"} Feb 20 10:13:09 crc kubenswrapper[4962]: I0220 10:13:09.433359 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-svsfg"] Feb 20 10:13:09 crc kubenswrapper[4962]: I0220 10:13:09.435986 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-svsfg" Feb 20 10:13:09 crc kubenswrapper[4962]: I0220 10:13:09.443517 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-svsfg"] Feb 20 10:13:09 crc kubenswrapper[4962]: I0220 10:13:09.532822 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-125a-account-create-update-bd2q8"] Feb 20 10:13:09 crc kubenswrapper[4962]: I0220 10:13:09.534032 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-125a-account-create-update-bd2q8" Feb 20 10:13:09 crc kubenswrapper[4962]: I0220 10:13:09.535860 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c97128d-8360-482e-b05b-6025d046c122-operator-scripts\") pod \"keystone-db-create-svsfg\" (UID: \"9c97128d-8360-482e-b05b-6025d046c122\") " pod="openstack/keystone-db-create-svsfg" Feb 20 10:13:09 crc kubenswrapper[4962]: I0220 10:13:09.536049 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c76wh\" (UniqueName: \"kubernetes.io/projected/9c97128d-8360-482e-b05b-6025d046c122-kube-api-access-c76wh\") pod \"keystone-db-create-svsfg\" (UID: \"9c97128d-8360-482e-b05b-6025d046c122\") " pod="openstack/keystone-db-create-svsfg" Feb 20 10:13:09 crc kubenswrapper[4962]: I0220 10:13:09.538397 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Feb 20 10:13:09 crc kubenswrapper[4962]: I0220 10:13:09.544231 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-125a-account-create-update-bd2q8"] Feb 20 10:13:09 crc kubenswrapper[4962]: I0220 10:13:09.638104 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a3d903f3-8f86-49e2-848b-4a59a9068b75-operator-scripts\") pod \"keystone-125a-account-create-update-bd2q8\" (UID: \"a3d903f3-8f86-49e2-848b-4a59a9068b75\") " pod="openstack/keystone-125a-account-create-update-bd2q8" Feb 20 10:13:09 crc kubenswrapper[4962]: I0220 10:13:09.638200 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c76wh\" (UniqueName: \"kubernetes.io/projected/9c97128d-8360-482e-b05b-6025d046c122-kube-api-access-c76wh\") pod \"keystone-db-create-svsfg\" (UID: \"9c97128d-8360-482e-b05b-6025d046c122\") " pod="openstack/keystone-db-create-svsfg" Feb 20 10:13:09 crc kubenswrapper[4962]: I0220 10:13:09.638264 4962 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c97128d-8360-482e-b05b-6025d046c122-operator-scripts\") pod \"keystone-db-create-svsfg\" (UID: \"9c97128d-8360-482e-b05b-6025d046c122\") " pod="openstack/keystone-db-create-svsfg" Feb 20 10:13:09 crc kubenswrapper[4962]: I0220 10:13:09.638325 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqhft\" (UniqueName: \"kubernetes.io/projected/a3d903f3-8f86-49e2-848b-4a59a9068b75-kube-api-access-rqhft\") pod \"keystone-125a-account-create-update-bd2q8\" (UID: \"a3d903f3-8f86-49e2-848b-4a59a9068b75\") " pod="openstack/keystone-125a-account-create-update-bd2q8" Feb 20 10:13:09 crc kubenswrapper[4962]: I0220 10:13:09.639393 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c97128d-8360-482e-b05b-6025d046c122-operator-scripts\") pod \"keystone-db-create-svsfg\" (UID: \"9c97128d-8360-482e-b05b-6025d046c122\") " pod="openstack/keystone-db-create-svsfg" Feb 20 10:13:09 crc kubenswrapper[4962]: I0220 10:13:09.650334 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-zfmzb"] Feb 20 10:13:09 crc kubenswrapper[4962]: I0220 10:13:09.651935 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-zfmzb" Feb 20 10:13:09 crc kubenswrapper[4962]: I0220 10:13:09.661740 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-zfmzb"] Feb 20 10:13:09 crc kubenswrapper[4962]: I0220 10:13:09.665681 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c76wh\" (UniqueName: \"kubernetes.io/projected/9c97128d-8360-482e-b05b-6025d046c122-kube-api-access-c76wh\") pod \"keystone-db-create-svsfg\" (UID: \"9c97128d-8360-482e-b05b-6025d046c122\") " pod="openstack/keystone-db-create-svsfg" Feb 20 10:13:09 crc kubenswrapper[4962]: I0220 10:13:09.746808 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqhft\" (UniqueName: \"kubernetes.io/projected/a3d903f3-8f86-49e2-848b-4a59a9068b75-kube-api-access-rqhft\") pod \"keystone-125a-account-create-update-bd2q8\" (UID: \"a3d903f3-8f86-49e2-848b-4a59a9068b75\") " pod="openstack/keystone-125a-account-create-update-bd2q8" Feb 20 10:13:09 crc kubenswrapper[4962]: I0220 10:13:09.756067 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a3d903f3-8f86-49e2-848b-4a59a9068b75-operator-scripts\") pod \"keystone-125a-account-create-update-bd2q8\" (UID: \"a3d903f3-8f86-49e2-848b-4a59a9068b75\") " pod="openstack/keystone-125a-account-create-update-bd2q8" Feb 20 10:13:09 crc kubenswrapper[4962]: I0220 10:13:09.761301 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a3d903f3-8f86-49e2-848b-4a59a9068b75-operator-scripts\") pod \"keystone-125a-account-create-update-bd2q8\" (UID: \"a3d903f3-8f86-49e2-848b-4a59a9068b75\") " pod="openstack/keystone-125a-account-create-update-bd2q8" Feb 20 10:13:09 crc kubenswrapper[4962]: I0220 10:13:09.761848 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-svsfg" Feb 20 10:13:09 crc kubenswrapper[4962]: I0220 10:13:09.765281 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-d8b3-account-create-update-br2xj"] Feb 20 10:13:09 crc kubenswrapper[4962]: I0220 10:13:09.767102 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-d8b3-account-create-update-br2xj" Feb 20 10:13:09 crc kubenswrapper[4962]: I0220 10:13:09.770814 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Feb 20 10:13:09 crc kubenswrapper[4962]: I0220 10:13:09.780100 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqhft\" (UniqueName: \"kubernetes.io/projected/a3d903f3-8f86-49e2-848b-4a59a9068b75-kube-api-access-rqhft\") pod \"keystone-125a-account-create-update-bd2q8\" (UID: \"a3d903f3-8f86-49e2-848b-4a59a9068b75\") " pod="openstack/keystone-125a-account-create-update-bd2q8" Feb 20 10:13:09 crc kubenswrapper[4962]: I0220 10:13:09.784810 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-d8b3-account-create-update-br2xj"] Feb 20 10:13:09 crc kubenswrapper[4962]: I0220 10:13:09.856966 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-125a-account-create-update-bd2q8" Feb 20 10:13:09 crc kubenswrapper[4962]: I0220 10:13:09.858030 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szvdf\" (UniqueName: \"kubernetes.io/projected/2b915fcc-cf15-43c3-97c6-bde3a29da796-kube-api-access-szvdf\") pod \"placement-db-create-zfmzb\" (UID: \"2b915fcc-cf15-43c3-97c6-bde3a29da796\") " pod="openstack/placement-db-create-zfmzb" Feb 20 10:13:09 crc kubenswrapper[4962]: I0220 10:13:09.858193 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b915fcc-cf15-43c3-97c6-bde3a29da796-operator-scripts\") pod \"placement-db-create-zfmzb\" (UID: \"2b915fcc-cf15-43c3-97c6-bde3a29da796\") " pod="openstack/placement-db-create-zfmzb" Feb 20 10:13:09 crc kubenswrapper[4962]: I0220 10:13:09.960313 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c7420bd-d4ef-4511-acf4-a132ad0a5677-operator-scripts\") pod \"placement-d8b3-account-create-update-br2xj\" (UID: \"7c7420bd-d4ef-4511-acf4-a132ad0a5677\") " pod="openstack/placement-d8b3-account-create-update-br2xj" Feb 20 10:13:09 crc kubenswrapper[4962]: I0220 10:13:09.961331 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48t26\" (UniqueName: \"kubernetes.io/projected/7c7420bd-d4ef-4511-acf4-a132ad0a5677-kube-api-access-48t26\") pod \"placement-d8b3-account-create-update-br2xj\" (UID: \"7c7420bd-d4ef-4511-acf4-a132ad0a5677\") " pod="openstack/placement-d8b3-account-create-update-br2xj" Feb 20 10:13:09 crc kubenswrapper[4962]: I0220 10:13:09.965050 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b915fcc-cf15-43c3-97c6-bde3a29da796-operator-scripts\") pod \"placement-db-create-zfmzb\" (UID: \"2b915fcc-cf15-43c3-97c6-bde3a29da796\") " pod="openstack/placement-db-create-zfmzb" Feb 20 10:13:09 crc kubenswrapper[4962]: I0220 
10:13:09.965162 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szvdf\" (UniqueName: \"kubernetes.io/projected/2b915fcc-cf15-43c3-97c6-bde3a29da796-kube-api-access-szvdf\") pod \"placement-db-create-zfmzb\" (UID: \"2b915fcc-cf15-43c3-97c6-bde3a29da796\") " pod="openstack/placement-db-create-zfmzb" Feb 20 10:13:09 crc kubenswrapper[4962]: I0220 10:13:09.973928 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b915fcc-cf15-43c3-97c6-bde3a29da796-operator-scripts\") pod \"placement-db-create-zfmzb\" (UID: \"2b915fcc-cf15-43c3-97c6-bde3a29da796\") " pod="openstack/placement-db-create-zfmzb" Feb 20 10:13:09 crc kubenswrapper[4962]: I0220 10:13:09.989481 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szvdf\" (UniqueName: \"kubernetes.io/projected/2b915fcc-cf15-43c3-97c6-bde3a29da796-kube-api-access-szvdf\") pod \"placement-db-create-zfmzb\" (UID: \"2b915fcc-cf15-43c3-97c6-bde3a29da796\") " pod="openstack/placement-db-create-zfmzb" Feb 20 10:13:10 crc kubenswrapper[4962]: I0220 10:13:10.067375 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c7420bd-d4ef-4511-acf4-a132ad0a5677-operator-scripts\") pod \"placement-d8b3-account-create-update-br2xj\" (UID: \"7c7420bd-d4ef-4511-acf4-a132ad0a5677\") " pod="openstack/placement-d8b3-account-create-update-br2xj" Feb 20 10:13:10 crc kubenswrapper[4962]: I0220 10:13:10.067485 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48t26\" (UniqueName: \"kubernetes.io/projected/7c7420bd-d4ef-4511-acf4-a132ad0a5677-kube-api-access-48t26\") pod \"placement-d8b3-account-create-update-br2xj\" (UID: \"7c7420bd-d4ef-4511-acf4-a132ad0a5677\") " pod="openstack/placement-d8b3-account-create-update-br2xj" Feb 20 10:13:10 crc kubenswrapper[4962]: I0220 10:13:10.068368 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c7420bd-d4ef-4511-acf4-a132ad0a5677-operator-scripts\") pod \"placement-d8b3-account-create-update-br2xj\" (UID: \"7c7420bd-d4ef-4511-acf4-a132ad0a5677\") " pod="openstack/placement-d8b3-account-create-update-br2xj" Feb 20 10:13:10 crc kubenswrapper[4962]: I0220 10:13:10.081689 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-zfmzb" Feb 20 10:13:10 crc kubenswrapper[4962]: I0220 10:13:10.099906 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48t26\" (UniqueName: \"kubernetes.io/projected/7c7420bd-d4ef-4511-acf4-a132ad0a5677-kube-api-access-48t26\") pod \"placement-d8b3-account-create-update-br2xj\" (UID: \"7c7420bd-d4ef-4511-acf4-a132ad0a5677\") " pod="openstack/placement-d8b3-account-create-update-br2xj" Feb 20 10:13:10 crc kubenswrapper[4962]: I0220 10:13:10.215058 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-d8b3-account-create-update-br2xj" Feb 20 10:13:10 crc kubenswrapper[4962]: I0220 10:13:10.237185 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-svsfg"] Feb 20 10:13:10 crc kubenswrapper[4962]: W0220 10:13:10.257185 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9c97128d_8360_482e_b05b_6025d046c122.slice/crio-5ae2f4caa17d365e36d41ef1e1e74a3badc6f13ea2ed79b58f7b208327c69c36 WatchSource:0}: Error finding container 5ae2f4caa17d365e36d41ef1e1e74a3badc6f13ea2ed79b58f7b208327c69c36: Status 404 returned error can't find the container with id 5ae2f4caa17d365e36d41ef1e1e74a3badc6f13ea2ed79b58f7b208327c69c36 Feb 20 10:13:10 crc kubenswrapper[4962]: I0220 10:13:10.383931 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-125a-account-create-update-bd2q8"] Feb 20 10:13:10 crc kubenswrapper[4962]: W0220 10:13:10.406093 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda3d903f3_8f86_49e2_848b_4a59a9068b75.slice/crio-c7f927a068005cee1578133484cff3821da23535f70c03eaefda6638fc1c3c4b WatchSource:0}: Error finding container c7f927a068005cee1578133484cff3821da23535f70c03eaefda6638fc1c3c4b: Status 404 returned error can't find the container with id c7f927a068005cee1578133484cff3821da23535f70c03eaefda6638fc1c3c4b Feb 20 10:13:10 crc kubenswrapper[4962]: I0220 10:13:10.523747 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57bdd75c-mqxqd"] Feb 20 10:13:10 crc kubenswrapper[4962]: I0220 10:13:10.524162 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57bdd75c-mqxqd" podUID="736ba007-2c6d-4f91-ae26-16ce53c580c5" containerName="dnsmasq-dns" containerID="cri-o://060fcc52874b30c1462aa08598659a751fb930b9fa898ee5d55f34214c6442bd" gracePeriod=10 Feb 20 10:13:10 crc kubenswrapper[4962]: I0220 10:13:10.528722 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57bdd75c-mqxqd" Feb 20 10:13:10 crc kubenswrapper[4962]: I0220 10:13:10.541896 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-zfmzb"] Feb 20 10:13:10 crc kubenswrapper[4962]: I0220 10:13:10.557083 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-689df5d84f-qp96t"] Feb 20 10:13:10 crc kubenswrapper[4962]: I0220 10:13:10.559026 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-689df5d84f-qp96t" Feb 20 10:13:10 crc kubenswrapper[4962]: I0220 10:13:10.579779 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-689df5d84f-qp96t"] Feb 20 10:13:10 crc kubenswrapper[4962]: I0220 10:13:10.607040 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Feb 20 10:13:10 crc kubenswrapper[4962]: I0220 10:13:10.679714 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ced7b045-00ec-453d-9a56-b13132991e8c-dns-svc\") pod \"dnsmasq-dns-689df5d84f-qp96t\" (UID: \"ced7b045-00ec-453d-9a56-b13132991e8c\") " pod="openstack/dnsmasq-dns-689df5d84f-qp96t" Feb 20 10:13:10 crc kubenswrapper[4962]: I0220 10:13:10.679804 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ced7b045-00ec-453d-9a56-b13132991e8c-ovsdbserver-sb\") pod \"dnsmasq-dns-689df5d84f-qp96t\" (UID: \"ced7b045-00ec-453d-9a56-b13132991e8c\") " pod="openstack/dnsmasq-dns-689df5d84f-qp96t" Feb 20 10:13:10 crc kubenswrapper[4962]: I0220 10:13:10.679853 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ced7b045-00ec-453d-9a56-b13132991e8c-config\") pod \"dnsmasq-dns-689df5d84f-qp96t\" (UID: \"ced7b045-00ec-453d-9a56-b13132991e8c\") " pod="openstack/dnsmasq-dns-689df5d84f-qp96t" Feb 20 10:13:10 crc kubenswrapper[4962]: I0220 10:13:10.679872 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ced7b045-00ec-453d-9a56-b13132991e8c-ovsdbserver-nb\") pod \"dnsmasq-dns-689df5d84f-qp96t\" (UID: \"ced7b045-00ec-453d-9a56-b13132991e8c\") " pod="openstack/dnsmasq-dns-689df5d84f-qp96t" Feb 20 10:13:10 crc kubenswrapper[4962]: I0220 10:13:10.679899 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7hmg\" (UniqueName: \"kubernetes.io/projected/ced7b045-00ec-453d-9a56-b13132991e8c-kube-api-access-p7hmg\") pod \"dnsmasq-dns-689df5d84f-qp96t\" (UID: \"ced7b045-00ec-453d-9a56-b13132991e8c\") " pod="openstack/dnsmasq-dns-689df5d84f-qp96t" Feb 20 10:13:10 crc kubenswrapper[4962]: I0220 10:13:10.725991 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-d8b3-account-create-update-br2xj"] Feb 20 10:13:10 crc kubenswrapper[4962]: I0220 10:13:10.781476 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ced7b045-00ec-453d-9a56-b13132991e8c-ovsdbserver-sb\") pod \"dnsmasq-dns-689df5d84f-qp96t\" (UID: \"ced7b045-00ec-453d-9a56-b13132991e8c\") " pod="openstack/dnsmasq-dns-689df5d84f-qp96t" Feb 20 10:13:10 crc kubenswrapper[4962]: I0220 10:13:10.781548 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ced7b045-00ec-453d-9a56-b13132991e8c-config\") pod \"dnsmasq-dns-689df5d84f-qp96t\" (UID: \"ced7b045-00ec-453d-9a56-b13132991e8c\") " pod="openstack/dnsmasq-dns-689df5d84f-qp96t" Feb 20 10:13:10 crc kubenswrapper[4962]: I0220 10:13:10.781572 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/ced7b045-00ec-453d-9a56-b13132991e8c-ovsdbserver-nb\") pod \"dnsmasq-dns-689df5d84f-qp96t\" (UID: \"ced7b045-00ec-453d-9a56-b13132991e8c\") " pod="openstack/dnsmasq-dns-689df5d84f-qp96t" Feb 20 10:13:10 crc kubenswrapper[4962]: I0220 10:13:10.781614 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7hmg\" (UniqueName: \"kubernetes.io/projected/ced7b045-00ec-453d-9a56-b13132991e8c-kube-api-access-p7hmg\") pod \"dnsmasq-dns-689df5d84f-qp96t\" (UID: \"ced7b045-00ec-453d-9a56-b13132991e8c\") " pod="openstack/dnsmasq-dns-689df5d84f-qp96t" Feb 20 10:13:10 crc kubenswrapper[4962]: I0220 10:13:10.781723 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ced7b045-00ec-453d-9a56-b13132991e8c-dns-svc\") pod \"dnsmasq-dns-689df5d84f-qp96t\" (UID: \"ced7b045-00ec-453d-9a56-b13132991e8c\") " pod="openstack/dnsmasq-dns-689df5d84f-qp96t" Feb 20 10:13:10 crc kubenswrapper[4962]: I0220 10:13:10.782719 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ced7b045-00ec-453d-9a56-b13132991e8c-dns-svc\") pod \"dnsmasq-dns-689df5d84f-qp96t\" (UID: \"ced7b045-00ec-453d-9a56-b13132991e8c\") " pod="openstack/dnsmasq-dns-689df5d84f-qp96t" Feb 20 10:13:10 crc kubenswrapper[4962]: I0220 10:13:10.783260 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ced7b045-00ec-453d-9a56-b13132991e8c-ovsdbserver-sb\") pod \"dnsmasq-dns-689df5d84f-qp96t\" (UID: \"ced7b045-00ec-453d-9a56-b13132991e8c\") " pod="openstack/dnsmasq-dns-689df5d84f-qp96t" Feb 20 10:13:10 crc kubenswrapper[4962]: I0220 10:13:10.784477 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ced7b045-00ec-453d-9a56-b13132991e8c-ovsdbserver-nb\") pod \"dnsmasq-dns-689df5d84f-qp96t\" (UID: \"ced7b045-00ec-453d-9a56-b13132991e8c\") " pod="openstack/dnsmasq-dns-689df5d84f-qp96t" Feb 20 10:13:10 crc kubenswrapper[4962]: I0220 10:13:10.784805 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ced7b045-00ec-453d-9a56-b13132991e8c-config\") pod \"dnsmasq-dns-689df5d84f-qp96t\" (UID: \"ced7b045-00ec-453d-9a56-b13132991e8c\") " pod="openstack/dnsmasq-dns-689df5d84f-qp96t" Feb 20 10:13:10 crc kubenswrapper[4962]: I0220 10:13:10.830554 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7hmg\" (UniqueName: \"kubernetes.io/projected/ced7b045-00ec-453d-9a56-b13132991e8c-kube-api-access-p7hmg\") pod \"dnsmasq-dns-689df5d84f-qp96t\" (UID: \"ced7b045-00ec-453d-9a56-b13132991e8c\") " pod="openstack/dnsmasq-dns-689df5d84f-qp96t" Feb 20 10:13:10 crc kubenswrapper[4962]: I0220 10:13:10.899839 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-689df5d84f-qp96t" Feb 20 10:13:11 crc kubenswrapper[4962]: I0220 10:13:11.219161 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-zfmzb" event={"ID":"2b915fcc-cf15-43c3-97c6-bde3a29da796","Type":"ContainerStarted","Data":"b6c450bcf92382b603afb108ffce51277656e7c13205fe6efdfb5f56cb4f3fad"} Feb 20 10:13:11 crc kubenswrapper[4962]: I0220 10:13:11.220852 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-125a-account-create-update-bd2q8" event={"ID":"a3d903f3-8f86-49e2-848b-4a59a9068b75","Type":"ContainerStarted","Data":"c7f927a068005cee1578133484cff3821da23535f70c03eaefda6638fc1c3c4b"} Feb 20 10:13:11 crc kubenswrapper[4962]: I0220 10:13:11.221814 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-svsfg" event={"ID":"9c97128d-8360-482e-b05b-6025d046c122","Type":"ContainerStarted","Data":"5ae2f4caa17d365e36d41ef1e1e74a3badc6f13ea2ed79b58f7b208327c69c36"} Feb 20 10:13:11 crc kubenswrapper[4962]: I0220 10:13:11.222624 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d8b3-account-create-update-br2xj" event={"ID":"7c7420bd-d4ef-4511-acf4-a132ad0a5677","Type":"ContainerStarted","Data":"b2785a54fb58f00cffa05ba7e64b052a132025e8e4e4c971af47565aa7808a85"} Feb 20 10:13:11 crc kubenswrapper[4962]: I0220 10:13:11.372166 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-689df5d84f-qp96t"] Feb 20 10:13:11 crc kubenswrapper[4962]: W0220 10:13:11.381776 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podced7b045_00ec_453d_9a56_b13132991e8c.slice/crio-e9818b1a3f9f3197b15f5f2de8df4aeac94c0b0051e4206684bcec3fc52e8885 WatchSource:0}: Error finding container e9818b1a3f9f3197b15f5f2de8df4aeac94c0b0051e4206684bcec3fc52e8885: Status 404 returned error can't find the container with id e9818b1a3f9f3197b15f5f2de8df4aeac94c0b0051e4206684bcec3fc52e8885 Feb 20 10:13:11 crc kubenswrapper[4962]: I0220 10:13:11.507886 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 10:13:11 crc kubenswrapper[4962]: I0220 10:13:11.507965 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 10:13:11 crc kubenswrapper[4962]: I0220 10:13:11.726899 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Feb 20 10:13:11 crc kubenswrapper[4962]: I0220 10:13:11.736918 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Feb 20 10:13:11 crc kubenswrapper[4962]: I0220 10:13:11.756493 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Feb 20 10:13:11 crc kubenswrapper[4962]: I0220 10:13:11.757103 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Feb 20 10:13:11 crc kubenswrapper[4962]: I0220 10:13:11.757402 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-fvgjk" Feb 20 10:13:11 crc kubenswrapper[4962]: I0220 10:13:11.760288 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Feb 20 10:13:11 crc kubenswrapper[4962]: I0220 10:13:11.780265 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Feb 20 10:13:11 crc kubenswrapper[4962]: I0220 10:13:11.913176 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/f4fb3b99-0e02-4c5c-9704-884ea3f0605d-lock\") pod \"swift-storage-0\" (UID: \"f4fb3b99-0e02-4c5c-9704-884ea3f0605d\") " pod="openstack/swift-storage-0" Feb 20 10:13:11 crc kubenswrapper[4962]: I0220 10:13:11.913251 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"f4fb3b99-0e02-4c5c-9704-884ea3f0605d\") " pod="openstack/swift-storage-0" Feb 20 10:13:11 crc kubenswrapper[4962]: I0220 10:13:11.913651 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57kkn\" (UniqueName: \"kubernetes.io/projected/f4fb3b99-0e02-4c5c-9704-884ea3f0605d-kube-api-access-57kkn\") pod \"swift-storage-0\" (UID: \"f4fb3b99-0e02-4c5c-9704-884ea3f0605d\") " pod="openstack/swift-storage-0" Feb 20 10:13:11 crc kubenswrapper[4962]: I0220 10:13:11.913756 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4fb3b99-0e02-4c5c-9704-884ea3f0605d-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"f4fb3b99-0e02-4c5c-9704-884ea3f0605d\") " pod="openstack/swift-storage-0" Feb 20 10:13:11 crc kubenswrapper[4962]: I0220 10:13:11.913805 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/f4fb3b99-0e02-4c5c-9704-884ea3f0605d-cache\") pod \"swift-storage-0\" (UID: \"f4fb3b99-0e02-4c5c-9704-884ea3f0605d\") " pod="openstack/swift-storage-0" Feb 20 10:13:11 crc kubenswrapper[4962]: I0220 10:13:11.913846 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f4fb3b99-0e02-4c5c-9704-884ea3f0605d-etc-swift\") pod \"swift-storage-0\" (UID: \"f4fb3b99-0e02-4c5c-9704-884ea3f0605d\") " pod="openstack/swift-storage-0" Feb 20 10:13:11 crc kubenswrapper[4962]: I0220 10:13:11.968935 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-9mznb"] Feb 20 10:13:11 crc kubenswrapper[4962]: I0220 10:13:11.971775 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-9mznb" Feb 20 10:13:11 crc kubenswrapper[4962]: I0220 10:13:11.974311 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Feb 20 10:13:11 crc kubenswrapper[4962]: I0220 10:13:11.975070 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Feb 20 10:13:11 crc kubenswrapper[4962]: I0220 10:13:11.975207 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Feb 20 10:13:11 crc kubenswrapper[4962]: I0220 10:13:11.981603 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-9mznb"] Feb 20 10:13:12 crc kubenswrapper[4962]: I0220 10:13:12.017119 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/f4fb3b99-0e02-4c5c-9704-884ea3f0605d-lock\") pod \"swift-storage-0\" (UID: \"f4fb3b99-0e02-4c5c-9704-884ea3f0605d\") " pod="openstack/swift-storage-0" Feb 20 10:13:12 crc kubenswrapper[4962]: I0220 10:13:12.016227 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/f4fb3b99-0e02-4c5c-9704-884ea3f0605d-lock\") pod \"swift-storage-0\" (UID: \"f4fb3b99-0e02-4c5c-9704-884ea3f0605d\") " pod="openstack/swift-storage-0" Feb 20 10:13:12 crc kubenswrapper[4962]: I0220 10:13:12.017392 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"f4fb3b99-0e02-4c5c-9704-884ea3f0605d\") " pod="openstack/swift-storage-0" Feb 20 10:13:12 crc kubenswrapper[4962]: I0220 10:13:12.017629 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57kkn\" (UniqueName: \"kubernetes.io/projected/f4fb3b99-0e02-4c5c-9704-884ea3f0605d-kube-api-access-57kkn\") pod \"swift-storage-0\" (UID: \"f4fb3b99-0e02-4c5c-9704-884ea3f0605d\") " pod="openstack/swift-storage-0" Feb 20 10:13:12 crc kubenswrapper[4962]: I0220 10:13:12.017715 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4fb3b99-0e02-4c5c-9704-884ea3f0605d-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"f4fb3b99-0e02-4c5c-9704-884ea3f0605d\") " pod="openstack/swift-storage-0" Feb 20 10:13:12 crc kubenswrapper[4962]: I0220 10:13:12.017765 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/f4fb3b99-0e02-4c5c-9704-884ea3f0605d-cache\") pod \"swift-storage-0\" (UID: \"f4fb3b99-0e02-4c5c-9704-884ea3f0605d\") " pod="openstack/swift-storage-0" Feb 20 10:13:12 crc kubenswrapper[4962]: I0220 10:13:12.017808 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f4fb3b99-0e02-4c5c-9704-884ea3f0605d-etc-swift\") pod \"swift-storage-0\" (UID: \"f4fb3b99-0e02-4c5c-9704-884ea3f0605d\") " pod="openstack/swift-storage-0" Feb 20 10:13:12 crc kubenswrapper[4962]: I0220 10:13:12.018157 4962 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"f4fb3b99-0e02-4c5c-9704-884ea3f0605d\") device mount path \"/mnt/openstack/pv07\"" 
pod="openstack/swift-storage-0" Feb 20 10:13:12 crc kubenswrapper[4962]: E0220 10:13:12.018236 4962 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 20 10:13:12 crc kubenswrapper[4962]: E0220 10:13:12.018255 4962 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 20 10:13:12 crc kubenswrapper[4962]: E0220 10:13:12.018316 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f4fb3b99-0e02-4c5c-9704-884ea3f0605d-etc-swift podName:f4fb3b99-0e02-4c5c-9704-884ea3f0605d nodeName:}" failed. No retries permitted until 2026-02-20 10:13:12.518296295 +0000 UTC m=+1084.100768141 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/f4fb3b99-0e02-4c5c-9704-884ea3f0605d-etc-swift") pod "swift-storage-0" (UID: "f4fb3b99-0e02-4c5c-9704-884ea3f0605d") : configmap "swift-ring-files" not found Feb 20 10:13:12 crc kubenswrapper[4962]: I0220 10:13:12.018338 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/f4fb3b99-0e02-4c5c-9704-884ea3f0605d-cache\") pod \"swift-storage-0\" (UID: \"f4fb3b99-0e02-4c5c-9704-884ea3f0605d\") " pod="openstack/swift-storage-0" Feb 20 10:13:12 crc kubenswrapper[4962]: I0220 10:13:12.023068 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4fb3b99-0e02-4c5c-9704-884ea3f0605d-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"f4fb3b99-0e02-4c5c-9704-884ea3f0605d\") " pod="openstack/swift-storage-0" Feb 20 10:13:12 crc kubenswrapper[4962]: I0220 10:13:12.038343 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57kkn\" (UniqueName: \"kubernetes.io/projected/f4fb3b99-0e02-4c5c-9704-884ea3f0605d-kube-api-access-57kkn\") pod \"swift-storage-0\" (UID: \"f4fb3b99-0e02-4c5c-9704-884ea3f0605d\") " pod="openstack/swift-storage-0" Feb 20 10:13:12 crc kubenswrapper[4962]: I0220 10:13:12.062553 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"f4fb3b99-0e02-4c5c-9704-884ea3f0605d\") " pod="openstack/swift-storage-0" Feb 20 10:13:12 crc kubenswrapper[4962]: I0220 10:13:12.120267 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2e7338a7-4012-439d-b961-6ca0c55dd6e6-etc-swift\") pod \"swift-ring-rebalance-9mznb\" (UID: \"2e7338a7-4012-439d-b961-6ca0c55dd6e6\") " pod="openstack/swift-ring-rebalance-9mznb" Feb 20 10:13:12 crc kubenswrapper[4962]: I0220 10:13:12.120340 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnmc5\" (UniqueName: \"kubernetes.io/projected/2e7338a7-4012-439d-b961-6ca0c55dd6e6-kube-api-access-dnmc5\") pod \"swift-ring-rebalance-9mznb\" (UID: \"2e7338a7-4012-439d-b961-6ca0c55dd6e6\") " pod="openstack/swift-ring-rebalance-9mznb" Feb 20 10:13:12 crc kubenswrapper[4962]: I0220 10:13:12.120408 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2e7338a7-4012-439d-b961-6ca0c55dd6e6-swiftconf\") pod \"swift-ring-rebalance-9mznb\" 
(UID: \"2e7338a7-4012-439d-b961-6ca0c55dd6e6\") " pod="openstack/swift-ring-rebalance-9mznb" Feb 20 10:13:12 crc kubenswrapper[4962]: I0220 10:13:12.120472 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2e7338a7-4012-439d-b961-6ca0c55dd6e6-scripts\") pod \"swift-ring-rebalance-9mznb\" (UID: \"2e7338a7-4012-439d-b961-6ca0c55dd6e6\") " pod="openstack/swift-ring-rebalance-9mznb" Feb 20 10:13:12 crc kubenswrapper[4962]: I0220 10:13:12.120496 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2e7338a7-4012-439d-b961-6ca0c55dd6e6-ring-data-devices\") pod \"swift-ring-rebalance-9mznb\" (UID: \"2e7338a7-4012-439d-b961-6ca0c55dd6e6\") " pod="openstack/swift-ring-rebalance-9mznb" Feb 20 10:13:12 crc kubenswrapper[4962]: I0220 10:13:12.120730 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e7338a7-4012-439d-b961-6ca0c55dd6e6-combined-ca-bundle\") pod \"swift-ring-rebalance-9mznb\" (UID: \"2e7338a7-4012-439d-b961-6ca0c55dd6e6\") " pod="openstack/swift-ring-rebalance-9mznb" Feb 20 10:13:12 crc kubenswrapper[4962]: I0220 10:13:12.120767 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2e7338a7-4012-439d-b961-6ca0c55dd6e6-dispersionconf\") pod \"swift-ring-rebalance-9mznb\" (UID: \"2e7338a7-4012-439d-b961-6ca0c55dd6e6\") " pod="openstack/swift-ring-rebalance-9mznb" Feb 20 10:13:12 crc kubenswrapper[4962]: I0220 10:13:12.223331 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2e7338a7-4012-439d-b961-6ca0c55dd6e6-scripts\") pod \"swift-ring-rebalance-9mznb\" (UID: \"2e7338a7-4012-439d-b961-6ca0c55dd6e6\") " pod="openstack/swift-ring-rebalance-9mznb" Feb 20 10:13:12 crc kubenswrapper[4962]: I0220 10:13:12.223419 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2e7338a7-4012-439d-b961-6ca0c55dd6e6-ring-data-devices\") pod \"swift-ring-rebalance-9mznb\" (UID: \"2e7338a7-4012-439d-b961-6ca0c55dd6e6\") " pod="openstack/swift-ring-rebalance-9mznb" Feb 20 10:13:12 crc kubenswrapper[4962]: I0220 10:13:12.223538 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e7338a7-4012-439d-b961-6ca0c55dd6e6-combined-ca-bundle\") pod \"swift-ring-rebalance-9mznb\" (UID: \"2e7338a7-4012-439d-b961-6ca0c55dd6e6\") " pod="openstack/swift-ring-rebalance-9mznb" Feb 20 10:13:12 crc kubenswrapper[4962]: I0220 10:13:12.223635 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2e7338a7-4012-439d-b961-6ca0c55dd6e6-dispersionconf\") pod \"swift-ring-rebalance-9mznb\" (UID: \"2e7338a7-4012-439d-b961-6ca0c55dd6e6\") " pod="openstack/swift-ring-rebalance-9mznb" Feb 20 10:13:12 crc kubenswrapper[4962]: I0220 10:13:12.223803 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2e7338a7-4012-439d-b961-6ca0c55dd6e6-etc-swift\") pod \"swift-ring-rebalance-9mznb\" (UID: 
\"2e7338a7-4012-439d-b961-6ca0c55dd6e6\") " pod="openstack/swift-ring-rebalance-9mznb" Feb 20 10:13:12 crc kubenswrapper[4962]: I0220 10:13:12.223857 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnmc5\" (UniqueName: \"kubernetes.io/projected/2e7338a7-4012-439d-b961-6ca0c55dd6e6-kube-api-access-dnmc5\") pod \"swift-ring-rebalance-9mznb\" (UID: \"2e7338a7-4012-439d-b961-6ca0c55dd6e6\") " pod="openstack/swift-ring-rebalance-9mznb" Feb 20 10:13:12 crc kubenswrapper[4962]: I0220 10:13:12.223964 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2e7338a7-4012-439d-b961-6ca0c55dd6e6-swiftconf\") pod \"swift-ring-rebalance-9mznb\" (UID: \"2e7338a7-4012-439d-b961-6ca0c55dd6e6\") " pod="openstack/swift-ring-rebalance-9mznb" Feb 20 10:13:12 crc kubenswrapper[4962]: I0220 10:13:12.238347 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-48c8f" event={"ID":"cb73a133-7ca1-492e-ac32-fb33d6c335ba","Type":"ContainerStarted","Data":"1b60442fa3cb970cd1e3424fd12f2f5e98e959daa205ca4e27a8e01da1487e66"} Feb 20 10:13:12 crc kubenswrapper[4962]: I0220 10:13:12.240058 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-689df5d84f-qp96t" event={"ID":"ced7b045-00ec-453d-9a56-b13132991e8c","Type":"ContainerStarted","Data":"e9818b1a3f9f3197b15f5f2de8df4aeac94c0b0051e4206684bcec3fc52e8885"} Feb 20 10:13:12 crc kubenswrapper[4962]: I0220 10:13:12.254477 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2e7338a7-4012-439d-b961-6ca0c55dd6e6-dispersionconf\") pod \"swift-ring-rebalance-9mznb\" (UID: \"2e7338a7-4012-439d-b961-6ca0c55dd6e6\") " pod="openstack/swift-ring-rebalance-9mznb" Feb 20 10:13:12 crc kubenswrapper[4962]: I0220 10:13:12.254681 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2e7338a7-4012-439d-b961-6ca0c55dd6e6-etc-swift\") pod \"swift-ring-rebalance-9mznb\" (UID: \"2e7338a7-4012-439d-b961-6ca0c55dd6e6\") " pod="openstack/swift-ring-rebalance-9mznb" Feb 20 10:13:12 crc kubenswrapper[4962]: I0220 10:13:12.255252 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2e7338a7-4012-439d-b961-6ca0c55dd6e6-scripts\") pod \"swift-ring-rebalance-9mznb\" (UID: \"2e7338a7-4012-439d-b961-6ca0c55dd6e6\") " pod="openstack/swift-ring-rebalance-9mznb" Feb 20 10:13:12 crc kubenswrapper[4962]: I0220 10:13:12.255519 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2e7338a7-4012-439d-b961-6ca0c55dd6e6-ring-data-devices\") pod \"swift-ring-rebalance-9mznb\" (UID: \"2e7338a7-4012-439d-b961-6ca0c55dd6e6\") " pod="openstack/swift-ring-rebalance-9mznb" Feb 20 10:13:12 crc kubenswrapper[4962]: I0220 10:13:12.256055 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e7338a7-4012-439d-b961-6ca0c55dd6e6-combined-ca-bundle\") pod \"swift-ring-rebalance-9mznb\" (UID: \"2e7338a7-4012-439d-b961-6ca0c55dd6e6\") " pod="openstack/swift-ring-rebalance-9mznb" Feb 20 10:13:12 crc kubenswrapper[4962]: I0220 10:13:12.257789 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/2e7338a7-4012-439d-b961-6ca0c55dd6e6-swiftconf\") pod \"swift-ring-rebalance-9mznb\" (UID: \"2e7338a7-4012-439d-b961-6ca0c55dd6e6\") " pod="openstack/swift-ring-rebalance-9mznb" Feb 20 10:13:12 crc kubenswrapper[4962]: I0220 10:13:12.258770 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnmc5\" (UniqueName: \"kubernetes.io/projected/2e7338a7-4012-439d-b961-6ca0c55dd6e6-kube-api-access-dnmc5\") pod \"swift-ring-rebalance-9mznb\" (UID: \"2e7338a7-4012-439d-b961-6ca0c55dd6e6\") " pod="openstack/swift-ring-rebalance-9mznb" Feb 20 10:13:12 crc kubenswrapper[4962]: I0220 10:13:12.292981 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-9mznb" Feb 20 10:13:12 crc kubenswrapper[4962]: I0220 10:13:12.531773 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f4fb3b99-0e02-4c5c-9704-884ea3f0605d-etc-swift\") pod \"swift-storage-0\" (UID: \"f4fb3b99-0e02-4c5c-9704-884ea3f0605d\") " pod="openstack/swift-storage-0" Feb 20 10:13:12 crc kubenswrapper[4962]: E0220 10:13:12.532071 4962 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 20 10:13:12 crc kubenswrapper[4962]: E0220 10:13:12.532276 4962 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 20 10:13:12 crc kubenswrapper[4962]: E0220 10:13:12.532349 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f4fb3b99-0e02-4c5c-9704-884ea3f0605d-etc-swift podName:f4fb3b99-0e02-4c5c-9704-884ea3f0605d nodeName:}" failed. No retries permitted until 2026-02-20 10:13:13.532325272 +0000 UTC m=+1085.114797128 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/f4fb3b99-0e02-4c5c-9704-884ea3f0605d-etc-swift") pod "swift-storage-0" (UID: "f4fb3b99-0e02-4c5c-9704-884ea3f0605d") : configmap "swift-ring-files" not found Feb 20 10:13:12 crc kubenswrapper[4962]: I0220 10:13:12.856567 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-9mznb"] Feb 20 10:13:12 crc kubenswrapper[4962]: I0220 10:13:12.860878 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-57bdd75c-mqxqd" podUID="736ba007-2c6d-4f91-ae26-16ce53c580c5" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.112:5353: connect: connection refused" Feb 20 10:13:13 crc kubenswrapper[4962]: I0220 10:13:13.254310 4962 generic.go:334] "Generic (PLEG): container finished" podID="ced7b045-00ec-453d-9a56-b13132991e8c" containerID="7ec00e9b2989f478e117f8d08060d562a11edc343ce310e24fa477348d6aca1b" exitCode=0 Feb 20 10:13:13 crc kubenswrapper[4962]: I0220 10:13:13.254378 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-689df5d84f-qp96t" event={"ID":"ced7b045-00ec-453d-9a56-b13132991e8c","Type":"ContainerDied","Data":"7ec00e9b2989f478e117f8d08060d562a11edc343ce310e24fa477348d6aca1b"} Feb 20 10:13:13 crc kubenswrapper[4962]: I0220 10:13:13.258306 4962 generic.go:334] "Generic (PLEG): container finished" podID="9c97128d-8360-482e-b05b-6025d046c122" containerID="8e1e57cd49c915d1862d936053074b6280af762ac9dd3bf4c1c80c561fca009f" exitCode=0 Feb 20 10:13:13 crc kubenswrapper[4962]: I0220 10:13:13.258389 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-svsfg" event={"ID":"9c97128d-8360-482e-b05b-6025d046c122","Type":"ContainerDied","Data":"8e1e57cd49c915d1862d936053074b6280af762ac9dd3bf4c1c80c561fca009f"} Feb 20 10:13:13 crc kubenswrapper[4962]: I0220 10:13:13.265483 4962 generic.go:334] "Generic (PLEG): container finished" podID="736ba007-2c6d-4f91-ae26-16ce53c580c5" containerID="060fcc52874b30c1462aa08598659a751fb930b9fa898ee5d55f34214c6442bd" exitCode=0 Feb 20 10:13:13 crc kubenswrapper[4962]: I0220 10:13:13.265596 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57bdd75c-mqxqd" event={"ID":"736ba007-2c6d-4f91-ae26-16ce53c580c5","Type":"ContainerDied","Data":"060fcc52874b30c1462aa08598659a751fb930b9fa898ee5d55f34214c6442bd"} Feb 20 10:13:13 crc kubenswrapper[4962]: I0220 10:13:13.271649 4962 generic.go:334] "Generic (PLEG): container finished" podID="7c7420bd-d4ef-4511-acf4-a132ad0a5677" containerID="109a3b4f30138b426060ee3960875f54b8e50460794fa326f4252e9233232cac" exitCode=0 Feb 20 10:13:13 crc kubenswrapper[4962]: I0220 10:13:13.271740 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d8b3-account-create-update-br2xj" event={"ID":"7c7420bd-d4ef-4511-acf4-a132ad0a5677","Type":"ContainerDied","Data":"109a3b4f30138b426060ee3960875f54b8e50460794fa326f4252e9233232cac"} Feb 20 10:13:13 crc kubenswrapper[4962]: I0220 10:13:13.278751 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-9mznb" event={"ID":"2e7338a7-4012-439d-b961-6ca0c55dd6e6","Type":"ContainerStarted","Data":"8c010bf1bf294a17a678e18297fc5c3ac174eb1389754dd15038dc4e26f7804b"} Feb 20 10:13:13 crc kubenswrapper[4962]: I0220 10:13:13.291316 4962 generic.go:334] "Generic (PLEG): container finished" podID="cb73a133-7ca1-492e-ac32-fb33d6c335ba" 
containerID="1b60442fa3cb970cd1e3424fd12f2f5e98e959daa205ca4e27a8e01da1487e66" exitCode=0 Feb 20 10:13:13 crc kubenswrapper[4962]: I0220 10:13:13.291401 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-48c8f" event={"ID":"cb73a133-7ca1-492e-ac32-fb33d6c335ba","Type":"ContainerDied","Data":"1b60442fa3cb970cd1e3424fd12f2f5e98e959daa205ca4e27a8e01da1487e66"} Feb 20 10:13:13 crc kubenswrapper[4962]: I0220 10:13:13.297031 4962 generic.go:334] "Generic (PLEG): container finished" podID="2b915fcc-cf15-43c3-97c6-bde3a29da796" containerID="a637cdafdb841809ec5f95151c668e1d8c78d29aabd8a60383f137a82dcb2009" exitCode=0 Feb 20 10:13:13 crc kubenswrapper[4962]: I0220 10:13:13.297116 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-zfmzb" event={"ID":"2b915fcc-cf15-43c3-97c6-bde3a29da796","Type":"ContainerDied","Data":"a637cdafdb841809ec5f95151c668e1d8c78d29aabd8a60383f137a82dcb2009"} Feb 20 10:13:13 crc kubenswrapper[4962]: I0220 10:13:13.304114 4962 generic.go:334] "Generic (PLEG): container finished" podID="a3d903f3-8f86-49e2-848b-4a59a9068b75" containerID="1dbe5f8319feef22f1ef43626823510cfe8e71d6a8d49cafca70087ce33b1b60" exitCode=0 Feb 20 10:13:13 crc kubenswrapper[4962]: I0220 10:13:13.304211 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-125a-account-create-update-bd2q8" event={"ID":"a3d903f3-8f86-49e2-848b-4a59a9068b75","Type":"ContainerDied","Data":"1dbe5f8319feef22f1ef43626823510cfe8e71d6a8d49cafca70087ce33b1b60"} Feb 20 10:13:13 crc kubenswrapper[4962]: I0220 10:13:13.403816 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-75b7bcc64f-ms8hz" Feb 20 10:13:13 crc kubenswrapper[4962]: I0220 10:13:13.555479 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f4fb3b99-0e02-4c5c-9704-884ea3f0605d-etc-swift\") pod \"swift-storage-0\" (UID: \"f4fb3b99-0e02-4c5c-9704-884ea3f0605d\") " pod="openstack/swift-storage-0" Feb 20 10:13:13 crc kubenswrapper[4962]: E0220 10:13:13.556301 4962 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 20 10:13:13 crc kubenswrapper[4962]: E0220 10:13:13.556367 4962 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 20 10:13:13 crc kubenswrapper[4962]: E0220 10:13:13.556454 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f4fb3b99-0e02-4c5c-9704-884ea3f0605d-etc-swift podName:f4fb3b99-0e02-4c5c-9704-884ea3f0605d nodeName:}" failed. No retries permitted until 2026-02-20 10:13:15.556417583 +0000 UTC m=+1087.138889439 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/f4fb3b99-0e02-4c5c-9704-884ea3f0605d-etc-swift") pod "swift-storage-0" (UID: "f4fb3b99-0e02-4c5c-9704-884ea3f0605d") : configmap "swift-ring-files" not found Feb 20 10:13:13 crc kubenswrapper[4962]: I0220 10:13:13.659811 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-2m8r7"] Feb 20 10:13:13 crc kubenswrapper[4962]: I0220 10:13:13.662639 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-2m8r7" Feb 20 10:13:13 crc kubenswrapper[4962]: I0220 10:13:13.668052 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-2m8r7"] Feb 20 10:13:13 crc kubenswrapper[4962]: I0220 10:13:13.765642 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9559\" (UniqueName: \"kubernetes.io/projected/0275d40a-1206-4eb2-96c8-6c516c57bed7-kube-api-access-t9559\") pod \"glance-db-create-2m8r7\" (UID: \"0275d40a-1206-4eb2-96c8-6c516c57bed7\") " pod="openstack/glance-db-create-2m8r7" Feb 20 10:13:13 crc kubenswrapper[4962]: I0220 10:13:13.765719 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0275d40a-1206-4eb2-96c8-6c516c57bed7-operator-scripts\") pod \"glance-db-create-2m8r7\" (UID: \"0275d40a-1206-4eb2-96c8-6c516c57bed7\") " pod="openstack/glance-db-create-2m8r7" Feb 20 10:13:13 crc kubenswrapper[4962]: I0220 10:13:13.784918 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-7588-account-create-update-6ttfz"] Feb 20 10:13:13 crc kubenswrapper[4962]: I0220 10:13:13.786256 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-7588-account-create-update-6ttfz" Feb 20 10:13:13 crc kubenswrapper[4962]: I0220 10:13:13.788885 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Feb 20 10:13:13 crc kubenswrapper[4962]: I0220 10:13:13.791405 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57bdd75c-mqxqd" Feb 20 10:13:13 crc kubenswrapper[4962]: I0220 10:13:13.795894 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-7588-account-create-update-6ttfz"] Feb 20 10:13:13 crc kubenswrapper[4962]: I0220 10:13:13.868323 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9559\" (UniqueName: \"kubernetes.io/projected/0275d40a-1206-4eb2-96c8-6c516c57bed7-kube-api-access-t9559\") pod \"glance-db-create-2m8r7\" (UID: \"0275d40a-1206-4eb2-96c8-6c516c57bed7\") " pod="openstack/glance-db-create-2m8r7" Feb 20 10:13:13 crc kubenswrapper[4962]: I0220 10:13:13.868404 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0275d40a-1206-4eb2-96c8-6c516c57bed7-operator-scripts\") pod \"glance-db-create-2m8r7\" (UID: \"0275d40a-1206-4eb2-96c8-6c516c57bed7\") " pod="openstack/glance-db-create-2m8r7" Feb 20 10:13:13 crc kubenswrapper[4962]: I0220 10:13:13.869453 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0275d40a-1206-4eb2-96c8-6c516c57bed7-operator-scripts\") pod \"glance-db-create-2m8r7\" (UID: \"0275d40a-1206-4eb2-96c8-6c516c57bed7\") " pod="openstack/glance-db-create-2m8r7" Feb 20 10:13:13 crc kubenswrapper[4962]: I0220 10:13:13.902419 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9559\" (UniqueName: \"kubernetes.io/projected/0275d40a-1206-4eb2-96c8-6c516c57bed7-kube-api-access-t9559\") pod \"glance-db-create-2m8r7\" (UID: \"0275d40a-1206-4eb2-96c8-6c516c57bed7\") " pod="openstack/glance-db-create-2m8r7" Feb 20 10:13:13 crc kubenswrapper[4962]: I0220 10:13:13.971897 4962 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/736ba007-2c6d-4f91-ae26-16ce53c580c5-ovsdbserver-nb\") pod \"736ba007-2c6d-4f91-ae26-16ce53c580c5\" (UID: \"736ba007-2c6d-4f91-ae26-16ce53c580c5\") " Feb 20 10:13:13 crc kubenswrapper[4962]: I0220 10:13:13.972525 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/736ba007-2c6d-4f91-ae26-16ce53c580c5-dns-svc\") pod \"736ba007-2c6d-4f91-ae26-16ce53c580c5\" (UID: \"736ba007-2c6d-4f91-ae26-16ce53c580c5\") " Feb 20 10:13:13 crc kubenswrapper[4962]: I0220 10:13:13.972667 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/736ba007-2c6d-4f91-ae26-16ce53c580c5-config\") pod \"736ba007-2c6d-4f91-ae26-16ce53c580c5\" (UID: \"736ba007-2c6d-4f91-ae26-16ce53c580c5\") " Feb 20 10:13:13 crc kubenswrapper[4962]: I0220 10:13:13.972865 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cn4cm\" (UniqueName: \"kubernetes.io/projected/736ba007-2c6d-4f91-ae26-16ce53c580c5-kube-api-access-cn4cm\") pod \"736ba007-2c6d-4f91-ae26-16ce53c580c5\" (UID: \"736ba007-2c6d-4f91-ae26-16ce53c580c5\") " Feb 20 10:13:13 crc kubenswrapper[4962]: I0220 10:13:13.973655 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7c2xf\" (UniqueName: \"kubernetes.io/projected/afbf9dd3-3bb5-4908-aad0-d06f09946e17-kube-api-access-7c2xf\") pod \"glance-7588-account-create-update-6ttfz\" (UID: \"afbf9dd3-3bb5-4908-aad0-d06f09946e17\") " pod="openstack/glance-7588-account-create-update-6ttfz" Feb 20 10:13:13 crc kubenswrapper[4962]: I0220 10:13:13.973750 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/afbf9dd3-3bb5-4908-aad0-d06f09946e17-operator-scripts\") pod \"glance-7588-account-create-update-6ttfz\" (UID: \"afbf9dd3-3bb5-4908-aad0-d06f09946e17\") " pod="openstack/glance-7588-account-create-update-6ttfz" Feb 20 10:13:13 crc kubenswrapper[4962]: I0220 10:13:13.977881 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/736ba007-2c6d-4f91-ae26-16ce53c580c5-kube-api-access-cn4cm" (OuterVolumeSpecName: "kube-api-access-cn4cm") pod "736ba007-2c6d-4f91-ae26-16ce53c580c5" (UID: "736ba007-2c6d-4f91-ae26-16ce53c580c5"). InnerVolumeSpecName "kube-api-access-cn4cm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:13:14 crc kubenswrapper[4962]: I0220 10:13:14.008059 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-2m8r7" Feb 20 10:13:14 crc kubenswrapper[4962]: I0220 10:13:14.016793 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/736ba007-2c6d-4f91-ae26-16ce53c580c5-config" (OuterVolumeSpecName: "config") pod "736ba007-2c6d-4f91-ae26-16ce53c580c5" (UID: "736ba007-2c6d-4f91-ae26-16ce53c580c5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:13:14 crc kubenswrapper[4962]: I0220 10:13:14.017283 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/736ba007-2c6d-4f91-ae26-16ce53c580c5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "736ba007-2c6d-4f91-ae26-16ce53c580c5" (UID: "736ba007-2c6d-4f91-ae26-16ce53c580c5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:13:14 crc kubenswrapper[4962]: E0220 10:13:14.017436 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/736ba007-2c6d-4f91-ae26-16ce53c580c5-ovsdbserver-nb podName:736ba007-2c6d-4f91-ae26-16ce53c580c5 nodeName:}" failed. No retries permitted until 2026-02-20 10:13:14.517405519 +0000 UTC m=+1086.099877365 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "ovsdbserver-nb" (UniqueName: "kubernetes.io/configmap/736ba007-2c6d-4f91-ae26-16ce53c580c5-ovsdbserver-nb") pod "736ba007-2c6d-4f91-ae26-16ce53c580c5" (UID: "736ba007-2c6d-4f91-ae26-16ce53c580c5") : error deleting /var/lib/kubelet/pods/736ba007-2c6d-4f91-ae26-16ce53c580c5/volume-subpaths: remove /var/lib/kubelet/pods/736ba007-2c6d-4f91-ae26-16ce53c580c5/volume-subpaths: no such file or directory Feb 20 10:13:14 crc kubenswrapper[4962]: I0220 10:13:14.075832 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7c2xf\" (UniqueName: \"kubernetes.io/projected/afbf9dd3-3bb5-4908-aad0-d06f09946e17-kube-api-access-7c2xf\") pod \"glance-7588-account-create-update-6ttfz\" (UID: \"afbf9dd3-3bb5-4908-aad0-d06f09946e17\") " pod="openstack/glance-7588-account-create-update-6ttfz" Feb 20 10:13:14 crc kubenswrapper[4962]: I0220 10:13:14.076070 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/afbf9dd3-3bb5-4908-aad0-d06f09946e17-operator-scripts\") pod \"glance-7588-account-create-update-6ttfz\" (UID: \"afbf9dd3-3bb5-4908-aad0-d06f09946e17\") " pod="openstack/glance-7588-account-create-update-6ttfz" Feb 20 10:13:14 crc kubenswrapper[4962]: I0220 10:13:14.076327 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/736ba007-2c6d-4f91-ae26-16ce53c580c5-config\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:14 crc kubenswrapper[4962]: I0220 10:13:14.076342 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cn4cm\" (UniqueName: \"kubernetes.io/projected/736ba007-2c6d-4f91-ae26-16ce53c580c5-kube-api-access-cn4cm\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:14 crc kubenswrapper[4962]: I0220 10:13:14.076356 4962 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/736ba007-2c6d-4f91-ae26-16ce53c580c5-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:14 crc kubenswrapper[4962]: I0220 10:13:14.078167 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/afbf9dd3-3bb5-4908-aad0-d06f09946e17-operator-scripts\") pod \"glance-7588-account-create-update-6ttfz\" (UID: \"afbf9dd3-3bb5-4908-aad0-d06f09946e17\") " pod="openstack/glance-7588-account-create-update-6ttfz" Feb 20 10:13:14 crc kubenswrapper[4962]: I0220 10:13:14.099965 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7c2xf\" (UniqueName: 
\"kubernetes.io/projected/afbf9dd3-3bb5-4908-aad0-d06f09946e17-kube-api-access-7c2xf\") pod \"glance-7588-account-create-update-6ttfz\" (UID: \"afbf9dd3-3bb5-4908-aad0-d06f09946e17\") " pod="openstack/glance-7588-account-create-update-6ttfz" Feb 20 10:13:14 crc kubenswrapper[4962]: I0220 10:13:14.104918 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-7588-account-create-update-6ttfz" Feb 20 10:13:14 crc kubenswrapper[4962]: I0220 10:13:14.324572 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-689df5d84f-qp96t" event={"ID":"ced7b045-00ec-453d-9a56-b13132991e8c","Type":"ContainerStarted","Data":"025a5380bda7921dcbb477c483405adc20d83c027f38ca77224085c6cba7f4f3"} Feb 20 10:13:14 crc kubenswrapper[4962]: I0220 10:13:14.326516 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-689df5d84f-qp96t" Feb 20 10:13:14 crc kubenswrapper[4962]: I0220 10:13:14.339809 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57bdd75c-mqxqd" Feb 20 10:13:14 crc kubenswrapper[4962]: I0220 10:13:14.340437 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57bdd75c-mqxqd" event={"ID":"736ba007-2c6d-4f91-ae26-16ce53c580c5","Type":"ContainerDied","Data":"bdc0784e8ac6a8e38cc361b433d0c6167f165dee537a2968d10b45106c2fa62c"} Feb 20 10:13:14 crc kubenswrapper[4962]: I0220 10:13:14.340491 4962 scope.go:117] "RemoveContainer" containerID="060fcc52874b30c1462aa08598659a751fb930b9fa898ee5d55f34214c6442bd" Feb 20 10:13:14 crc kubenswrapper[4962]: I0220 10:13:14.374914 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-689df5d84f-qp96t" podStartSLOduration=4.374880611 podStartE2EDuration="4.374880611s" podCreationTimestamp="2026-02-20 10:13:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:13:14.34629082 +0000 UTC m=+1085.928762666" watchObservedRunningTime="2026-02-20 10:13:14.374880611 +0000 UTC m=+1085.957352457" Feb 20 10:13:14 crc kubenswrapper[4962]: I0220 10:13:14.397476 4962 scope.go:117] "RemoveContainer" containerID="8b1c406d73d48e9a02cd34de0f3b729ea2c65e81565abd3265c085cac257a091" Feb 20 10:13:14 crc kubenswrapper[4962]: I0220 10:13:14.513517 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-2m8r7"] Feb 20 10:13:14 crc kubenswrapper[4962]: W0220 10:13:14.527275 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0275d40a_1206_4eb2_96c8_6c516c57bed7.slice/crio-49f55fa13be93f1cafebedf76c4bc6f53029368ccd1c19617c2de8864d102765 WatchSource:0}: Error finding container 49f55fa13be93f1cafebedf76c4bc6f53029368ccd1c19617c2de8864d102765: Status 404 returned error can't find the container with id 49f55fa13be93f1cafebedf76c4bc6f53029368ccd1c19617c2de8864d102765 Feb 20 10:13:14 crc kubenswrapper[4962]: I0220 10:13:14.589861 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/736ba007-2c6d-4f91-ae26-16ce53c580c5-ovsdbserver-nb\") pod \"736ba007-2c6d-4f91-ae26-16ce53c580c5\" (UID: \"736ba007-2c6d-4f91-ae26-16ce53c580c5\") " Feb 20 10:13:14 crc kubenswrapper[4962]: I0220 10:13:14.591038 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/glance-7588-account-create-update-6ttfz"] Feb 20 10:13:14 crc kubenswrapper[4962]: I0220 10:13:14.591266 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/736ba007-2c6d-4f91-ae26-16ce53c580c5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "736ba007-2c6d-4f91-ae26-16ce53c580c5" (UID: "736ba007-2c6d-4f91-ae26-16ce53c580c5"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:13:14 crc kubenswrapper[4962]: I0220 10:13:14.703087 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/736ba007-2c6d-4f91-ae26-16ce53c580c5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:14 crc kubenswrapper[4962]: I0220 10:13:14.712313 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57bdd75c-mqxqd"] Feb 20 10:13:14 crc kubenswrapper[4962]: I0220 10:13:14.722971 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57bdd75c-mqxqd"] Feb 20 10:13:14 crc kubenswrapper[4962]: I0220 10:13:14.837503 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-48c8f" Feb 20 10:13:14 crc kubenswrapper[4962]: I0220 10:13:14.918571 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cb73a133-7ca1-492e-ac32-fb33d6c335ba-operator-scripts\") pod \"cb73a133-7ca1-492e-ac32-fb33d6c335ba\" (UID: \"cb73a133-7ca1-492e-ac32-fb33d6c335ba\") " Feb 20 10:13:14 crc kubenswrapper[4962]: I0220 10:13:14.918934 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6p7fr\" (UniqueName: \"kubernetes.io/projected/cb73a133-7ca1-492e-ac32-fb33d6c335ba-kube-api-access-6p7fr\") pod \"cb73a133-7ca1-492e-ac32-fb33d6c335ba\" (UID: \"cb73a133-7ca1-492e-ac32-fb33d6c335ba\") " Feb 20 10:13:14 crc kubenswrapper[4962]: I0220 10:13:14.920940 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb73a133-7ca1-492e-ac32-fb33d6c335ba-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cb73a133-7ca1-492e-ac32-fb33d6c335ba" (UID: "cb73a133-7ca1-492e-ac32-fb33d6c335ba"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:13:14 crc kubenswrapper[4962]: I0220 10:13:14.933570 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb73a133-7ca1-492e-ac32-fb33d6c335ba-kube-api-access-6p7fr" (OuterVolumeSpecName: "kube-api-access-6p7fr") pod "cb73a133-7ca1-492e-ac32-fb33d6c335ba" (UID: "cb73a133-7ca1-492e-ac32-fb33d6c335ba"). InnerVolumeSpecName "kube-api-access-6p7fr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:13:15 crc kubenswrapper[4962]: I0220 10:13:15.021108 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6p7fr\" (UniqueName: \"kubernetes.io/projected/cb73a133-7ca1-492e-ac32-fb33d6c335ba-kube-api-access-6p7fr\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:15 crc kubenswrapper[4962]: I0220 10:13:15.021144 4962 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cb73a133-7ca1-492e-ac32-fb33d6c335ba-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:15 crc kubenswrapper[4962]: I0220 10:13:15.153090 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="736ba007-2c6d-4f91-ae26-16ce53c580c5" path="/var/lib/kubelet/pods/736ba007-2c6d-4f91-ae26-16ce53c580c5/volumes" Feb 20 10:13:15 crc kubenswrapper[4962]: I0220 10:13:15.401827 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-7588-account-create-update-6ttfz" event={"ID":"afbf9dd3-3bb5-4908-aad0-d06f09946e17","Type":"ContainerStarted","Data":"801e7d436fb1225d117310066525072e6dde83be410ac37d92ea2f315f19006b"} Feb 20 10:13:15 crc kubenswrapper[4962]: I0220 10:13:15.404180 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-48c8f" event={"ID":"cb73a133-7ca1-492e-ac32-fb33d6c335ba","Type":"ContainerDied","Data":"f7f2e02b137d5913310205dbb052550410f12f1f612581277520789b3dc50ed0"} Feb 20 10:13:15 crc kubenswrapper[4962]: I0220 10:13:15.404379 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f7f2e02b137d5913310205dbb052550410f12f1f612581277520789b3dc50ed0" Feb 20 10:13:15 crc kubenswrapper[4962]: I0220 10:13:15.404425 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-48c8f" Feb 20 10:13:15 crc kubenswrapper[4962]: I0220 10:13:15.406331 4962 generic.go:334] "Generic (PLEG): container finished" podID="0275d40a-1206-4eb2-96c8-6c516c57bed7" containerID="1933a4410cc57079acebbf3cca845c0c1a3c75df94daefc5b4a3cc61d913faab" exitCode=0 Feb 20 10:13:15 crc kubenswrapper[4962]: I0220 10:13:15.406513 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-2m8r7" event={"ID":"0275d40a-1206-4eb2-96c8-6c516c57bed7","Type":"ContainerDied","Data":"1933a4410cc57079acebbf3cca845c0c1a3c75df94daefc5b4a3cc61d913faab"} Feb 20 10:13:15 crc kubenswrapper[4962]: I0220 10:13:15.407497 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-2m8r7" event={"ID":"0275d40a-1206-4eb2-96c8-6c516c57bed7","Type":"ContainerStarted","Data":"49f55fa13be93f1cafebedf76c4bc6f53029368ccd1c19617c2de8864d102765"} Feb 20 10:13:15 crc kubenswrapper[4962]: I0220 10:13:15.635033 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f4fb3b99-0e02-4c5c-9704-884ea3f0605d-etc-swift\") pod \"swift-storage-0\" (UID: \"f4fb3b99-0e02-4c5c-9704-884ea3f0605d\") " pod="openstack/swift-storage-0" Feb 20 10:13:15 crc kubenswrapper[4962]: E0220 10:13:15.635436 4962 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 20 10:13:15 crc kubenswrapper[4962]: E0220 10:13:15.635462 4962 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 20 10:13:15 crc kubenswrapper[4962]: E0220 10:13:15.635532 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f4fb3b99-0e02-4c5c-9704-884ea3f0605d-etc-swift podName:f4fb3b99-0e02-4c5c-9704-884ea3f0605d nodeName:}" failed. No retries permitted until 2026-02-20 10:13:19.635512688 +0000 UTC m=+1091.217984534 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/f4fb3b99-0e02-4c5c-9704-884ea3f0605d-etc-swift") pod "swift-storage-0" (UID: "f4fb3b99-0e02-4c5c-9704-884ea3f0605d") : configmap "swift-ring-files" not found Feb 20 10:13:17 crc kubenswrapper[4962]: I0220 10:13:17.447501 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-125a-account-create-update-bd2q8" event={"ID":"a3d903f3-8f86-49e2-848b-4a59a9068b75","Type":"ContainerDied","Data":"c7f927a068005cee1578133484cff3821da23535f70c03eaefda6638fc1c3c4b"} Feb 20 10:13:17 crc kubenswrapper[4962]: I0220 10:13:17.450493 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c7f927a068005cee1578133484cff3821da23535f70c03eaefda6638fc1c3c4b" Feb 20 10:13:17 crc kubenswrapper[4962]: I0220 10:13:17.453069 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-2m8r7" event={"ID":"0275d40a-1206-4eb2-96c8-6c516c57bed7","Type":"ContainerDied","Data":"49f55fa13be93f1cafebedf76c4bc6f53029368ccd1c19617c2de8864d102765"} Feb 20 10:13:17 crc kubenswrapper[4962]: I0220 10:13:17.453119 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="49f55fa13be93f1cafebedf76c4bc6f53029368ccd1c19617c2de8864d102765" Feb 20 10:13:17 crc kubenswrapper[4962]: I0220 10:13:17.455230 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-svsfg" event={"ID":"9c97128d-8360-482e-b05b-6025d046c122","Type":"ContainerDied","Data":"5ae2f4caa17d365e36d41ef1e1e74a3badc6f13ea2ed79b58f7b208327c69c36"} Feb 20 10:13:17 crc kubenswrapper[4962]: I0220 10:13:17.455280 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ae2f4caa17d365e36d41ef1e1e74a3badc6f13ea2ed79b58f7b208327c69c36" Feb 20 10:13:17 crc kubenswrapper[4962]: I0220 10:13:17.458076 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d8b3-account-create-update-br2xj" event={"ID":"7c7420bd-d4ef-4511-acf4-a132ad0a5677","Type":"ContainerDied","Data":"b2785a54fb58f00cffa05ba7e64b052a132025e8e4e4c971af47565aa7808a85"} Feb 20 10:13:17 crc kubenswrapper[4962]: I0220 10:13:17.458115 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b2785a54fb58f00cffa05ba7e64b052a132025e8e4e4c971af47565aa7808a85" Feb 20 10:13:17 crc kubenswrapper[4962]: I0220 10:13:17.464486 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-zfmzb" event={"ID":"2b915fcc-cf15-43c3-97c6-bde3a29da796","Type":"ContainerDied","Data":"b6c450bcf92382b603afb108ffce51277656e7c13205fe6efdfb5f56cb4f3fad"} Feb 20 10:13:17 crc kubenswrapper[4962]: I0220 10:13:17.464525 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b6c450bcf92382b603afb108ffce51277656e7c13205fe6efdfb5f56cb4f3fad" Feb 20 10:13:17 crc kubenswrapper[4962]: I0220 10:13:17.507087 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-125a-account-create-update-bd2q8" Feb 20 10:13:17 crc kubenswrapper[4962]: I0220 10:13:17.554709 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-svsfg" Feb 20 10:13:17 crc kubenswrapper[4962]: I0220 10:13:17.557536 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-d8b3-account-create-update-br2xj" Feb 20 10:13:17 crc kubenswrapper[4962]: I0220 10:13:17.593862 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-zfmzb" Feb 20 10:13:17 crc kubenswrapper[4962]: I0220 10:13:17.603200 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-2m8r7" Feb 20 10:13:17 crc kubenswrapper[4962]: I0220 10:13:17.605600 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c76wh\" (UniqueName: \"kubernetes.io/projected/9c97128d-8360-482e-b05b-6025d046c122-kube-api-access-c76wh\") pod \"9c97128d-8360-482e-b05b-6025d046c122\" (UID: \"9c97128d-8360-482e-b05b-6025d046c122\") " Feb 20 10:13:17 crc kubenswrapper[4962]: I0220 10:13:17.605675 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a3d903f3-8f86-49e2-848b-4a59a9068b75-operator-scripts\") pod \"a3d903f3-8f86-49e2-848b-4a59a9068b75\" (UID: \"a3d903f3-8f86-49e2-848b-4a59a9068b75\") " Feb 20 10:13:17 crc kubenswrapper[4962]: I0220 10:13:17.605780 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c7420bd-d4ef-4511-acf4-a132ad0a5677-operator-scripts\") pod \"7c7420bd-d4ef-4511-acf4-a132ad0a5677\" (UID: \"7c7420bd-d4ef-4511-acf4-a132ad0a5677\") " Feb 20 10:13:17 crc kubenswrapper[4962]: I0220 10:13:17.605816 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48t26\" (UniqueName: \"kubernetes.io/projected/7c7420bd-d4ef-4511-acf4-a132ad0a5677-kube-api-access-48t26\") pod \"7c7420bd-d4ef-4511-acf4-a132ad0a5677\" (UID: \"7c7420bd-d4ef-4511-acf4-a132ad0a5677\") " Feb 20 10:13:17 crc kubenswrapper[4962]: I0220 10:13:17.605888 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c97128d-8360-482e-b05b-6025d046c122-operator-scripts\") pod \"9c97128d-8360-482e-b05b-6025d046c122\" (UID: \"9c97128d-8360-482e-b05b-6025d046c122\") " Feb 20 10:13:17 crc kubenswrapper[4962]: I0220 10:13:17.605945 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqhft\" (UniqueName: \"kubernetes.io/projected/a3d903f3-8f86-49e2-848b-4a59a9068b75-kube-api-access-rqhft\") pod \"a3d903f3-8f86-49e2-848b-4a59a9068b75\" (UID: \"a3d903f3-8f86-49e2-848b-4a59a9068b75\") " Feb 20 10:13:17 crc kubenswrapper[4962]: I0220 10:13:17.608610 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c7420bd-d4ef-4511-acf4-a132ad0a5677-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7c7420bd-d4ef-4511-acf4-a132ad0a5677" (UID: "7c7420bd-d4ef-4511-acf4-a132ad0a5677"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:13:17 crc kubenswrapper[4962]: I0220 10:13:17.609533 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c97128d-8360-482e-b05b-6025d046c122-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9c97128d-8360-482e-b05b-6025d046c122" (UID: "9c97128d-8360-482e-b05b-6025d046c122"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:13:17 crc kubenswrapper[4962]: I0220 10:13:17.609847 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3d903f3-8f86-49e2-848b-4a59a9068b75-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a3d903f3-8f86-49e2-848b-4a59a9068b75" (UID: "a3d903f3-8f86-49e2-848b-4a59a9068b75"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:13:17 crc kubenswrapper[4962]: I0220 10:13:17.616340 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c97128d-8360-482e-b05b-6025d046c122-kube-api-access-c76wh" (OuterVolumeSpecName: "kube-api-access-c76wh") pod "9c97128d-8360-482e-b05b-6025d046c122" (UID: "9c97128d-8360-482e-b05b-6025d046c122"). InnerVolumeSpecName "kube-api-access-c76wh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:13:17 crc kubenswrapper[4962]: I0220 10:13:17.622091 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c7420bd-d4ef-4511-acf4-a132ad0a5677-kube-api-access-48t26" (OuterVolumeSpecName: "kube-api-access-48t26") pod "7c7420bd-d4ef-4511-acf4-a132ad0a5677" (UID: "7c7420bd-d4ef-4511-acf4-a132ad0a5677"). InnerVolumeSpecName "kube-api-access-48t26". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:13:17 crc kubenswrapper[4962]: I0220 10:13:17.622145 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3d903f3-8f86-49e2-848b-4a59a9068b75-kube-api-access-rqhft" (OuterVolumeSpecName: "kube-api-access-rqhft") pod "a3d903f3-8f86-49e2-848b-4a59a9068b75" (UID: "a3d903f3-8f86-49e2-848b-4a59a9068b75"). InnerVolumeSpecName "kube-api-access-rqhft". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:13:17 crc kubenswrapper[4962]: I0220 10:13:17.707859 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t9559\" (UniqueName: \"kubernetes.io/projected/0275d40a-1206-4eb2-96c8-6c516c57bed7-kube-api-access-t9559\") pod \"0275d40a-1206-4eb2-96c8-6c516c57bed7\" (UID: \"0275d40a-1206-4eb2-96c8-6c516c57bed7\") " Feb 20 10:13:17 crc kubenswrapper[4962]: I0220 10:13:17.708051 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0275d40a-1206-4eb2-96c8-6c516c57bed7-operator-scripts\") pod \"0275d40a-1206-4eb2-96c8-6c516c57bed7\" (UID: \"0275d40a-1206-4eb2-96c8-6c516c57bed7\") " Feb 20 10:13:17 crc kubenswrapper[4962]: I0220 10:13:17.708094 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-szvdf\" (UniqueName: \"kubernetes.io/projected/2b915fcc-cf15-43c3-97c6-bde3a29da796-kube-api-access-szvdf\") pod \"2b915fcc-cf15-43c3-97c6-bde3a29da796\" (UID: \"2b915fcc-cf15-43c3-97c6-bde3a29da796\") " Feb 20 10:13:17 crc kubenswrapper[4962]: I0220 10:13:17.708241 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b915fcc-cf15-43c3-97c6-bde3a29da796-operator-scripts\") pod \"2b915fcc-cf15-43c3-97c6-bde3a29da796\" (UID: \"2b915fcc-cf15-43c3-97c6-bde3a29da796\") " Feb 20 10:13:17 crc kubenswrapper[4962]: I0220 10:13:17.708862 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rqhft\" (UniqueName: \"kubernetes.io/projected/a3d903f3-8f86-49e2-848b-4a59a9068b75-kube-api-access-rqhft\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:17 crc kubenswrapper[4962]: I0220 10:13:17.708881 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c76wh\" (UniqueName: \"kubernetes.io/projected/9c97128d-8360-482e-b05b-6025d046c122-kube-api-access-c76wh\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:17 crc kubenswrapper[4962]: I0220 10:13:17.708895 4962 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a3d903f3-8f86-49e2-848b-4a59a9068b75-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:17 crc kubenswrapper[4962]: I0220 10:13:17.708908 4962 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c7420bd-d4ef-4511-acf4-a132ad0a5677-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:17 crc kubenswrapper[4962]: I0220 10:13:17.708921 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48t26\" (UniqueName: \"kubernetes.io/projected/7c7420bd-d4ef-4511-acf4-a132ad0a5677-kube-api-access-48t26\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:17 crc kubenswrapper[4962]: I0220 10:13:17.708931 4962 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c97128d-8360-482e-b05b-6025d046c122-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:17 crc kubenswrapper[4962]: I0220 10:13:17.709413 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b915fcc-cf15-43c3-97c6-bde3a29da796-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2b915fcc-cf15-43c3-97c6-bde3a29da796" (UID: "2b915fcc-cf15-43c3-97c6-bde3a29da796"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:13:17 crc kubenswrapper[4962]: I0220 10:13:17.709567 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0275d40a-1206-4eb2-96c8-6c516c57bed7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0275d40a-1206-4eb2-96c8-6c516c57bed7" (UID: "0275d40a-1206-4eb2-96c8-6c516c57bed7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:13:17 crc kubenswrapper[4962]: I0220 10:13:17.711808 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b915fcc-cf15-43c3-97c6-bde3a29da796-kube-api-access-szvdf" (OuterVolumeSpecName: "kube-api-access-szvdf") pod "2b915fcc-cf15-43c3-97c6-bde3a29da796" (UID: "2b915fcc-cf15-43c3-97c6-bde3a29da796"). InnerVolumeSpecName "kube-api-access-szvdf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:13:17 crc kubenswrapper[4962]: I0220 10:13:17.714745 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0275d40a-1206-4eb2-96c8-6c516c57bed7-kube-api-access-t9559" (OuterVolumeSpecName: "kube-api-access-t9559") pod "0275d40a-1206-4eb2-96c8-6c516c57bed7" (UID: "0275d40a-1206-4eb2-96c8-6c516c57bed7"). InnerVolumeSpecName "kube-api-access-t9559". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:13:17 crc kubenswrapper[4962]: I0220 10:13:17.810380 4962 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b915fcc-cf15-43c3-97c6-bde3a29da796-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:17 crc kubenswrapper[4962]: I0220 10:13:17.810423 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t9559\" (UniqueName: \"kubernetes.io/projected/0275d40a-1206-4eb2-96c8-6c516c57bed7-kube-api-access-t9559\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:17 crc kubenswrapper[4962]: I0220 10:13:17.810442 4962 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0275d40a-1206-4eb2-96c8-6c516c57bed7-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:17 crc kubenswrapper[4962]: I0220 10:13:17.810455 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-szvdf\" (UniqueName: \"kubernetes.io/projected/2b915fcc-cf15-43c3-97c6-bde3a29da796-kube-api-access-szvdf\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:18 crc kubenswrapper[4962]: I0220 10:13:18.476051 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-9mznb" event={"ID":"2e7338a7-4012-439d-b961-6ca0c55dd6e6","Type":"ContainerStarted","Data":"1255947ebb2d1ff7325c767c453081290e37fc7eec685e64c813cb21e269d2c8"} Feb 20 10:13:18 crc kubenswrapper[4962]: I0220 10:13:18.478888 4962 generic.go:334] "Generic (PLEG): container finished" podID="afbf9dd3-3bb5-4908-aad0-d06f09946e17" containerID="0be6bfc0db94e6c57e1c0a4856d3600b1ea4d12d42a32685b52156cacc1224a0" exitCode=0 Feb 20 10:13:18 crc kubenswrapper[4962]: I0220 10:13:18.479046 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-2m8r7" Feb 20 10:13:18 crc kubenswrapper[4962]: I0220 10:13:18.479061 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-125a-account-create-update-bd2q8" Feb 20 10:13:18 crc kubenswrapper[4962]: I0220 10:13:18.479108 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-7588-account-create-update-6ttfz" event={"ID":"afbf9dd3-3bb5-4908-aad0-d06f09946e17","Type":"ContainerDied","Data":"0be6bfc0db94e6c57e1c0a4856d3600b1ea4d12d42a32685b52156cacc1224a0"} Feb 20 10:13:18 crc kubenswrapper[4962]: I0220 10:13:18.479316 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-svsfg" Feb 20 10:13:18 crc kubenswrapper[4962]: I0220 10:13:18.479446 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-zfmzb" Feb 20 10:13:18 crc kubenswrapper[4962]: I0220 10:13:18.479561 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-d8b3-account-create-update-br2xj" Feb 20 10:13:18 crc kubenswrapper[4962]: I0220 10:13:18.505040 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-9mznb" podStartSLOduration=3.172032485 podStartE2EDuration="7.505018547s" podCreationTimestamp="2026-02-20 10:13:11 +0000 UTC" firstStartedPulling="2026-02-20 10:13:12.903593124 +0000 UTC m=+1084.486064970" lastFinishedPulling="2026-02-20 10:13:17.236579156 +0000 UTC m=+1088.819051032" observedRunningTime="2026-02-20 10:13:18.499412302 +0000 UTC m=+1090.081884158" watchObservedRunningTime="2026-02-20 10:13:18.505018547 +0000 UTC m=+1090.087490393" Feb 20 10:13:19 crc kubenswrapper[4962]: I0220 10:13:19.659821 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f4fb3b99-0e02-4c5c-9704-884ea3f0605d-etc-swift\") pod \"swift-storage-0\" (UID: \"f4fb3b99-0e02-4c5c-9704-884ea3f0605d\") " pod="openstack/swift-storage-0" Feb 20 10:13:19 crc kubenswrapper[4962]: E0220 10:13:19.660160 4962 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 20 10:13:19 crc kubenswrapper[4962]: E0220 10:13:19.660792 4962 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 20 10:13:19 crc kubenswrapper[4962]: E0220 10:13:19.660890 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f4fb3b99-0e02-4c5c-9704-884ea3f0605d-etc-swift podName:f4fb3b99-0e02-4c5c-9704-884ea3f0605d nodeName:}" failed. No retries permitted until 2026-02-20 10:13:27.66085765 +0000 UTC m=+1099.243329496 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/f4fb3b99-0e02-4c5c-9704-884ea3f0605d-etc-swift") pod "swift-storage-0" (UID: "f4fb3b99-0e02-4c5c-9704-884ea3f0605d") : configmap "swift-ring-files" not found Feb 20 10:13:19 crc kubenswrapper[4962]: I0220 10:13:19.878024 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-7588-account-create-update-6ttfz" Feb 20 10:13:19 crc kubenswrapper[4962]: I0220 10:13:19.966586 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/afbf9dd3-3bb5-4908-aad0-d06f09946e17-operator-scripts\") pod \"afbf9dd3-3bb5-4908-aad0-d06f09946e17\" (UID: \"afbf9dd3-3bb5-4908-aad0-d06f09946e17\") " Feb 20 10:13:19 crc kubenswrapper[4962]: I0220 10:13:19.966655 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c2xf\" (UniqueName: \"kubernetes.io/projected/afbf9dd3-3bb5-4908-aad0-d06f09946e17-kube-api-access-7c2xf\") pod \"afbf9dd3-3bb5-4908-aad0-d06f09946e17\" (UID: \"afbf9dd3-3bb5-4908-aad0-d06f09946e17\") " Feb 20 10:13:19 crc kubenswrapper[4962]: I0220 10:13:19.967983 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/afbf9dd3-3bb5-4908-aad0-d06f09946e17-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "afbf9dd3-3bb5-4908-aad0-d06f09946e17" (UID: "afbf9dd3-3bb5-4908-aad0-d06f09946e17"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:13:19 crc kubenswrapper[4962]: I0220 10:13:19.976302 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afbf9dd3-3bb5-4908-aad0-d06f09946e17-kube-api-access-7c2xf" (OuterVolumeSpecName: "kube-api-access-7c2xf") pod "afbf9dd3-3bb5-4908-aad0-d06f09946e17" (UID: "afbf9dd3-3bb5-4908-aad0-d06f09946e17"). InnerVolumeSpecName "kube-api-access-7c2xf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:13:20 crc kubenswrapper[4962]: I0220 10:13:20.069264 4962 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/afbf9dd3-3bb5-4908-aad0-d06f09946e17-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:20 crc kubenswrapper[4962]: I0220 10:13:20.069310 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c2xf\" (UniqueName: \"kubernetes.io/projected/afbf9dd3-3bb5-4908-aad0-d06f09946e17-kube-api-access-7c2xf\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:20 crc kubenswrapper[4962]: I0220 10:13:20.414731 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-48c8f"] Feb 20 10:13:20 crc kubenswrapper[4962]: I0220 10:13:20.423691 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-48c8f"] Feb 20 10:13:20 crc kubenswrapper[4962]: I0220 10:13:20.500328 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-7588-account-create-update-6ttfz" event={"ID":"afbf9dd3-3bb5-4908-aad0-d06f09946e17","Type":"ContainerDied","Data":"801e7d436fb1225d117310066525072e6dde83be410ac37d92ea2f315f19006b"} Feb 20 10:13:20 crc kubenswrapper[4962]: I0220 10:13:20.500386 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="801e7d436fb1225d117310066525072e6dde83be410ac37d92ea2f315f19006b" Feb 20 10:13:20 crc kubenswrapper[4962]: I0220 10:13:20.500396 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-7588-account-create-update-6ttfz" Feb 20 10:13:20 crc kubenswrapper[4962]: I0220 10:13:20.903035 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-689df5d84f-qp96t" Feb 20 10:13:21 crc kubenswrapper[4962]: I0220 10:13:21.002949 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75b7bcc64f-ms8hz"] Feb 20 10:13:21 crc kubenswrapper[4962]: I0220 10:13:21.003266 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-75b7bcc64f-ms8hz" podUID="764e4dd0-33ca-4ee6-88f3-b981dd49a5b5" containerName="dnsmasq-dns" containerID="cri-o://5fa35181b36d549c92399e4d9df6d6ad79f748ec903177e753c1eff619aaec80" gracePeriod=10 Feb 20 10:13:21 crc kubenswrapper[4962]: I0220 10:13:21.154684 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb73a133-7ca1-492e-ac32-fb33d6c335ba" path="/var/lib/kubelet/pods/cb73a133-7ca1-492e-ac32-fb33d6c335ba/volumes" Feb 20 10:13:21 crc kubenswrapper[4962]: I0220 10:13:21.505219 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75b7bcc64f-ms8hz" Feb 20 10:13:21 crc kubenswrapper[4962]: I0220 10:13:21.511209 4962 generic.go:334] "Generic (PLEG): container finished" podID="764e4dd0-33ca-4ee6-88f3-b981dd49a5b5" containerID="5fa35181b36d549c92399e4d9df6d6ad79f748ec903177e753c1eff619aaec80" exitCode=0 Feb 20 10:13:21 crc kubenswrapper[4962]: I0220 10:13:21.511259 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75b7bcc64f-ms8hz" event={"ID":"764e4dd0-33ca-4ee6-88f3-b981dd49a5b5","Type":"ContainerDied","Data":"5fa35181b36d549c92399e4d9df6d6ad79f748ec903177e753c1eff619aaec80"} Feb 20 10:13:21 crc kubenswrapper[4962]: I0220 10:13:21.511290 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75b7bcc64f-ms8hz" event={"ID":"764e4dd0-33ca-4ee6-88f3-b981dd49a5b5","Type":"ContainerDied","Data":"5a4113e8006a84520a74694b80780b48a9159ec2ba04b9aa6174205d45e900e7"} Feb 20 10:13:21 crc kubenswrapper[4962]: I0220 10:13:21.511310 4962 scope.go:117] "RemoveContainer" containerID="5fa35181b36d549c92399e4d9df6d6ad79f748ec903177e753c1eff619aaec80" Feb 20 10:13:21 crc kubenswrapper[4962]: I0220 10:13:21.511471 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75b7bcc64f-ms8hz" Feb 20 10:13:21 crc kubenswrapper[4962]: I0220 10:13:21.546780 4962 scope.go:117] "RemoveContainer" containerID="d8910200dbf0649ee3137635f975af6912b17c314aea8ebff32d3a3379bf2ceb" Feb 20 10:13:21 crc kubenswrapper[4962]: I0220 10:13:21.566413 4962 scope.go:117] "RemoveContainer" containerID="5fa35181b36d549c92399e4d9df6d6ad79f748ec903177e753c1eff619aaec80" Feb 20 10:13:21 crc kubenswrapper[4962]: E0220 10:13:21.567024 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5fa35181b36d549c92399e4d9df6d6ad79f748ec903177e753c1eff619aaec80\": container with ID starting with 5fa35181b36d549c92399e4d9df6d6ad79f748ec903177e753c1eff619aaec80 not found: ID does not exist" containerID="5fa35181b36d549c92399e4d9df6d6ad79f748ec903177e753c1eff619aaec80" Feb 20 10:13:21 crc kubenswrapper[4962]: I0220 10:13:21.567083 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fa35181b36d549c92399e4d9df6d6ad79f748ec903177e753c1eff619aaec80"} err="failed to get container status \"5fa35181b36d549c92399e4d9df6d6ad79f748ec903177e753c1eff619aaec80\": rpc error: code = NotFound desc = could not find container \"5fa35181b36d549c92399e4d9df6d6ad79f748ec903177e753c1eff619aaec80\": container with ID starting with 5fa35181b36d549c92399e4d9df6d6ad79f748ec903177e753c1eff619aaec80 not found: ID does not exist" Feb 20 10:13:21 crc kubenswrapper[4962]: I0220 10:13:21.567115 4962 scope.go:117] "RemoveContainer" containerID="d8910200dbf0649ee3137635f975af6912b17c314aea8ebff32d3a3379bf2ceb" Feb 20 10:13:21 crc kubenswrapper[4962]: E0220 10:13:21.567698 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8910200dbf0649ee3137635f975af6912b17c314aea8ebff32d3a3379bf2ceb\": container with ID starting with d8910200dbf0649ee3137635f975af6912b17c314aea8ebff32d3a3379bf2ceb not found: ID does not exist" containerID="d8910200dbf0649ee3137635f975af6912b17c314aea8ebff32d3a3379bf2ceb" Feb 20 10:13:21 crc kubenswrapper[4962]: I0220 10:13:21.567751 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8910200dbf0649ee3137635f975af6912b17c314aea8ebff32d3a3379bf2ceb"} err="failed to get container status \"d8910200dbf0649ee3137635f975af6912b17c314aea8ebff32d3a3379bf2ceb\": rpc error: code = NotFound desc = could not find container \"d8910200dbf0649ee3137635f975af6912b17c314aea8ebff32d3a3379bf2ceb\": container with ID starting with d8910200dbf0649ee3137635f975af6912b17c314aea8ebff32d3a3379bf2ceb not found: ID does not exist" Feb 20 10:13:21 crc kubenswrapper[4962]: I0220 10:13:21.600748 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fxq9l\" (UniqueName: \"kubernetes.io/projected/764e4dd0-33ca-4ee6-88f3-b981dd49a5b5-kube-api-access-fxq9l\") pod \"764e4dd0-33ca-4ee6-88f3-b981dd49a5b5\" (UID: \"764e4dd0-33ca-4ee6-88f3-b981dd49a5b5\") " Feb 20 10:13:21 crc kubenswrapper[4962]: I0220 10:13:21.600850 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/764e4dd0-33ca-4ee6-88f3-b981dd49a5b5-dns-svc\") pod \"764e4dd0-33ca-4ee6-88f3-b981dd49a5b5\" (UID: \"764e4dd0-33ca-4ee6-88f3-b981dd49a5b5\") " Feb 20 10:13:21 crc kubenswrapper[4962]: I0220 10:13:21.600896 4962 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/764e4dd0-33ca-4ee6-88f3-b981dd49a5b5-config\") pod \"764e4dd0-33ca-4ee6-88f3-b981dd49a5b5\" (UID: \"764e4dd0-33ca-4ee6-88f3-b981dd49a5b5\") " Feb 20 10:13:21 crc kubenswrapper[4962]: I0220 10:13:21.600927 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/764e4dd0-33ca-4ee6-88f3-b981dd49a5b5-ovsdbserver-nb\") pod \"764e4dd0-33ca-4ee6-88f3-b981dd49a5b5\" (UID: \"764e4dd0-33ca-4ee6-88f3-b981dd49a5b5\") " Feb 20 10:13:21 crc kubenswrapper[4962]: I0220 10:13:21.600953 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/764e4dd0-33ca-4ee6-88f3-b981dd49a5b5-ovsdbserver-sb\") pod \"764e4dd0-33ca-4ee6-88f3-b981dd49a5b5\" (UID: \"764e4dd0-33ca-4ee6-88f3-b981dd49a5b5\") " Feb 20 10:13:21 crc kubenswrapper[4962]: I0220 10:13:21.607172 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/764e4dd0-33ca-4ee6-88f3-b981dd49a5b5-kube-api-access-fxq9l" (OuterVolumeSpecName: "kube-api-access-fxq9l") pod "764e4dd0-33ca-4ee6-88f3-b981dd49a5b5" (UID: "764e4dd0-33ca-4ee6-88f3-b981dd49a5b5"). InnerVolumeSpecName "kube-api-access-fxq9l". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:13:21 crc kubenswrapper[4962]: I0220 10:13:21.643125 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/764e4dd0-33ca-4ee6-88f3-b981dd49a5b5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "764e4dd0-33ca-4ee6-88f3-b981dd49a5b5" (UID: "764e4dd0-33ca-4ee6-88f3-b981dd49a5b5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:13:21 crc kubenswrapper[4962]: I0220 10:13:21.653845 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/764e4dd0-33ca-4ee6-88f3-b981dd49a5b5-config" (OuterVolumeSpecName: "config") pod "764e4dd0-33ca-4ee6-88f3-b981dd49a5b5" (UID: "764e4dd0-33ca-4ee6-88f3-b981dd49a5b5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:13:21 crc kubenswrapper[4962]: I0220 10:13:21.657075 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/764e4dd0-33ca-4ee6-88f3-b981dd49a5b5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "764e4dd0-33ca-4ee6-88f3-b981dd49a5b5" (UID: "764e4dd0-33ca-4ee6-88f3-b981dd49a5b5"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:13:21 crc kubenswrapper[4962]: I0220 10:13:21.661526 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/764e4dd0-33ca-4ee6-88f3-b981dd49a5b5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "764e4dd0-33ca-4ee6-88f3-b981dd49a5b5" (UID: "764e4dd0-33ca-4ee6-88f3-b981dd49a5b5"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:13:21 crc kubenswrapper[4962]: I0220 10:13:21.703568 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fxq9l\" (UniqueName: \"kubernetes.io/projected/764e4dd0-33ca-4ee6-88f3-b981dd49a5b5-kube-api-access-fxq9l\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:21 crc kubenswrapper[4962]: I0220 10:13:21.703638 4962 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/764e4dd0-33ca-4ee6-88f3-b981dd49a5b5-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:21 crc kubenswrapper[4962]: I0220 10:13:21.703651 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/764e4dd0-33ca-4ee6-88f3-b981dd49a5b5-config\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:21 crc kubenswrapper[4962]: I0220 10:13:21.703661 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/764e4dd0-33ca-4ee6-88f3-b981dd49a5b5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:21 crc kubenswrapper[4962]: I0220 10:13:21.703671 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/764e4dd0-33ca-4ee6-88f3-b981dd49a5b5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:21 crc kubenswrapper[4962]: I0220 10:13:21.865021 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75b7bcc64f-ms8hz"] Feb 20 10:13:21 crc kubenswrapper[4962]: I0220 10:13:21.877178 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-75b7bcc64f-ms8hz"] Feb 20 10:13:23 crc kubenswrapper[4962]: I0220 10:13:23.157101 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="764e4dd0-33ca-4ee6-88f3-b981dd49a5b5" path="/var/lib/kubelet/pods/764e4dd0-33ca-4ee6-88f3-b981dd49a5b5/volumes" Feb 20 10:13:23 crc kubenswrapper[4962]: I0220 10:13:23.952694 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-9gcrq"] Feb 20 10:13:23 crc kubenswrapper[4962]: E0220 10:13:23.953405 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb73a133-7ca1-492e-ac32-fb33d6c335ba" containerName="mariadb-account-create-update" Feb 20 10:13:23 crc kubenswrapper[4962]: I0220 10:13:23.953445 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb73a133-7ca1-492e-ac32-fb33d6c335ba" containerName="mariadb-account-create-update" Feb 20 10:13:23 crc kubenswrapper[4962]: E0220 10:13:23.953463 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3d903f3-8f86-49e2-848b-4a59a9068b75" containerName="mariadb-account-create-update" Feb 20 10:13:23 crc kubenswrapper[4962]: I0220 10:13:23.953472 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3d903f3-8f86-49e2-848b-4a59a9068b75" containerName="mariadb-account-create-update" Feb 20 10:13:23 crc kubenswrapper[4962]: E0220 10:13:23.953486 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="764e4dd0-33ca-4ee6-88f3-b981dd49a5b5" containerName="dnsmasq-dns" Feb 20 10:13:23 crc kubenswrapper[4962]: I0220 10:13:23.953495 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="764e4dd0-33ca-4ee6-88f3-b981dd49a5b5" containerName="dnsmasq-dns" Feb 20 10:13:23 crc kubenswrapper[4962]: E0220 10:13:23.953526 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c7420bd-d4ef-4511-acf4-a132ad0a5677" 
containerName="mariadb-account-create-update" Feb 20 10:13:23 crc kubenswrapper[4962]: I0220 10:13:23.953537 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c7420bd-d4ef-4511-acf4-a132ad0a5677" containerName="mariadb-account-create-update" Feb 20 10:13:23 crc kubenswrapper[4962]: E0220 10:13:23.953552 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c97128d-8360-482e-b05b-6025d046c122" containerName="mariadb-database-create" Feb 20 10:13:23 crc kubenswrapper[4962]: I0220 10:13:23.953560 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c97128d-8360-482e-b05b-6025d046c122" containerName="mariadb-database-create" Feb 20 10:13:23 crc kubenswrapper[4962]: E0220 10:13:23.953577 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afbf9dd3-3bb5-4908-aad0-d06f09946e17" containerName="mariadb-account-create-update" Feb 20 10:13:23 crc kubenswrapper[4962]: I0220 10:13:23.953586 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="afbf9dd3-3bb5-4908-aad0-d06f09946e17" containerName="mariadb-account-create-update" Feb 20 10:13:23 crc kubenswrapper[4962]: E0220 10:13:23.953614 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b915fcc-cf15-43c3-97c6-bde3a29da796" containerName="mariadb-database-create" Feb 20 10:13:23 crc kubenswrapper[4962]: I0220 10:13:23.953623 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b915fcc-cf15-43c3-97c6-bde3a29da796" containerName="mariadb-database-create" Feb 20 10:13:23 crc kubenswrapper[4962]: E0220 10:13:23.953641 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="764e4dd0-33ca-4ee6-88f3-b981dd49a5b5" containerName="init" Feb 20 10:13:23 crc kubenswrapper[4962]: I0220 10:13:23.953650 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="764e4dd0-33ca-4ee6-88f3-b981dd49a5b5" containerName="init" Feb 20 10:13:23 crc kubenswrapper[4962]: E0220 10:13:23.953674 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="736ba007-2c6d-4f91-ae26-16ce53c580c5" containerName="init" Feb 20 10:13:23 crc kubenswrapper[4962]: I0220 10:13:23.953683 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="736ba007-2c6d-4f91-ae26-16ce53c580c5" containerName="init" Feb 20 10:13:23 crc kubenswrapper[4962]: E0220 10:13:23.953705 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0275d40a-1206-4eb2-96c8-6c516c57bed7" containerName="mariadb-database-create" Feb 20 10:13:23 crc kubenswrapper[4962]: I0220 10:13:23.953714 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="0275d40a-1206-4eb2-96c8-6c516c57bed7" containerName="mariadb-database-create" Feb 20 10:13:23 crc kubenswrapper[4962]: E0220 10:13:23.953738 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="736ba007-2c6d-4f91-ae26-16ce53c580c5" containerName="dnsmasq-dns" Feb 20 10:13:23 crc kubenswrapper[4962]: I0220 10:13:23.953747 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="736ba007-2c6d-4f91-ae26-16ce53c580c5" containerName="dnsmasq-dns" Feb 20 10:13:23 crc kubenswrapper[4962]: I0220 10:13:23.953970 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="0275d40a-1206-4eb2-96c8-6c516c57bed7" containerName="mariadb-database-create" Feb 20 10:13:23 crc kubenswrapper[4962]: I0220 10:13:23.953989 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c7420bd-d4ef-4511-acf4-a132ad0a5677" containerName="mariadb-account-create-update" Feb 20 10:13:23 crc kubenswrapper[4962]: I0220 10:13:23.954004 4962 
memory_manager.go:354] "RemoveStaleState removing state" podUID="2b915fcc-cf15-43c3-97c6-bde3a29da796" containerName="mariadb-database-create" Feb 20 10:13:23 crc kubenswrapper[4962]: I0220 10:13:23.954015 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="afbf9dd3-3bb5-4908-aad0-d06f09946e17" containerName="mariadb-account-create-update" Feb 20 10:13:23 crc kubenswrapper[4962]: I0220 10:13:23.954046 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3d903f3-8f86-49e2-848b-4a59a9068b75" containerName="mariadb-account-create-update" Feb 20 10:13:23 crc kubenswrapper[4962]: I0220 10:13:23.954067 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="764e4dd0-33ca-4ee6-88f3-b981dd49a5b5" containerName="dnsmasq-dns" Feb 20 10:13:23 crc kubenswrapper[4962]: I0220 10:13:23.954079 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb73a133-7ca1-492e-ac32-fb33d6c335ba" containerName="mariadb-account-create-update" Feb 20 10:13:23 crc kubenswrapper[4962]: I0220 10:13:23.954091 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c97128d-8360-482e-b05b-6025d046c122" containerName="mariadb-database-create" Feb 20 10:13:23 crc kubenswrapper[4962]: I0220 10:13:23.954104 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="736ba007-2c6d-4f91-ae26-16ce53c580c5" containerName="dnsmasq-dns" Feb 20 10:13:23 crc kubenswrapper[4962]: I0220 10:13:23.954891 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-9gcrq" Feb 20 10:13:23 crc kubenswrapper[4962]: I0220 10:13:23.961212 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Feb 20 10:13:23 crc kubenswrapper[4962]: I0220 10:13:23.961231 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Feb 20 10:13:23 crc kubenswrapper[4962]: I0220 10:13:23.963504 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-lzhn7" Feb 20 10:13:23 crc kubenswrapper[4962]: I0220 10:13:23.970841 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-9gcrq"] Feb 20 10:13:24 crc kubenswrapper[4962]: I0220 10:13:24.052306 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1-combined-ca-bundle\") pod \"glance-db-sync-9gcrq\" (UID: \"3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1\") " pod="openstack/glance-db-sync-9gcrq" Feb 20 10:13:24 crc kubenswrapper[4962]: I0220 10:13:24.052458 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1-db-sync-config-data\") pod \"glance-db-sync-9gcrq\" (UID: \"3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1\") " pod="openstack/glance-db-sync-9gcrq" Feb 20 10:13:24 crc kubenswrapper[4962]: I0220 10:13:24.052790 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqljl\" (UniqueName: \"kubernetes.io/projected/3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1-kube-api-access-dqljl\") pod \"glance-db-sync-9gcrq\" (UID: \"3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1\") " pod="openstack/glance-db-sync-9gcrq" Feb 20 10:13:24 crc kubenswrapper[4962]: I0220 10:13:24.052944 4962 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1-config-data\") pod \"glance-db-sync-9gcrq\" (UID: \"3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1\") " pod="openstack/glance-db-sync-9gcrq" Feb 20 10:13:24 crc kubenswrapper[4962]: I0220 10:13:24.154645 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1-combined-ca-bundle\") pod \"glance-db-sync-9gcrq\" (UID: \"3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1\") " pod="openstack/glance-db-sync-9gcrq" Feb 20 10:13:24 crc kubenswrapper[4962]: I0220 10:13:24.154715 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1-db-sync-config-data\") pod \"glance-db-sync-9gcrq\" (UID: \"3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1\") " pod="openstack/glance-db-sync-9gcrq" Feb 20 10:13:24 crc kubenswrapper[4962]: I0220 10:13:24.154802 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqljl\" (UniqueName: \"kubernetes.io/projected/3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1-kube-api-access-dqljl\") pod \"glance-db-sync-9gcrq\" (UID: \"3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1\") " pod="openstack/glance-db-sync-9gcrq" Feb 20 10:13:24 crc kubenswrapper[4962]: I0220 10:13:24.154863 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1-config-data\") pod \"glance-db-sync-9gcrq\" (UID: \"3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1\") " pod="openstack/glance-db-sync-9gcrq" Feb 20 10:13:24 crc kubenswrapper[4962]: I0220 10:13:24.162456 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1-combined-ca-bundle\") pod \"glance-db-sync-9gcrq\" (UID: \"3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1\") " pod="openstack/glance-db-sync-9gcrq" Feb 20 10:13:24 crc kubenswrapper[4962]: I0220 10:13:24.162829 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1-config-data\") pod \"glance-db-sync-9gcrq\" (UID: \"3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1\") " pod="openstack/glance-db-sync-9gcrq" Feb 20 10:13:24 crc kubenswrapper[4962]: I0220 10:13:24.169135 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1-db-sync-config-data\") pod \"glance-db-sync-9gcrq\" (UID: \"3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1\") " pod="openstack/glance-db-sync-9gcrq" Feb 20 10:13:24 crc kubenswrapper[4962]: I0220 10:13:24.179245 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqljl\" (UniqueName: \"kubernetes.io/projected/3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1-kube-api-access-dqljl\") pod \"glance-db-sync-9gcrq\" (UID: \"3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1\") " pod="openstack/glance-db-sync-9gcrq" Feb 20 10:13:24 crc kubenswrapper[4962]: I0220 10:13:24.275033 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-9gcrq" Feb 20 10:13:24 crc kubenswrapper[4962]: I0220 10:13:24.549354 4962 generic.go:334] "Generic (PLEG): container finished" podID="2e7338a7-4012-439d-b961-6ca0c55dd6e6" containerID="1255947ebb2d1ff7325c767c453081290e37fc7eec685e64c813cb21e269d2c8" exitCode=0 Feb 20 10:13:24 crc kubenswrapper[4962]: I0220 10:13:24.549529 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-9mznb" event={"ID":"2e7338a7-4012-439d-b961-6ca0c55dd6e6","Type":"ContainerDied","Data":"1255947ebb2d1ff7325c767c453081290e37fc7eec685e64c813cb21e269d2c8"} Feb 20 10:13:24 crc kubenswrapper[4962]: I0220 10:13:24.679839 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-9gcrq"] Feb 20 10:13:25 crc kubenswrapper[4962]: I0220 10:13:25.399028 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-ptczd"] Feb 20 10:13:25 crc kubenswrapper[4962]: I0220 10:13:25.400397 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-ptczd" Feb 20 10:13:25 crc kubenswrapper[4962]: I0220 10:13:25.404672 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Feb 20 10:13:25 crc kubenswrapper[4962]: I0220 10:13:25.418146 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-ptczd"] Feb 20 10:13:25 crc kubenswrapper[4962]: I0220 10:13:25.498814 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vsmq\" (UniqueName: \"kubernetes.io/projected/598e051e-58af-4a1a-aa46-7f88d635f34c-kube-api-access-6vsmq\") pod \"root-account-create-update-ptczd\" (UID: \"598e051e-58af-4a1a-aa46-7f88d635f34c\") " pod="openstack/root-account-create-update-ptczd" Feb 20 10:13:25 crc kubenswrapper[4962]: I0220 10:13:25.499217 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/598e051e-58af-4a1a-aa46-7f88d635f34c-operator-scripts\") pod \"root-account-create-update-ptczd\" (UID: \"598e051e-58af-4a1a-aa46-7f88d635f34c\") " pod="openstack/root-account-create-update-ptczd" Feb 20 10:13:25 crc kubenswrapper[4962]: I0220 10:13:25.563198 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-9gcrq" event={"ID":"3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1","Type":"ContainerStarted","Data":"a8e14a05cffa52a7e3ad38a2be0cb8a03501b42d139fbf401ec8ecee6a2bd2a6"} Feb 20 10:13:25 crc kubenswrapper[4962]: I0220 10:13:25.601472 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/598e051e-58af-4a1a-aa46-7f88d635f34c-operator-scripts\") pod \"root-account-create-update-ptczd\" (UID: \"598e051e-58af-4a1a-aa46-7f88d635f34c\") " pod="openstack/root-account-create-update-ptczd" Feb 20 10:13:25 crc kubenswrapper[4962]: I0220 10:13:25.601681 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vsmq\" (UniqueName: \"kubernetes.io/projected/598e051e-58af-4a1a-aa46-7f88d635f34c-kube-api-access-6vsmq\") pod \"root-account-create-update-ptczd\" (UID: \"598e051e-58af-4a1a-aa46-7f88d635f34c\") " pod="openstack/root-account-create-update-ptczd" Feb 20 10:13:25 crc kubenswrapper[4962]: I0220 10:13:25.602432 4962 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/598e051e-58af-4a1a-aa46-7f88d635f34c-operator-scripts\") pod \"root-account-create-update-ptczd\" (UID: \"598e051e-58af-4a1a-aa46-7f88d635f34c\") " pod="openstack/root-account-create-update-ptczd" Feb 20 10:13:25 crc kubenswrapper[4962]: I0220 10:13:25.628102 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vsmq\" (UniqueName: \"kubernetes.io/projected/598e051e-58af-4a1a-aa46-7f88d635f34c-kube-api-access-6vsmq\") pod \"root-account-create-update-ptczd\" (UID: \"598e051e-58af-4a1a-aa46-7f88d635f34c\") " pod="openstack/root-account-create-update-ptczd" Feb 20 10:13:25 crc kubenswrapper[4962]: I0220 10:13:25.771959 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-ptczd" Feb 20 10:13:25 crc kubenswrapper[4962]: I0220 10:13:25.904885 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-9mznb" Feb 20 10:13:26 crc kubenswrapper[4962]: I0220 10:13:26.007272 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dnmc5\" (UniqueName: \"kubernetes.io/projected/2e7338a7-4012-439d-b961-6ca0c55dd6e6-kube-api-access-dnmc5\") pod \"2e7338a7-4012-439d-b961-6ca0c55dd6e6\" (UID: \"2e7338a7-4012-439d-b961-6ca0c55dd6e6\") " Feb 20 10:13:26 crc kubenswrapper[4962]: I0220 10:13:26.007336 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2e7338a7-4012-439d-b961-6ca0c55dd6e6-ring-data-devices\") pod \"2e7338a7-4012-439d-b961-6ca0c55dd6e6\" (UID: \"2e7338a7-4012-439d-b961-6ca0c55dd6e6\") " Feb 20 10:13:26 crc kubenswrapper[4962]: I0220 10:13:26.007402 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2e7338a7-4012-439d-b961-6ca0c55dd6e6-dispersionconf\") pod \"2e7338a7-4012-439d-b961-6ca0c55dd6e6\" (UID: \"2e7338a7-4012-439d-b961-6ca0c55dd6e6\") " Feb 20 10:13:26 crc kubenswrapper[4962]: I0220 10:13:26.007429 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2e7338a7-4012-439d-b961-6ca0c55dd6e6-scripts\") pod \"2e7338a7-4012-439d-b961-6ca0c55dd6e6\" (UID: \"2e7338a7-4012-439d-b961-6ca0c55dd6e6\") " Feb 20 10:13:26 crc kubenswrapper[4962]: I0220 10:13:26.007498 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2e7338a7-4012-439d-b961-6ca0c55dd6e6-swiftconf\") pod \"2e7338a7-4012-439d-b961-6ca0c55dd6e6\" (UID: \"2e7338a7-4012-439d-b961-6ca0c55dd6e6\") " Feb 20 10:13:26 crc kubenswrapper[4962]: I0220 10:13:26.007542 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2e7338a7-4012-439d-b961-6ca0c55dd6e6-etc-swift\") pod \"2e7338a7-4012-439d-b961-6ca0c55dd6e6\" (UID: \"2e7338a7-4012-439d-b961-6ca0c55dd6e6\") " Feb 20 10:13:26 crc kubenswrapper[4962]: I0220 10:13:26.007630 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e7338a7-4012-439d-b961-6ca0c55dd6e6-combined-ca-bundle\") pod \"2e7338a7-4012-439d-b961-6ca0c55dd6e6\" (UID: \"2e7338a7-4012-439d-b961-6ca0c55dd6e6\") " Feb 20 10:13:26 
crc kubenswrapper[4962]: I0220 10:13:26.009664 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e7338a7-4012-439d-b961-6ca0c55dd6e6-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "2e7338a7-4012-439d-b961-6ca0c55dd6e6" (UID: "2e7338a7-4012-439d-b961-6ca0c55dd6e6"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:13:26 crc kubenswrapper[4962]: I0220 10:13:26.009674 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e7338a7-4012-439d-b961-6ca0c55dd6e6-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "2e7338a7-4012-439d-b961-6ca0c55dd6e6" (UID: "2e7338a7-4012-439d-b961-6ca0c55dd6e6"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:13:26 crc kubenswrapper[4962]: I0220 10:13:26.027482 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e7338a7-4012-439d-b961-6ca0c55dd6e6-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "2e7338a7-4012-439d-b961-6ca0c55dd6e6" (UID: "2e7338a7-4012-439d-b961-6ca0c55dd6e6"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:13:26 crc kubenswrapper[4962]: I0220 10:13:26.036475 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e7338a7-4012-439d-b961-6ca0c55dd6e6-kube-api-access-dnmc5" (OuterVolumeSpecName: "kube-api-access-dnmc5") pod "2e7338a7-4012-439d-b961-6ca0c55dd6e6" (UID: "2e7338a7-4012-439d-b961-6ca0c55dd6e6"). InnerVolumeSpecName "kube-api-access-dnmc5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:13:26 crc kubenswrapper[4962]: I0220 10:13:26.036568 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e7338a7-4012-439d-b961-6ca0c55dd6e6-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "2e7338a7-4012-439d-b961-6ca0c55dd6e6" (UID: "2e7338a7-4012-439d-b961-6ca0c55dd6e6"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:13:26 crc kubenswrapper[4962]: I0220 10:13:26.048786 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e7338a7-4012-439d-b961-6ca0c55dd6e6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2e7338a7-4012-439d-b961-6ca0c55dd6e6" (UID: "2e7338a7-4012-439d-b961-6ca0c55dd6e6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:13:26 crc kubenswrapper[4962]: I0220 10:13:26.048920 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e7338a7-4012-439d-b961-6ca0c55dd6e6-scripts" (OuterVolumeSpecName: "scripts") pod "2e7338a7-4012-439d-b961-6ca0c55dd6e6" (UID: "2e7338a7-4012-439d-b961-6ca0c55dd6e6"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:13:26 crc kubenswrapper[4962]: I0220 10:13:26.110115 4962 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2e7338a7-4012-439d-b961-6ca0c55dd6e6-swiftconf\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:26 crc kubenswrapper[4962]: I0220 10:13:26.110155 4962 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2e7338a7-4012-439d-b961-6ca0c55dd6e6-etc-swift\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:26 crc kubenswrapper[4962]: I0220 10:13:26.110167 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e7338a7-4012-439d-b961-6ca0c55dd6e6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:26 crc kubenswrapper[4962]: I0220 10:13:26.110180 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dnmc5\" (UniqueName: \"kubernetes.io/projected/2e7338a7-4012-439d-b961-6ca0c55dd6e6-kube-api-access-dnmc5\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:26 crc kubenswrapper[4962]: I0220 10:13:26.110192 4962 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2e7338a7-4012-439d-b961-6ca0c55dd6e6-ring-data-devices\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:26 crc kubenswrapper[4962]: I0220 10:13:26.110200 4962 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2e7338a7-4012-439d-b961-6ca0c55dd6e6-dispersionconf\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:26 crc kubenswrapper[4962]: I0220 10:13:26.110210 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2e7338a7-4012-439d-b961-6ca0c55dd6e6-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:26 crc kubenswrapper[4962]: I0220 10:13:26.245671 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-ptczd"] Feb 20 10:13:26 crc kubenswrapper[4962]: W0220 10:13:26.254431 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod598e051e_58af_4a1a_aa46_7f88d635f34c.slice/crio-3aefc1aebec6e659b8988e707d8ac47e1f32a7d6e4f29a80e6b975b171136610 WatchSource:0}: Error finding container 3aefc1aebec6e659b8988e707d8ac47e1f32a7d6e4f29a80e6b975b171136610: Status 404 returned error can't find the container with id 3aefc1aebec6e659b8988e707d8ac47e1f32a7d6e4f29a80e6b975b171136610 Feb 20 10:13:26 crc kubenswrapper[4962]: I0220 10:13:26.593442 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-9mznb" event={"ID":"2e7338a7-4012-439d-b961-6ca0c55dd6e6","Type":"ContainerDied","Data":"8c010bf1bf294a17a678e18297fc5c3ac174eb1389754dd15038dc4e26f7804b"} Feb 20 10:13:26 crc kubenswrapper[4962]: I0220 10:13:26.593497 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8c010bf1bf294a17a678e18297fc5c3ac174eb1389754dd15038dc4e26f7804b" Feb 20 10:13:26 crc kubenswrapper[4962]: I0220 10:13:26.593511 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-9mznb" Feb 20 10:13:26 crc kubenswrapper[4962]: I0220 10:13:26.598391 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-ptczd" event={"ID":"598e051e-58af-4a1a-aa46-7f88d635f34c","Type":"ContainerStarted","Data":"1374063c1227f074525aeab9310be0405d817c53450d0331b45011e3f7fb82f7"} Feb 20 10:13:26 crc kubenswrapper[4962]: I0220 10:13:26.598860 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-ptczd" event={"ID":"598e051e-58af-4a1a-aa46-7f88d635f34c","Type":"ContainerStarted","Data":"3aefc1aebec6e659b8988e707d8ac47e1f32a7d6e4f29a80e6b975b171136610"} Feb 20 10:13:27 crc kubenswrapper[4962]: I0220 10:13:27.609729 4962 generic.go:334] "Generic (PLEG): container finished" podID="598e051e-58af-4a1a-aa46-7f88d635f34c" containerID="1374063c1227f074525aeab9310be0405d817c53450d0331b45011e3f7fb82f7" exitCode=0 Feb 20 10:13:27 crc kubenswrapper[4962]: I0220 10:13:27.609788 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-ptczd" event={"ID":"598e051e-58af-4a1a-aa46-7f88d635f34c","Type":"ContainerDied","Data":"1374063c1227f074525aeab9310be0405d817c53450d0331b45011e3f7fb82f7"} Feb 20 10:13:27 crc kubenswrapper[4962]: I0220 10:13:27.745004 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f4fb3b99-0e02-4c5c-9704-884ea3f0605d-etc-swift\") pod \"swift-storage-0\" (UID: \"f4fb3b99-0e02-4c5c-9704-884ea3f0605d\") " pod="openstack/swift-storage-0" Feb 20 10:13:27 crc kubenswrapper[4962]: I0220 10:13:27.752231 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f4fb3b99-0e02-4c5c-9704-884ea3f0605d-etc-swift\") pod \"swift-storage-0\" (UID: \"f4fb3b99-0e02-4c5c-9704-884ea3f0605d\") " pod="openstack/swift-storage-0" Feb 20 10:13:28 crc kubenswrapper[4962]: I0220 10:13:28.004037 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Feb 20 10:13:28 crc kubenswrapper[4962]: I0220 10:13:28.579727 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Feb 20 10:13:28 crc kubenswrapper[4962]: W0220 10:13:28.586982 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf4fb3b99_0e02_4c5c_9704_884ea3f0605d.slice/crio-7ba9f2cadbb43f65e2484ea2a7184348cefa8eeb550b59455e6840526a2111e5 WatchSource:0}: Error finding container 7ba9f2cadbb43f65e2484ea2a7184348cefa8eeb550b59455e6840526a2111e5: Status 404 returned error can't find the container with id 7ba9f2cadbb43f65e2484ea2a7184348cefa8eeb550b59455e6840526a2111e5 Feb 20 10:13:28 crc kubenswrapper[4962]: I0220 10:13:28.619215 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f4fb3b99-0e02-4c5c-9704-884ea3f0605d","Type":"ContainerStarted","Data":"7ba9f2cadbb43f65e2484ea2a7184348cefa8eeb550b59455e6840526a2111e5"} Feb 20 10:13:29 crc kubenswrapper[4962]: I0220 10:13:29.026898 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-ptczd" Feb 20 10:13:29 crc kubenswrapper[4962]: I0220 10:13:29.195112 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/598e051e-58af-4a1a-aa46-7f88d635f34c-operator-scripts\") pod \"598e051e-58af-4a1a-aa46-7f88d635f34c\" (UID: \"598e051e-58af-4a1a-aa46-7f88d635f34c\") " Feb 20 10:13:29 crc kubenswrapper[4962]: I0220 10:13:29.195255 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vsmq\" (UniqueName: \"kubernetes.io/projected/598e051e-58af-4a1a-aa46-7f88d635f34c-kube-api-access-6vsmq\") pod \"598e051e-58af-4a1a-aa46-7f88d635f34c\" (UID: \"598e051e-58af-4a1a-aa46-7f88d635f34c\") " Feb 20 10:13:29 crc kubenswrapper[4962]: I0220 10:13:29.197526 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/598e051e-58af-4a1a-aa46-7f88d635f34c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "598e051e-58af-4a1a-aa46-7f88d635f34c" (UID: "598e051e-58af-4a1a-aa46-7f88d635f34c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:13:29 crc kubenswrapper[4962]: I0220 10:13:29.213787 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/598e051e-58af-4a1a-aa46-7f88d635f34c-kube-api-access-6vsmq" (OuterVolumeSpecName: "kube-api-access-6vsmq") pod "598e051e-58af-4a1a-aa46-7f88d635f34c" (UID: "598e051e-58af-4a1a-aa46-7f88d635f34c"). InnerVolumeSpecName "kube-api-access-6vsmq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:13:29 crc kubenswrapper[4962]: I0220 10:13:29.233289 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-wj9f6" podUID="383d4f1e-72b3-48ce-9427-0361c19e41fc" containerName="ovn-controller" probeResult="failure" output=< Feb 20 10:13:29 crc kubenswrapper[4962]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Feb 20 10:13:29 crc kubenswrapper[4962]: > Feb 20 10:13:29 crc kubenswrapper[4962]: I0220 10:13:29.297674 4962 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/598e051e-58af-4a1a-aa46-7f88d635f34c-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:29 crc kubenswrapper[4962]: I0220 10:13:29.297709 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vsmq\" (UniqueName: \"kubernetes.io/projected/598e051e-58af-4a1a-aa46-7f88d635f34c-kube-api-access-6vsmq\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:29 crc kubenswrapper[4962]: I0220 10:13:29.479850 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-r7g9h" Feb 20 10:13:29 crc kubenswrapper[4962]: I0220 10:13:29.482240 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-r7g9h" Feb 20 10:13:29 crc kubenswrapper[4962]: I0220 10:13:29.627674 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-ptczd" Feb 20 10:13:29 crc kubenswrapper[4962]: I0220 10:13:29.627678 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-ptczd" event={"ID":"598e051e-58af-4a1a-aa46-7f88d635f34c","Type":"ContainerDied","Data":"3aefc1aebec6e659b8988e707d8ac47e1f32a7d6e4f29a80e6b975b171136610"} Feb 20 10:13:29 crc kubenswrapper[4962]: I0220 10:13:29.627737 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3aefc1aebec6e659b8988e707d8ac47e1f32a7d6e4f29a80e6b975b171136610" Feb 20 10:13:29 crc kubenswrapper[4962]: I0220 10:13:29.722644 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-wj9f6-config-grwch"] Feb 20 10:13:29 crc kubenswrapper[4962]: E0220 10:13:29.723025 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e7338a7-4012-439d-b961-6ca0c55dd6e6" containerName="swift-ring-rebalance" Feb 20 10:13:29 crc kubenswrapper[4962]: I0220 10:13:29.723047 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e7338a7-4012-439d-b961-6ca0c55dd6e6" containerName="swift-ring-rebalance" Feb 20 10:13:29 crc kubenswrapper[4962]: E0220 10:13:29.723076 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="598e051e-58af-4a1a-aa46-7f88d635f34c" containerName="mariadb-account-create-update" Feb 20 10:13:29 crc kubenswrapper[4962]: I0220 10:13:29.723087 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="598e051e-58af-4a1a-aa46-7f88d635f34c" containerName="mariadb-account-create-update" Feb 20 10:13:29 crc kubenswrapper[4962]: I0220 10:13:29.723308 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="598e051e-58af-4a1a-aa46-7f88d635f34c" containerName="mariadb-account-create-update" Feb 20 10:13:29 crc kubenswrapper[4962]: I0220 10:13:29.723332 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e7338a7-4012-439d-b961-6ca0c55dd6e6" containerName="swift-ring-rebalance" Feb 20 10:13:29 crc kubenswrapper[4962]: I0220 10:13:29.724726 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-wj9f6-config-grwch" Feb 20 10:13:29 crc kubenswrapper[4962]: I0220 10:13:29.727751 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Feb 20 10:13:29 crc kubenswrapper[4962]: I0220 10:13:29.733666 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-wj9f6-config-grwch"] Feb 20 10:13:29 crc kubenswrapper[4962]: I0220 10:13:29.814070 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b4733c39-1a37-4a56-a731-88fcac6da1c0-scripts\") pod \"ovn-controller-wj9f6-config-grwch\" (UID: \"b4733c39-1a37-4a56-a731-88fcac6da1c0\") " pod="openstack/ovn-controller-wj9f6-config-grwch" Feb 20 10:13:29 crc kubenswrapper[4962]: I0220 10:13:29.814142 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b4733c39-1a37-4a56-a731-88fcac6da1c0-var-run-ovn\") pod \"ovn-controller-wj9f6-config-grwch\" (UID: \"b4733c39-1a37-4a56-a731-88fcac6da1c0\") " pod="openstack/ovn-controller-wj9f6-config-grwch" Feb 20 10:13:29 crc kubenswrapper[4962]: I0220 10:13:29.814371 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b4733c39-1a37-4a56-a731-88fcac6da1c0-additional-scripts\") pod \"ovn-controller-wj9f6-config-grwch\" (UID: \"b4733c39-1a37-4a56-a731-88fcac6da1c0\") " pod="openstack/ovn-controller-wj9f6-config-grwch" Feb 20 10:13:29 crc kubenswrapper[4962]: I0220 10:13:29.814504 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzdf9\" (UniqueName: \"kubernetes.io/projected/b4733c39-1a37-4a56-a731-88fcac6da1c0-kube-api-access-hzdf9\") pod \"ovn-controller-wj9f6-config-grwch\" (UID: \"b4733c39-1a37-4a56-a731-88fcac6da1c0\") " pod="openstack/ovn-controller-wj9f6-config-grwch" Feb 20 10:13:29 crc kubenswrapper[4962]: I0220 10:13:29.814735 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b4733c39-1a37-4a56-a731-88fcac6da1c0-var-run\") pod \"ovn-controller-wj9f6-config-grwch\" (UID: \"b4733c39-1a37-4a56-a731-88fcac6da1c0\") " pod="openstack/ovn-controller-wj9f6-config-grwch" Feb 20 10:13:29 crc kubenswrapper[4962]: I0220 10:13:29.814811 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b4733c39-1a37-4a56-a731-88fcac6da1c0-var-log-ovn\") pod \"ovn-controller-wj9f6-config-grwch\" (UID: \"b4733c39-1a37-4a56-a731-88fcac6da1c0\") " pod="openstack/ovn-controller-wj9f6-config-grwch" Feb 20 10:13:29 crc kubenswrapper[4962]: I0220 10:13:29.917027 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b4733c39-1a37-4a56-a731-88fcac6da1c0-var-run-ovn\") pod \"ovn-controller-wj9f6-config-grwch\" (UID: \"b4733c39-1a37-4a56-a731-88fcac6da1c0\") " pod="openstack/ovn-controller-wj9f6-config-grwch" Feb 20 10:13:29 crc kubenswrapper[4962]: I0220 10:13:29.917091 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: 
\"kubernetes.io/configmap/b4733c39-1a37-4a56-a731-88fcac6da1c0-additional-scripts\") pod \"ovn-controller-wj9f6-config-grwch\" (UID: \"b4733c39-1a37-4a56-a731-88fcac6da1c0\") " pod="openstack/ovn-controller-wj9f6-config-grwch" Feb 20 10:13:29 crc kubenswrapper[4962]: I0220 10:13:29.917127 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzdf9\" (UniqueName: \"kubernetes.io/projected/b4733c39-1a37-4a56-a731-88fcac6da1c0-kube-api-access-hzdf9\") pod \"ovn-controller-wj9f6-config-grwch\" (UID: \"b4733c39-1a37-4a56-a731-88fcac6da1c0\") " pod="openstack/ovn-controller-wj9f6-config-grwch" Feb 20 10:13:29 crc kubenswrapper[4962]: I0220 10:13:29.917179 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b4733c39-1a37-4a56-a731-88fcac6da1c0-var-run\") pod \"ovn-controller-wj9f6-config-grwch\" (UID: \"b4733c39-1a37-4a56-a731-88fcac6da1c0\") " pod="openstack/ovn-controller-wj9f6-config-grwch" Feb 20 10:13:29 crc kubenswrapper[4962]: I0220 10:13:29.917206 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b4733c39-1a37-4a56-a731-88fcac6da1c0-var-log-ovn\") pod \"ovn-controller-wj9f6-config-grwch\" (UID: \"b4733c39-1a37-4a56-a731-88fcac6da1c0\") " pod="openstack/ovn-controller-wj9f6-config-grwch" Feb 20 10:13:29 crc kubenswrapper[4962]: I0220 10:13:29.917274 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b4733c39-1a37-4a56-a731-88fcac6da1c0-scripts\") pod \"ovn-controller-wj9f6-config-grwch\" (UID: \"b4733c39-1a37-4a56-a731-88fcac6da1c0\") " pod="openstack/ovn-controller-wj9f6-config-grwch" Feb 20 10:13:29 crc kubenswrapper[4962]: I0220 10:13:29.917473 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b4733c39-1a37-4a56-a731-88fcac6da1c0-var-run-ovn\") pod \"ovn-controller-wj9f6-config-grwch\" (UID: \"b4733c39-1a37-4a56-a731-88fcac6da1c0\") " pod="openstack/ovn-controller-wj9f6-config-grwch" Feb 20 10:13:29 crc kubenswrapper[4962]: I0220 10:13:29.917487 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b4733c39-1a37-4a56-a731-88fcac6da1c0-var-log-ovn\") pod \"ovn-controller-wj9f6-config-grwch\" (UID: \"b4733c39-1a37-4a56-a731-88fcac6da1c0\") " pod="openstack/ovn-controller-wj9f6-config-grwch" Feb 20 10:13:29 crc kubenswrapper[4962]: I0220 10:13:29.917535 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b4733c39-1a37-4a56-a731-88fcac6da1c0-var-run\") pod \"ovn-controller-wj9f6-config-grwch\" (UID: \"b4733c39-1a37-4a56-a731-88fcac6da1c0\") " pod="openstack/ovn-controller-wj9f6-config-grwch" Feb 20 10:13:29 crc kubenswrapper[4962]: I0220 10:13:29.918389 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b4733c39-1a37-4a56-a731-88fcac6da1c0-additional-scripts\") pod \"ovn-controller-wj9f6-config-grwch\" (UID: \"b4733c39-1a37-4a56-a731-88fcac6da1c0\") " pod="openstack/ovn-controller-wj9f6-config-grwch" Feb 20 10:13:29 crc kubenswrapper[4962]: I0220 10:13:29.919667 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/b4733c39-1a37-4a56-a731-88fcac6da1c0-scripts\") pod \"ovn-controller-wj9f6-config-grwch\" (UID: \"b4733c39-1a37-4a56-a731-88fcac6da1c0\") " pod="openstack/ovn-controller-wj9f6-config-grwch" Feb 20 10:13:29 crc kubenswrapper[4962]: I0220 10:13:29.935748 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzdf9\" (UniqueName: \"kubernetes.io/projected/b4733c39-1a37-4a56-a731-88fcac6da1c0-kube-api-access-hzdf9\") pod \"ovn-controller-wj9f6-config-grwch\" (UID: \"b4733c39-1a37-4a56-a731-88fcac6da1c0\") " pod="openstack/ovn-controller-wj9f6-config-grwch" Feb 20 10:13:30 crc kubenswrapper[4962]: I0220 10:13:30.053454 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-wj9f6-config-grwch" Feb 20 10:13:30 crc kubenswrapper[4962]: I0220 10:13:30.575964 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-wj9f6-config-grwch"] Feb 20 10:13:30 crc kubenswrapper[4962]: I0220 10:13:30.639659 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f4fb3b99-0e02-4c5c-9704-884ea3f0605d","Type":"ContainerStarted","Data":"066dce8eb5ee2a5ee4696fbdc5642875edc121ec4465ea32468ecf8aba5fbe36"} Feb 20 10:13:30 crc kubenswrapper[4962]: I0220 10:13:30.639721 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f4fb3b99-0e02-4c5c-9704-884ea3f0605d","Type":"ContainerStarted","Data":"138e05b5e05f4d5ae28d62c69c931e5b6907fd9792450f37e652add9de1e83a1"} Feb 20 10:13:30 crc kubenswrapper[4962]: I0220 10:13:30.640924 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-wj9f6-config-grwch" event={"ID":"b4733c39-1a37-4a56-a731-88fcac6da1c0","Type":"ContainerStarted","Data":"2369792fd5085de62a1c8ae711d4a2040dcad9e3ce1de30b07680e2c16a856e9"} Feb 20 10:13:31 crc kubenswrapper[4962]: I0220 10:13:31.652510 4962 generic.go:334] "Generic (PLEG): container finished" podID="b4733c39-1a37-4a56-a731-88fcac6da1c0" containerID="a4453e5e140badbab6aa97996c8ab339f8ab22881b41594395bdb84a3005b466" exitCode=0 Feb 20 10:13:31 crc kubenswrapper[4962]: I0220 10:13:31.652641 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-wj9f6-config-grwch" event={"ID":"b4733c39-1a37-4a56-a731-88fcac6da1c0","Type":"ContainerDied","Data":"a4453e5e140badbab6aa97996c8ab339f8ab22881b41594395bdb84a3005b466"} Feb 20 10:13:31 crc kubenswrapper[4962]: I0220 10:13:31.661674 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f4fb3b99-0e02-4c5c-9704-884ea3f0605d","Type":"ContainerStarted","Data":"1b0e56a8482d960b0917a1f3004c6a015099a8313a0f5c4fbb4d166f9d4ea11c"} Feb 20 10:13:31 crc kubenswrapper[4962]: I0220 10:13:31.661726 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f4fb3b99-0e02-4c5c-9704-884ea3f0605d","Type":"ContainerStarted","Data":"5d9d68ccd50ca26ce3191d56dc735011eb169a68e6eedc3144c97564be0ff601"} Feb 20 10:13:33 crc kubenswrapper[4962]: I0220 10:13:33.686047 4962 generic.go:334] "Generic (PLEG): container finished" podID="2a8d652d-aea8-4a83-b33e-0d2522af0be8" containerID="1dd7b2604194fcf6002518bb647f90f19a0a23390f083313c0f1248bafe3c51e" exitCode=0 Feb 20 10:13:33 crc kubenswrapper[4962]: I0220 10:13:33.686135 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"2a8d652d-aea8-4a83-b33e-0d2522af0be8","Type":"ContainerDied","Data":"1dd7b2604194fcf6002518bb647f90f19a0a23390f083313c0f1248bafe3c51e"} Feb 20 10:13:34 crc kubenswrapper[4962]: I0220 10:13:34.211815 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-wj9f6" Feb 20 10:13:34 crc kubenswrapper[4962]: I0220 10:13:34.705511 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"56a77dd3-ef10-46a6-a00d-ab38af0d4338","Type":"ContainerDied","Data":"565584a6c8c851ef4d74b724c7d45c8dd9c73a6da0c33f9bfe51852abd444857"} Feb 20 10:13:34 crc kubenswrapper[4962]: I0220 10:13:34.705252 4962 generic.go:334] "Generic (PLEG): container finished" podID="56a77dd3-ef10-46a6-a00d-ab38af0d4338" containerID="565584a6c8c851ef4d74b724c7d45c8dd9c73a6da0c33f9bfe51852abd444857" exitCode=0 Feb 20 10:13:39 crc kubenswrapper[4962]: I0220 10:13:39.260540 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-wj9f6-config-grwch" Feb 20 10:13:39 crc kubenswrapper[4962]: I0220 10:13:39.329928 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b4733c39-1a37-4a56-a731-88fcac6da1c0-var-run\") pod \"b4733c39-1a37-4a56-a731-88fcac6da1c0\" (UID: \"b4733c39-1a37-4a56-a731-88fcac6da1c0\") " Feb 20 10:13:39 crc kubenswrapper[4962]: I0220 10:13:39.329991 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b4733c39-1a37-4a56-a731-88fcac6da1c0-var-log-ovn\") pod \"b4733c39-1a37-4a56-a731-88fcac6da1c0\" (UID: \"b4733c39-1a37-4a56-a731-88fcac6da1c0\") " Feb 20 10:13:39 crc kubenswrapper[4962]: I0220 10:13:39.330031 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b4733c39-1a37-4a56-a731-88fcac6da1c0-scripts\") pod \"b4733c39-1a37-4a56-a731-88fcac6da1c0\" (UID: \"b4733c39-1a37-4a56-a731-88fcac6da1c0\") " Feb 20 10:13:39 crc kubenswrapper[4962]: I0220 10:13:39.330058 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b4733c39-1a37-4a56-a731-88fcac6da1c0-additional-scripts\") pod \"b4733c39-1a37-4a56-a731-88fcac6da1c0\" (UID: \"b4733c39-1a37-4a56-a731-88fcac6da1c0\") " Feb 20 10:13:39 crc kubenswrapper[4962]: I0220 10:13:39.330200 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b4733c39-1a37-4a56-a731-88fcac6da1c0-var-run-ovn\") pod \"b4733c39-1a37-4a56-a731-88fcac6da1c0\" (UID: \"b4733c39-1a37-4a56-a731-88fcac6da1c0\") " Feb 20 10:13:39 crc kubenswrapper[4962]: I0220 10:13:39.330358 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hzdf9\" (UniqueName: \"kubernetes.io/projected/b4733c39-1a37-4a56-a731-88fcac6da1c0-kube-api-access-hzdf9\") pod \"b4733c39-1a37-4a56-a731-88fcac6da1c0\" (UID: \"b4733c39-1a37-4a56-a731-88fcac6da1c0\") " Feb 20 10:13:39 crc kubenswrapper[4962]: I0220 10:13:39.331425 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b4733c39-1a37-4a56-a731-88fcac6da1c0-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "b4733c39-1a37-4a56-a731-88fcac6da1c0" (UID: "b4733c39-1a37-4a56-a731-88fcac6da1c0"). 
InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 10:13:39 crc kubenswrapper[4962]: I0220 10:13:39.331533 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b4733c39-1a37-4a56-a731-88fcac6da1c0-var-run" (OuterVolumeSpecName: "var-run") pod "b4733c39-1a37-4a56-a731-88fcac6da1c0" (UID: "b4733c39-1a37-4a56-a731-88fcac6da1c0"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 10:13:39 crc kubenswrapper[4962]: I0220 10:13:39.331653 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b4733c39-1a37-4a56-a731-88fcac6da1c0-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "b4733c39-1a37-4a56-a731-88fcac6da1c0" (UID: "b4733c39-1a37-4a56-a731-88fcac6da1c0"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 10:13:39 crc kubenswrapper[4962]: I0220 10:13:39.332192 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4733c39-1a37-4a56-a731-88fcac6da1c0-scripts" (OuterVolumeSpecName: "scripts") pod "b4733c39-1a37-4a56-a731-88fcac6da1c0" (UID: "b4733c39-1a37-4a56-a731-88fcac6da1c0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:13:39 crc kubenswrapper[4962]: I0220 10:13:39.332388 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4733c39-1a37-4a56-a731-88fcac6da1c0-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "b4733c39-1a37-4a56-a731-88fcac6da1c0" (UID: "b4733c39-1a37-4a56-a731-88fcac6da1c0"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:13:39 crc kubenswrapper[4962]: I0220 10:13:39.336695 4962 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b4733c39-1a37-4a56-a731-88fcac6da1c0-var-run\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:39 crc kubenswrapper[4962]: I0220 10:13:39.337858 4962 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b4733c39-1a37-4a56-a731-88fcac6da1c0-var-log-ovn\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:39 crc kubenswrapper[4962]: I0220 10:13:39.337966 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b4733c39-1a37-4a56-a731-88fcac6da1c0-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:39 crc kubenswrapper[4962]: I0220 10:13:39.338031 4962 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b4733c39-1a37-4a56-a731-88fcac6da1c0-additional-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:39 crc kubenswrapper[4962]: I0220 10:13:39.338101 4962 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b4733c39-1a37-4a56-a731-88fcac6da1c0-var-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:39 crc kubenswrapper[4962]: I0220 10:13:39.348782 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4733c39-1a37-4a56-a731-88fcac6da1c0-kube-api-access-hzdf9" (OuterVolumeSpecName: "kube-api-access-hzdf9") pod "b4733c39-1a37-4a56-a731-88fcac6da1c0" (UID: "b4733c39-1a37-4a56-a731-88fcac6da1c0"). InnerVolumeSpecName "kube-api-access-hzdf9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:13:39 crc kubenswrapper[4962]: I0220 10:13:39.441208 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hzdf9\" (UniqueName: \"kubernetes.io/projected/b4733c39-1a37-4a56-a731-88fcac6da1c0-kube-api-access-hzdf9\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:39 crc kubenswrapper[4962]: I0220 10:13:39.793635 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"2a8d652d-aea8-4a83-b33e-0d2522af0be8","Type":"ContainerStarted","Data":"f6ebb23a577e121e067e03133802f0cd7183161a54f98c2902a217045cadf308"} Feb 20 10:13:39 crc kubenswrapper[4962]: I0220 10:13:39.794849 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Feb 20 10:13:39 crc kubenswrapper[4962]: I0220 10:13:39.810155 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-wj9f6-config-grwch" event={"ID":"b4733c39-1a37-4a56-a731-88fcac6da1c0","Type":"ContainerDied","Data":"2369792fd5085de62a1c8ae711d4a2040dcad9e3ce1de30b07680e2c16a856e9"} Feb 20 10:13:39 crc kubenswrapper[4962]: I0220 10:13:39.810202 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2369792fd5085de62a1c8ae711d4a2040dcad9e3ce1de30b07680e2c16a856e9" Feb 20 10:13:39 crc kubenswrapper[4962]: I0220 10:13:39.810290 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-wj9f6-config-grwch" Feb 20 10:13:39 crc kubenswrapper[4962]: I0220 10:13:39.824345 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"56a77dd3-ef10-46a6-a00d-ab38af0d4338","Type":"ContainerStarted","Data":"89f21e0f9ed8c4de881b1add4cca2f3108cbffd0cc9fe288bcc483e30d1f1718"} Feb 20 10:13:39 crc kubenswrapper[4962]: I0220 10:13:39.824673 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Feb 20 10:13:39 crc kubenswrapper[4962]: I0220 10:13:39.842506 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=42.344408399 podStartE2EDuration="1m16.842486749s" podCreationTimestamp="2026-02-20 10:12:23 +0000 UTC" firstStartedPulling="2026-02-20 10:12:25.092635469 +0000 UTC m=+1036.675107315" lastFinishedPulling="2026-02-20 10:12:59.590713809 +0000 UTC m=+1071.173185665" observedRunningTime="2026-02-20 10:13:39.838736302 +0000 UTC m=+1111.421208148" watchObservedRunningTime="2026-02-20 10:13:39.842486749 +0000 UTC m=+1111.424958595" Feb 20 10:13:39 crc kubenswrapper[4962]: I0220 10:13:39.905582 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=43.281711095 podStartE2EDuration="1m16.905556891s" podCreationTimestamp="2026-02-20 10:12:23 +0000 UTC" firstStartedPulling="2026-02-20 10:12:25.969312919 +0000 UTC m=+1037.551784765" lastFinishedPulling="2026-02-20 10:12:59.593158705 +0000 UTC m=+1071.175630561" observedRunningTime="2026-02-20 10:13:39.89974156 +0000 UTC m=+1111.482213406" watchObservedRunningTime="2026-02-20 10:13:39.905556891 +0000 UTC m=+1111.488028737" Feb 20 10:13:40 crc kubenswrapper[4962]: I0220 10:13:40.402134 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-wj9f6-config-grwch"] Feb 20 10:13:40 crc kubenswrapper[4962]: I0220 10:13:40.411560 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/ovn-controller-wj9f6-config-grwch"] Feb 20 10:13:40 crc kubenswrapper[4962]: I0220 10:13:40.840666 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f4fb3b99-0e02-4c5c-9704-884ea3f0605d","Type":"ContainerStarted","Data":"4bc06842128d6fdcb6b37354d4c5aad1c3642acbd05e513b28a95e6f19bab1ca"} Feb 20 10:13:40 crc kubenswrapper[4962]: I0220 10:13:40.841221 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f4fb3b99-0e02-4c5c-9704-884ea3f0605d","Type":"ContainerStarted","Data":"8395eb871539c46360c6d66fb96850aeed91819306e7873acf83b98b89a956d8"} Feb 20 10:13:40 crc kubenswrapper[4962]: I0220 10:13:40.841240 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f4fb3b99-0e02-4c5c-9704-884ea3f0605d","Type":"ContainerStarted","Data":"6727a65f145335bf540a7898aeabecb549d8d22b6c9a1c79a91620a5e8e3e3f8"} Feb 20 10:13:40 crc kubenswrapper[4962]: I0220 10:13:40.842861 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-9gcrq" event={"ID":"3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1","Type":"ContainerStarted","Data":"ae355b88f320e93105b216772d0d1821b9792d4ee89d86649fd430b7ae19d59e"} Feb 20 10:13:40 crc kubenswrapper[4962]: I0220 10:13:40.866423 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-9gcrq" podStartSLOduration=3.385041449 podStartE2EDuration="17.866395147s" podCreationTimestamp="2026-02-20 10:13:23 +0000 UTC" firstStartedPulling="2026-02-20 10:13:24.692980575 +0000 UTC m=+1096.275452411" lastFinishedPulling="2026-02-20 10:13:39.174334253 +0000 UTC m=+1110.756806109" observedRunningTime="2026-02-20 10:13:40.863978762 +0000 UTC m=+1112.446450608" watchObservedRunningTime="2026-02-20 10:13:40.866395147 +0000 UTC m=+1112.448866993" Feb 20 10:13:41 crc kubenswrapper[4962]: I0220 10:13:41.153829 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4733c39-1a37-4a56-a731-88fcac6da1c0" path="/var/lib/kubelet/pods/b4733c39-1a37-4a56-a731-88fcac6da1c0/volumes" Feb 20 10:13:41 crc kubenswrapper[4962]: I0220 10:13:41.507903 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 10:13:41 crc kubenswrapper[4962]: I0220 10:13:41.507991 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 10:13:41 crc kubenswrapper[4962]: I0220 10:13:41.508050 4962 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" Feb 20 10:13:41 crc kubenswrapper[4962]: I0220 10:13:41.509025 4962 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d1c3b246abfce789c57c63406e0ffd34b8624c7398251d713e463cbaf4c363e1"} pod="openshift-machine-config-operator/machine-config-daemon-m9d46" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 10:13:41 crc 
kubenswrapper[4962]: I0220 10:13:41.509089 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" containerID="cri-o://d1c3b246abfce789c57c63406e0ffd34b8624c7398251d713e463cbaf4c363e1" gracePeriod=600 Feb 20 10:13:41 crc kubenswrapper[4962]: I0220 10:13:41.855835 4962 generic.go:334] "Generic (PLEG): container finished" podID="751d5e0b-919c-4777-8475-ed7214f7647f" containerID="d1c3b246abfce789c57c63406e0ffd34b8624c7398251d713e463cbaf4c363e1" exitCode=0 Feb 20 10:13:41 crc kubenswrapper[4962]: I0220 10:13:41.856322 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" event={"ID":"751d5e0b-919c-4777-8475-ed7214f7647f","Type":"ContainerDied","Data":"d1c3b246abfce789c57c63406e0ffd34b8624c7398251d713e463cbaf4c363e1"} Feb 20 10:13:41 crc kubenswrapper[4962]: I0220 10:13:41.856379 4962 scope.go:117] "RemoveContainer" containerID="00c783abd2aaed9d0c1eb9c41c798ffe19fb999487c2907db1de61e5a49afcce" Feb 20 10:13:41 crc kubenswrapper[4962]: I0220 10:13:41.863470 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f4fb3b99-0e02-4c5c-9704-884ea3f0605d","Type":"ContainerStarted","Data":"87c786369d8da7650fca3be3c67f9a8decb0d8fd88429ab357e31f9e7c19f3e0"} Feb 20 10:13:42 crc kubenswrapper[4962]: I0220 10:13:42.874704 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" event={"ID":"751d5e0b-919c-4777-8475-ed7214f7647f","Type":"ContainerStarted","Data":"90048224d02357c3a2b79884d1830677ace1a55bff8576575bc2ae41bdccb716"} Feb 20 10:13:43 crc kubenswrapper[4962]: I0220 10:13:43.891807 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f4fb3b99-0e02-4c5c-9704-884ea3f0605d","Type":"ContainerStarted","Data":"05aae6f36e27022f7b4fa526f1265b47aeb3c166ab95c682c5b8f4ac82205eff"} Feb 20 10:13:43 crc kubenswrapper[4962]: I0220 10:13:43.892388 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f4fb3b99-0e02-4c5c-9704-884ea3f0605d","Type":"ContainerStarted","Data":"b6ead0e1bdda64a7399139dd6191cc696b570349bf204a2ab46ce0d182cc49a9"} Feb 20 10:13:44 crc kubenswrapper[4962]: I0220 10:13:44.907809 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f4fb3b99-0e02-4c5c-9704-884ea3f0605d","Type":"ContainerStarted","Data":"63c4d35ae203bd5ac342fa6d490352730d135f847a680bbe15aae0fe53059141"} Feb 20 10:13:44 crc kubenswrapper[4962]: I0220 10:13:44.908746 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f4fb3b99-0e02-4c5c-9704-884ea3f0605d","Type":"ContainerStarted","Data":"3c297c5e3426f0b38076ba12a36de8e42599c1ec9b371d1d4ac3dc87d286fdac"} Feb 20 10:13:44 crc kubenswrapper[4962]: I0220 10:13:44.908773 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f4fb3b99-0e02-4c5c-9704-884ea3f0605d","Type":"ContainerStarted","Data":"6460038d74df47b4bd5e8f877737b675fdcc51257f17732080e42ee0a1e7dfa6"} Feb 20 10:13:44 crc kubenswrapper[4962]: I0220 10:13:44.908795 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"f4fb3b99-0e02-4c5c-9704-884ea3f0605d","Type":"ContainerStarted","Data":"3108da3bf591571013cc25e1b8f1de0c827e10b04d9686bc5e1fb47bc9778731"} Feb 20 10:13:44 crc kubenswrapper[4962]: I0220 10:13:44.908812 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f4fb3b99-0e02-4c5c-9704-884ea3f0605d","Type":"ContainerStarted","Data":"3f89270dd151567356dcd4569c268792d8ce043f1e81df07ebe5f55f65531bca"} Feb 20 10:13:44 crc kubenswrapper[4962]: I0220 10:13:44.956527 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=20.574923872 podStartE2EDuration="34.956498348s" podCreationTimestamp="2026-02-20 10:13:10 +0000 UTC" firstStartedPulling="2026-02-20 10:13:28.589434424 +0000 UTC m=+1100.171906260" lastFinishedPulling="2026-02-20 10:13:42.97100887 +0000 UTC m=+1114.553480736" observedRunningTime="2026-02-20 10:13:44.95009829 +0000 UTC m=+1116.532570136" watchObservedRunningTime="2026-02-20 10:13:44.956498348 +0000 UTC m=+1116.538970204" Feb 20 10:13:45 crc kubenswrapper[4962]: I0220 10:13:45.263581 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-84f584987c-2gm7k"] Feb 20 10:13:45 crc kubenswrapper[4962]: E0220 10:13:45.264131 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4733c39-1a37-4a56-a731-88fcac6da1c0" containerName="ovn-config" Feb 20 10:13:45 crc kubenswrapper[4962]: I0220 10:13:45.264155 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4733c39-1a37-4a56-a731-88fcac6da1c0" containerName="ovn-config" Feb 20 10:13:45 crc kubenswrapper[4962]: I0220 10:13:45.264363 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4733c39-1a37-4a56-a731-88fcac6da1c0" containerName="ovn-config" Feb 20 10:13:45 crc kubenswrapper[4962]: I0220 10:13:45.265554 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84f584987c-2gm7k" Feb 20 10:13:45 crc kubenswrapper[4962]: I0220 10:13:45.267983 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Feb 20 10:13:45 crc kubenswrapper[4962]: I0220 10:13:45.287393 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84f584987c-2gm7k"] Feb 20 10:13:45 crc kubenswrapper[4962]: I0220 10:13:45.383084 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ht65m\" (UniqueName: \"kubernetes.io/projected/89a12a35-60fb-43fc-bd27-d7db10bc1aaa-kube-api-access-ht65m\") pod \"dnsmasq-dns-84f584987c-2gm7k\" (UID: \"89a12a35-60fb-43fc-bd27-d7db10bc1aaa\") " pod="openstack/dnsmasq-dns-84f584987c-2gm7k" Feb 20 10:13:45 crc kubenswrapper[4962]: I0220 10:13:45.383178 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/89a12a35-60fb-43fc-bd27-d7db10bc1aaa-dns-svc\") pod \"dnsmasq-dns-84f584987c-2gm7k\" (UID: \"89a12a35-60fb-43fc-bd27-d7db10bc1aaa\") " pod="openstack/dnsmasq-dns-84f584987c-2gm7k" Feb 20 10:13:45 crc kubenswrapper[4962]: I0220 10:13:45.383211 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89a12a35-60fb-43fc-bd27-d7db10bc1aaa-config\") pod \"dnsmasq-dns-84f584987c-2gm7k\" (UID: \"89a12a35-60fb-43fc-bd27-d7db10bc1aaa\") " pod="openstack/dnsmasq-dns-84f584987c-2gm7k" Feb 20 10:13:45 crc kubenswrapper[4962]: I0220 10:13:45.383900 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/89a12a35-60fb-43fc-bd27-d7db10bc1aaa-ovsdbserver-sb\") pod \"dnsmasq-dns-84f584987c-2gm7k\" (UID: \"89a12a35-60fb-43fc-bd27-d7db10bc1aaa\") " pod="openstack/dnsmasq-dns-84f584987c-2gm7k" Feb 20 10:13:45 crc kubenswrapper[4962]: I0220 10:13:45.384182 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/89a12a35-60fb-43fc-bd27-d7db10bc1aaa-ovsdbserver-nb\") pod \"dnsmasq-dns-84f584987c-2gm7k\" (UID: \"89a12a35-60fb-43fc-bd27-d7db10bc1aaa\") " pod="openstack/dnsmasq-dns-84f584987c-2gm7k" Feb 20 10:13:45 crc kubenswrapper[4962]: I0220 10:13:45.384462 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/89a12a35-60fb-43fc-bd27-d7db10bc1aaa-dns-swift-storage-0\") pod \"dnsmasq-dns-84f584987c-2gm7k\" (UID: \"89a12a35-60fb-43fc-bd27-d7db10bc1aaa\") " pod="openstack/dnsmasq-dns-84f584987c-2gm7k" Feb 20 10:13:45 crc kubenswrapper[4962]: I0220 10:13:45.485933 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/89a12a35-60fb-43fc-bd27-d7db10bc1aaa-ovsdbserver-sb\") pod \"dnsmasq-dns-84f584987c-2gm7k\" (UID: \"89a12a35-60fb-43fc-bd27-d7db10bc1aaa\") " pod="openstack/dnsmasq-dns-84f584987c-2gm7k" Feb 20 10:13:45 crc kubenswrapper[4962]: I0220 10:13:45.486004 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/89a12a35-60fb-43fc-bd27-d7db10bc1aaa-ovsdbserver-nb\") pod \"dnsmasq-dns-84f584987c-2gm7k\" (UID: 
\"89a12a35-60fb-43fc-bd27-d7db10bc1aaa\") " pod="openstack/dnsmasq-dns-84f584987c-2gm7k" Feb 20 10:13:45 crc kubenswrapper[4962]: I0220 10:13:45.486075 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/89a12a35-60fb-43fc-bd27-d7db10bc1aaa-dns-swift-storage-0\") pod \"dnsmasq-dns-84f584987c-2gm7k\" (UID: \"89a12a35-60fb-43fc-bd27-d7db10bc1aaa\") " pod="openstack/dnsmasq-dns-84f584987c-2gm7k" Feb 20 10:13:45 crc kubenswrapper[4962]: I0220 10:13:45.486124 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ht65m\" (UniqueName: \"kubernetes.io/projected/89a12a35-60fb-43fc-bd27-d7db10bc1aaa-kube-api-access-ht65m\") pod \"dnsmasq-dns-84f584987c-2gm7k\" (UID: \"89a12a35-60fb-43fc-bd27-d7db10bc1aaa\") " pod="openstack/dnsmasq-dns-84f584987c-2gm7k" Feb 20 10:13:45 crc kubenswrapper[4962]: I0220 10:13:45.486154 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/89a12a35-60fb-43fc-bd27-d7db10bc1aaa-dns-svc\") pod \"dnsmasq-dns-84f584987c-2gm7k\" (UID: \"89a12a35-60fb-43fc-bd27-d7db10bc1aaa\") " pod="openstack/dnsmasq-dns-84f584987c-2gm7k" Feb 20 10:13:45 crc kubenswrapper[4962]: I0220 10:13:45.486182 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89a12a35-60fb-43fc-bd27-d7db10bc1aaa-config\") pod \"dnsmasq-dns-84f584987c-2gm7k\" (UID: \"89a12a35-60fb-43fc-bd27-d7db10bc1aaa\") " pod="openstack/dnsmasq-dns-84f584987c-2gm7k" Feb 20 10:13:45 crc kubenswrapper[4962]: I0220 10:13:45.487123 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89a12a35-60fb-43fc-bd27-d7db10bc1aaa-config\") pod \"dnsmasq-dns-84f584987c-2gm7k\" (UID: \"89a12a35-60fb-43fc-bd27-d7db10bc1aaa\") " pod="openstack/dnsmasq-dns-84f584987c-2gm7k" Feb 20 10:13:45 crc kubenswrapper[4962]: I0220 10:13:45.487279 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/89a12a35-60fb-43fc-bd27-d7db10bc1aaa-dns-swift-storage-0\") pod \"dnsmasq-dns-84f584987c-2gm7k\" (UID: \"89a12a35-60fb-43fc-bd27-d7db10bc1aaa\") " pod="openstack/dnsmasq-dns-84f584987c-2gm7k" Feb 20 10:13:45 crc kubenswrapper[4962]: I0220 10:13:45.487745 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/89a12a35-60fb-43fc-bd27-d7db10bc1aaa-ovsdbserver-sb\") pod \"dnsmasq-dns-84f584987c-2gm7k\" (UID: \"89a12a35-60fb-43fc-bd27-d7db10bc1aaa\") " pod="openstack/dnsmasq-dns-84f584987c-2gm7k" Feb 20 10:13:45 crc kubenswrapper[4962]: I0220 10:13:45.487975 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/89a12a35-60fb-43fc-bd27-d7db10bc1aaa-ovsdbserver-nb\") pod \"dnsmasq-dns-84f584987c-2gm7k\" (UID: \"89a12a35-60fb-43fc-bd27-d7db10bc1aaa\") " pod="openstack/dnsmasq-dns-84f584987c-2gm7k" Feb 20 10:13:45 crc kubenswrapper[4962]: I0220 10:13:45.488007 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/89a12a35-60fb-43fc-bd27-d7db10bc1aaa-dns-svc\") pod \"dnsmasq-dns-84f584987c-2gm7k\" (UID: \"89a12a35-60fb-43fc-bd27-d7db10bc1aaa\") " pod="openstack/dnsmasq-dns-84f584987c-2gm7k" Feb 20 10:13:45 crc 
kubenswrapper[4962]: I0220 10:13:45.513686 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ht65m\" (UniqueName: \"kubernetes.io/projected/89a12a35-60fb-43fc-bd27-d7db10bc1aaa-kube-api-access-ht65m\") pod \"dnsmasq-dns-84f584987c-2gm7k\" (UID: \"89a12a35-60fb-43fc-bd27-d7db10bc1aaa\") " pod="openstack/dnsmasq-dns-84f584987c-2gm7k" Feb 20 10:13:45 crc kubenswrapper[4962]: I0220 10:13:45.586149 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84f584987c-2gm7k" Feb 20 10:13:46 crc kubenswrapper[4962]: I0220 10:13:46.116161 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84f584987c-2gm7k"] Feb 20 10:13:46 crc kubenswrapper[4962]: I0220 10:13:46.928718 4962 generic.go:334] "Generic (PLEG): container finished" podID="89a12a35-60fb-43fc-bd27-d7db10bc1aaa" containerID="584310e205b8e31a078417aa3e179930fe706052662212b3eaf969a39ad7c786" exitCode=0 Feb 20 10:13:46 crc kubenswrapper[4962]: I0220 10:13:46.928852 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84f584987c-2gm7k" event={"ID":"89a12a35-60fb-43fc-bd27-d7db10bc1aaa","Type":"ContainerDied","Data":"584310e205b8e31a078417aa3e179930fe706052662212b3eaf969a39ad7c786"} Feb 20 10:13:46 crc kubenswrapper[4962]: I0220 10:13:46.929516 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84f584987c-2gm7k" event={"ID":"89a12a35-60fb-43fc-bd27-d7db10bc1aaa","Type":"ContainerStarted","Data":"6231efbcfe405ddd4c89430a5b15578c84264d02a7e2d321eede6e43d22dfa60"} Feb 20 10:13:47 crc kubenswrapper[4962]: I0220 10:13:47.942896 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84f584987c-2gm7k" event={"ID":"89a12a35-60fb-43fc-bd27-d7db10bc1aaa","Type":"ContainerStarted","Data":"a007efc5f3ae3abc9fcc5e604d38f4026fe469792807759cefec37856ad251e4"} Feb 20 10:13:47 crc kubenswrapper[4962]: I0220 10:13:47.943407 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-84f584987c-2gm7k" Feb 20 10:13:47 crc kubenswrapper[4962]: I0220 10:13:47.973252 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-84f584987c-2gm7k" podStartSLOduration=2.973229483 podStartE2EDuration="2.973229483s" podCreationTimestamp="2026-02-20 10:13:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:13:47.967488255 +0000 UTC m=+1119.549960101" watchObservedRunningTime="2026-02-20 10:13:47.973229483 +0000 UTC m=+1119.555701319" Feb 20 10:13:48 crc kubenswrapper[4962]: I0220 10:13:48.953952 4962 generic.go:334] "Generic (PLEG): container finished" podID="3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1" containerID="ae355b88f320e93105b216772d0d1821b9792d4ee89d86649fd430b7ae19d59e" exitCode=0 Feb 20 10:13:48 crc kubenswrapper[4962]: I0220 10:13:48.954062 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-9gcrq" event={"ID":"3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1","Type":"ContainerDied","Data":"ae355b88f320e93105b216772d0d1821b9792d4ee89d86649fd430b7ae19d59e"} Feb 20 10:13:50 crc kubenswrapper[4962]: I0220 10:13:50.781310 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-9gcrq" Feb 20 10:13:50 crc kubenswrapper[4962]: I0220 10:13:50.816462 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1-config-data\") pod \"3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1\" (UID: \"3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1\") " Feb 20 10:13:50 crc kubenswrapper[4962]: I0220 10:13:50.816669 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dqljl\" (UniqueName: \"kubernetes.io/projected/3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1-kube-api-access-dqljl\") pod \"3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1\" (UID: \"3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1\") " Feb 20 10:13:50 crc kubenswrapper[4962]: I0220 10:13:50.816774 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1-db-sync-config-data\") pod \"3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1\" (UID: \"3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1\") " Feb 20 10:13:50 crc kubenswrapper[4962]: I0220 10:13:50.816804 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1-combined-ca-bundle\") pod \"3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1\" (UID: \"3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1\") " Feb 20 10:13:50 crc kubenswrapper[4962]: I0220 10:13:50.825731 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1" (UID: "3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:13:50 crc kubenswrapper[4962]: I0220 10:13:50.826046 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1-kube-api-access-dqljl" (OuterVolumeSpecName: "kube-api-access-dqljl") pod "3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1" (UID: "3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1"). InnerVolumeSpecName "kube-api-access-dqljl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:13:50 crc kubenswrapper[4962]: I0220 10:13:50.847554 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1" (UID: "3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:13:50 crc kubenswrapper[4962]: I0220 10:13:50.869470 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1-config-data" (OuterVolumeSpecName: "config-data") pod "3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1" (UID: "3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:13:50 crc kubenswrapper[4962]: I0220 10:13:50.919923 4962 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:50 crc kubenswrapper[4962]: I0220 10:13:50.919959 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:50 crc kubenswrapper[4962]: I0220 10:13:50.919974 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:50 crc kubenswrapper[4962]: I0220 10:13:50.919984 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dqljl\" (UniqueName: \"kubernetes.io/projected/3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1-kube-api-access-dqljl\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:50 crc kubenswrapper[4962]: I0220 10:13:50.981815 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-9gcrq" event={"ID":"3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1","Type":"ContainerDied","Data":"a8e14a05cffa52a7e3ad38a2be0cb8a03501b42d139fbf401ec8ecee6a2bd2a6"} Feb 20 10:13:50 crc kubenswrapper[4962]: I0220 10:13:50.981880 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a8e14a05cffa52a7e3ad38a2be0cb8a03501b42d139fbf401ec8ecee6a2bd2a6" Feb 20 10:13:50 crc kubenswrapper[4962]: I0220 10:13:50.981903 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-9gcrq" Feb 20 10:13:51 crc kubenswrapper[4962]: I0220 10:13:51.447200 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84f584987c-2gm7k"] Feb 20 10:13:51 crc kubenswrapper[4962]: I0220 10:13:51.454345 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-84f584987c-2gm7k" podUID="89a12a35-60fb-43fc-bd27-d7db10bc1aaa" containerName="dnsmasq-dns" containerID="cri-o://a007efc5f3ae3abc9fcc5e604d38f4026fe469792807759cefec37856ad251e4" gracePeriod=10 Feb 20 10:13:51 crc kubenswrapper[4962]: I0220 10:13:51.503960 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-69577ff67f-kvhqf"] Feb 20 10:13:51 crc kubenswrapper[4962]: E0220 10:13:51.504378 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1" containerName="glance-db-sync" Feb 20 10:13:51 crc kubenswrapper[4962]: I0220 10:13:51.504392 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1" containerName="glance-db-sync" Feb 20 10:13:51 crc kubenswrapper[4962]: I0220 10:13:51.504582 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1" containerName="glance-db-sync" Feb 20 10:13:51 crc kubenswrapper[4962]: I0220 10:13:51.505724 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-69577ff67f-kvhqf" Feb 20 10:13:51 crc kubenswrapper[4962]: I0220 10:13:51.529650 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-69577ff67f-kvhqf"] Feb 20 10:13:51 crc kubenswrapper[4962]: I0220 10:13:51.534189 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/90cdf678-dd6c-4f3b-a675-4803eddcfc44-dns-swift-storage-0\") pod \"dnsmasq-dns-69577ff67f-kvhqf\" (UID: \"90cdf678-dd6c-4f3b-a675-4803eddcfc44\") " pod="openstack/dnsmasq-dns-69577ff67f-kvhqf" Feb 20 10:13:51 crc kubenswrapper[4962]: I0220 10:13:51.534272 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/90cdf678-dd6c-4f3b-a675-4803eddcfc44-ovsdbserver-sb\") pod \"dnsmasq-dns-69577ff67f-kvhqf\" (UID: \"90cdf678-dd6c-4f3b-a675-4803eddcfc44\") " pod="openstack/dnsmasq-dns-69577ff67f-kvhqf" Feb 20 10:13:51 crc kubenswrapper[4962]: I0220 10:13:51.534311 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/90cdf678-dd6c-4f3b-a675-4803eddcfc44-ovsdbserver-nb\") pod \"dnsmasq-dns-69577ff67f-kvhqf\" (UID: \"90cdf678-dd6c-4f3b-a675-4803eddcfc44\") " pod="openstack/dnsmasq-dns-69577ff67f-kvhqf" Feb 20 10:13:51 crc kubenswrapper[4962]: I0220 10:13:51.534355 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfvtg\" (UniqueName: \"kubernetes.io/projected/90cdf678-dd6c-4f3b-a675-4803eddcfc44-kube-api-access-wfvtg\") pod \"dnsmasq-dns-69577ff67f-kvhqf\" (UID: \"90cdf678-dd6c-4f3b-a675-4803eddcfc44\") " pod="openstack/dnsmasq-dns-69577ff67f-kvhqf" Feb 20 10:13:51 crc kubenswrapper[4962]: I0220 10:13:51.534408 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90cdf678-dd6c-4f3b-a675-4803eddcfc44-config\") pod \"dnsmasq-dns-69577ff67f-kvhqf\" (UID: \"90cdf678-dd6c-4f3b-a675-4803eddcfc44\") " pod="openstack/dnsmasq-dns-69577ff67f-kvhqf" Feb 20 10:13:51 crc kubenswrapper[4962]: I0220 10:13:51.534432 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90cdf678-dd6c-4f3b-a675-4803eddcfc44-dns-svc\") pod \"dnsmasq-dns-69577ff67f-kvhqf\" (UID: \"90cdf678-dd6c-4f3b-a675-4803eddcfc44\") " pod="openstack/dnsmasq-dns-69577ff67f-kvhqf" Feb 20 10:13:51 crc kubenswrapper[4962]: I0220 10:13:51.636319 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfvtg\" (UniqueName: \"kubernetes.io/projected/90cdf678-dd6c-4f3b-a675-4803eddcfc44-kube-api-access-wfvtg\") pod \"dnsmasq-dns-69577ff67f-kvhqf\" (UID: \"90cdf678-dd6c-4f3b-a675-4803eddcfc44\") " pod="openstack/dnsmasq-dns-69577ff67f-kvhqf" Feb 20 10:13:51 crc kubenswrapper[4962]: I0220 10:13:51.636406 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90cdf678-dd6c-4f3b-a675-4803eddcfc44-config\") pod \"dnsmasq-dns-69577ff67f-kvhqf\" (UID: \"90cdf678-dd6c-4f3b-a675-4803eddcfc44\") " pod="openstack/dnsmasq-dns-69577ff67f-kvhqf" Feb 20 10:13:51 crc kubenswrapper[4962]: I0220 10:13:51.636434 4962 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90cdf678-dd6c-4f3b-a675-4803eddcfc44-dns-svc\") pod \"dnsmasq-dns-69577ff67f-kvhqf\" (UID: \"90cdf678-dd6c-4f3b-a675-4803eddcfc44\") " pod="openstack/dnsmasq-dns-69577ff67f-kvhqf" Feb 20 10:13:51 crc kubenswrapper[4962]: I0220 10:13:51.636479 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/90cdf678-dd6c-4f3b-a675-4803eddcfc44-dns-swift-storage-0\") pod \"dnsmasq-dns-69577ff67f-kvhqf\" (UID: \"90cdf678-dd6c-4f3b-a675-4803eddcfc44\") " pod="openstack/dnsmasq-dns-69577ff67f-kvhqf" Feb 20 10:13:51 crc kubenswrapper[4962]: I0220 10:13:51.636529 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/90cdf678-dd6c-4f3b-a675-4803eddcfc44-ovsdbserver-sb\") pod \"dnsmasq-dns-69577ff67f-kvhqf\" (UID: \"90cdf678-dd6c-4f3b-a675-4803eddcfc44\") " pod="openstack/dnsmasq-dns-69577ff67f-kvhqf" Feb 20 10:13:51 crc kubenswrapper[4962]: I0220 10:13:51.636556 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/90cdf678-dd6c-4f3b-a675-4803eddcfc44-ovsdbserver-nb\") pod \"dnsmasq-dns-69577ff67f-kvhqf\" (UID: \"90cdf678-dd6c-4f3b-a675-4803eddcfc44\") " pod="openstack/dnsmasq-dns-69577ff67f-kvhqf" Feb 20 10:13:51 crc kubenswrapper[4962]: I0220 10:13:51.637846 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/90cdf678-dd6c-4f3b-a675-4803eddcfc44-ovsdbserver-nb\") pod \"dnsmasq-dns-69577ff67f-kvhqf\" (UID: \"90cdf678-dd6c-4f3b-a675-4803eddcfc44\") " pod="openstack/dnsmasq-dns-69577ff67f-kvhqf" Feb 20 10:13:51 crc kubenswrapper[4962]: I0220 10:13:51.637915 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90cdf678-dd6c-4f3b-a675-4803eddcfc44-dns-svc\") pod \"dnsmasq-dns-69577ff67f-kvhqf\" (UID: \"90cdf678-dd6c-4f3b-a675-4803eddcfc44\") " pod="openstack/dnsmasq-dns-69577ff67f-kvhqf" Feb 20 10:13:51 crc kubenswrapper[4962]: I0220 10:13:51.637946 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/90cdf678-dd6c-4f3b-a675-4803eddcfc44-dns-swift-storage-0\") pod \"dnsmasq-dns-69577ff67f-kvhqf\" (UID: \"90cdf678-dd6c-4f3b-a675-4803eddcfc44\") " pod="openstack/dnsmasq-dns-69577ff67f-kvhqf" Feb 20 10:13:51 crc kubenswrapper[4962]: I0220 10:13:51.638041 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90cdf678-dd6c-4f3b-a675-4803eddcfc44-config\") pod \"dnsmasq-dns-69577ff67f-kvhqf\" (UID: \"90cdf678-dd6c-4f3b-a675-4803eddcfc44\") " pod="openstack/dnsmasq-dns-69577ff67f-kvhqf" Feb 20 10:13:51 crc kubenswrapper[4962]: I0220 10:13:51.638466 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/90cdf678-dd6c-4f3b-a675-4803eddcfc44-ovsdbserver-sb\") pod \"dnsmasq-dns-69577ff67f-kvhqf\" (UID: \"90cdf678-dd6c-4f3b-a675-4803eddcfc44\") " pod="openstack/dnsmasq-dns-69577ff67f-kvhqf" Feb 20 10:13:51 crc kubenswrapper[4962]: I0220 10:13:51.660858 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfvtg\" (UniqueName: 
\"kubernetes.io/projected/90cdf678-dd6c-4f3b-a675-4803eddcfc44-kube-api-access-wfvtg\") pod \"dnsmasq-dns-69577ff67f-kvhqf\" (UID: \"90cdf678-dd6c-4f3b-a675-4803eddcfc44\") " pod="openstack/dnsmasq-dns-69577ff67f-kvhqf" Feb 20 10:13:51 crc kubenswrapper[4962]: I0220 10:13:51.828828 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69577ff67f-kvhqf" Feb 20 10:13:52 crc kubenswrapper[4962]: I0220 10:13:52.004768 4962 generic.go:334] "Generic (PLEG): container finished" podID="89a12a35-60fb-43fc-bd27-d7db10bc1aaa" containerID="a007efc5f3ae3abc9fcc5e604d38f4026fe469792807759cefec37856ad251e4" exitCode=0 Feb 20 10:13:52 crc kubenswrapper[4962]: I0220 10:13:52.004955 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84f584987c-2gm7k" event={"ID":"89a12a35-60fb-43fc-bd27-d7db10bc1aaa","Type":"ContainerDied","Data":"a007efc5f3ae3abc9fcc5e604d38f4026fe469792807759cefec37856ad251e4"} Feb 20 10:13:52 crc kubenswrapper[4962]: I0220 10:13:52.814262 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-69577ff67f-kvhqf"] Feb 20 10:13:52 crc kubenswrapper[4962]: W0220 10:13:52.825454 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod90cdf678_dd6c_4f3b_a675_4803eddcfc44.slice/crio-83c0be5d3f4ec5fde60bc996f7952df21d1276689d0ec4493b4ba2dd90aa2879 WatchSource:0}: Error finding container 83c0be5d3f4ec5fde60bc996f7952df21d1276689d0ec4493b4ba2dd90aa2879: Status 404 returned error can't find the container with id 83c0be5d3f4ec5fde60bc996f7952df21d1276689d0ec4493b4ba2dd90aa2879 Feb 20 10:13:53 crc kubenswrapper[4962]: I0220 10:13:53.015449 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69577ff67f-kvhqf" event={"ID":"90cdf678-dd6c-4f3b-a675-4803eddcfc44","Type":"ContainerStarted","Data":"83c0be5d3f4ec5fde60bc996f7952df21d1276689d0ec4493b4ba2dd90aa2879"} Feb 20 10:13:53 crc kubenswrapper[4962]: I0220 10:13:53.505850 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84f584987c-2gm7k" Feb 20 10:13:53 crc kubenswrapper[4962]: I0220 10:13:53.608513 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/89a12a35-60fb-43fc-bd27-d7db10bc1aaa-dns-svc\") pod \"89a12a35-60fb-43fc-bd27-d7db10bc1aaa\" (UID: \"89a12a35-60fb-43fc-bd27-d7db10bc1aaa\") " Feb 20 10:13:53 crc kubenswrapper[4962]: I0220 10:13:53.608682 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89a12a35-60fb-43fc-bd27-d7db10bc1aaa-config\") pod \"89a12a35-60fb-43fc-bd27-d7db10bc1aaa\" (UID: \"89a12a35-60fb-43fc-bd27-d7db10bc1aaa\") " Feb 20 10:13:53 crc kubenswrapper[4962]: I0220 10:13:53.608754 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/89a12a35-60fb-43fc-bd27-d7db10bc1aaa-ovsdbserver-sb\") pod \"89a12a35-60fb-43fc-bd27-d7db10bc1aaa\" (UID: \"89a12a35-60fb-43fc-bd27-d7db10bc1aaa\") " Feb 20 10:13:53 crc kubenswrapper[4962]: I0220 10:13:53.608836 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ht65m\" (UniqueName: \"kubernetes.io/projected/89a12a35-60fb-43fc-bd27-d7db10bc1aaa-kube-api-access-ht65m\") pod \"89a12a35-60fb-43fc-bd27-d7db10bc1aaa\" (UID: \"89a12a35-60fb-43fc-bd27-d7db10bc1aaa\") " Feb 20 10:13:53 crc kubenswrapper[4962]: I0220 10:13:53.608857 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/89a12a35-60fb-43fc-bd27-d7db10bc1aaa-ovsdbserver-nb\") pod \"89a12a35-60fb-43fc-bd27-d7db10bc1aaa\" (UID: \"89a12a35-60fb-43fc-bd27-d7db10bc1aaa\") " Feb 20 10:13:53 crc kubenswrapper[4962]: I0220 10:13:53.608886 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/89a12a35-60fb-43fc-bd27-d7db10bc1aaa-dns-swift-storage-0\") pod \"89a12a35-60fb-43fc-bd27-d7db10bc1aaa\" (UID: \"89a12a35-60fb-43fc-bd27-d7db10bc1aaa\") " Feb 20 10:13:53 crc kubenswrapper[4962]: I0220 10:13:53.618402 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89a12a35-60fb-43fc-bd27-d7db10bc1aaa-kube-api-access-ht65m" (OuterVolumeSpecName: "kube-api-access-ht65m") pod "89a12a35-60fb-43fc-bd27-d7db10bc1aaa" (UID: "89a12a35-60fb-43fc-bd27-d7db10bc1aaa"). InnerVolumeSpecName "kube-api-access-ht65m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:13:53 crc kubenswrapper[4962]: I0220 10:13:53.667764 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89a12a35-60fb-43fc-bd27-d7db10bc1aaa-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "89a12a35-60fb-43fc-bd27-d7db10bc1aaa" (UID: "89a12a35-60fb-43fc-bd27-d7db10bc1aaa"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:13:53 crc kubenswrapper[4962]: I0220 10:13:53.670700 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89a12a35-60fb-43fc-bd27-d7db10bc1aaa-config" (OuterVolumeSpecName: "config") pod "89a12a35-60fb-43fc-bd27-d7db10bc1aaa" (UID: "89a12a35-60fb-43fc-bd27-d7db10bc1aaa"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:13:53 crc kubenswrapper[4962]: I0220 10:13:53.674808 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89a12a35-60fb-43fc-bd27-d7db10bc1aaa-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "89a12a35-60fb-43fc-bd27-d7db10bc1aaa" (UID: "89a12a35-60fb-43fc-bd27-d7db10bc1aaa"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:13:53 crc kubenswrapper[4962]: I0220 10:13:53.678415 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89a12a35-60fb-43fc-bd27-d7db10bc1aaa-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "89a12a35-60fb-43fc-bd27-d7db10bc1aaa" (UID: "89a12a35-60fb-43fc-bd27-d7db10bc1aaa"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:13:53 crc kubenswrapper[4962]: I0220 10:13:53.703074 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89a12a35-60fb-43fc-bd27-d7db10bc1aaa-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "89a12a35-60fb-43fc-bd27-d7db10bc1aaa" (UID: "89a12a35-60fb-43fc-bd27-d7db10bc1aaa"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:13:53 crc kubenswrapper[4962]: I0220 10:13:53.710023 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89a12a35-60fb-43fc-bd27-d7db10bc1aaa-config\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:53 crc kubenswrapper[4962]: I0220 10:13:53.710047 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/89a12a35-60fb-43fc-bd27-d7db10bc1aaa-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:53 crc kubenswrapper[4962]: I0220 10:13:53.710060 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ht65m\" (UniqueName: \"kubernetes.io/projected/89a12a35-60fb-43fc-bd27-d7db10bc1aaa-kube-api-access-ht65m\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:53 crc kubenswrapper[4962]: I0220 10:13:53.710071 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/89a12a35-60fb-43fc-bd27-d7db10bc1aaa-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:53 crc kubenswrapper[4962]: I0220 10:13:53.710081 4962 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/89a12a35-60fb-43fc-bd27-d7db10bc1aaa-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:53 crc kubenswrapper[4962]: I0220 10:13:53.710091 4962 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/89a12a35-60fb-43fc-bd27-d7db10bc1aaa-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:54 crc kubenswrapper[4962]: I0220 10:13:54.029723 4962 generic.go:334] "Generic (PLEG): container finished" podID="90cdf678-dd6c-4f3b-a675-4803eddcfc44" containerID="c3f233006bdf1d16d8946733067213908be75ed885abe76b0ea0e53fac4b17ed" exitCode=0 Feb 20 10:13:54 crc kubenswrapper[4962]: I0220 10:13:54.029858 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69577ff67f-kvhqf" event={"ID":"90cdf678-dd6c-4f3b-a675-4803eddcfc44","Type":"ContainerDied","Data":"c3f233006bdf1d16d8946733067213908be75ed885abe76b0ea0e53fac4b17ed"} Feb 20 
10:13:54 crc kubenswrapper[4962]: I0220 10:13:54.032680 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84f584987c-2gm7k" event={"ID":"89a12a35-60fb-43fc-bd27-d7db10bc1aaa","Type":"ContainerDied","Data":"6231efbcfe405ddd4c89430a5b15578c84264d02a7e2d321eede6e43d22dfa60"} Feb 20 10:13:54 crc kubenswrapper[4962]: I0220 10:13:54.032763 4962 scope.go:117] "RemoveContainer" containerID="a007efc5f3ae3abc9fcc5e604d38f4026fe469792807759cefec37856ad251e4" Feb 20 10:13:54 crc kubenswrapper[4962]: I0220 10:13:54.032768 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84f584987c-2gm7k" Feb 20 10:13:54 crc kubenswrapper[4962]: I0220 10:13:54.221297 4962 scope.go:117] "RemoveContainer" containerID="584310e205b8e31a078417aa3e179930fe706052662212b3eaf969a39ad7c786" Feb 20 10:13:54 crc kubenswrapper[4962]: I0220 10:13:54.257033 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84f584987c-2gm7k"] Feb 20 10:13:54 crc kubenswrapper[4962]: I0220 10:13:54.264406 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-84f584987c-2gm7k"] Feb 20 10:13:54 crc kubenswrapper[4962]: I0220 10:13:54.578040 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Feb 20 10:13:54 crc kubenswrapper[4962]: I0220 10:13:54.921894 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-h5ptn"] Feb 20 10:13:54 crc kubenswrapper[4962]: E0220 10:13:54.922274 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89a12a35-60fb-43fc-bd27-d7db10bc1aaa" containerName="dnsmasq-dns" Feb 20 10:13:54 crc kubenswrapper[4962]: I0220 10:13:54.922293 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="89a12a35-60fb-43fc-bd27-d7db10bc1aaa" containerName="dnsmasq-dns" Feb 20 10:13:54 crc kubenswrapper[4962]: E0220 10:13:54.922328 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89a12a35-60fb-43fc-bd27-d7db10bc1aaa" containerName="init" Feb 20 10:13:54 crc kubenswrapper[4962]: I0220 10:13:54.922336 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="89a12a35-60fb-43fc-bd27-d7db10bc1aaa" containerName="init" Feb 20 10:13:54 crc kubenswrapper[4962]: I0220 10:13:54.922499 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="89a12a35-60fb-43fc-bd27-d7db10bc1aaa" containerName="dnsmasq-dns" Feb 20 10:13:54 crc kubenswrapper[4962]: I0220 10:13:54.923237 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-h5ptn" Feb 20 10:13:54 crc kubenswrapper[4962]: I0220 10:13:54.948972 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-h5ptn"] Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.038391 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e761565e-55de-43bc-b82d-95b776652b5c-operator-scripts\") pod \"cinder-db-create-h5ptn\" (UID: \"e761565e-55de-43bc-b82d-95b776652b5c\") " pod="openstack/cinder-db-create-h5ptn" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.038608 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7987d\" (UniqueName: \"kubernetes.io/projected/e761565e-55de-43bc-b82d-95b776652b5c-kube-api-access-7987d\") pod \"cinder-db-create-h5ptn\" (UID: \"e761565e-55de-43bc-b82d-95b776652b5c\") " pod="openstack/cinder-db-create-h5ptn" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.039537 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-11b3-account-create-update-x5n92"] Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.040700 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-11b3-account-create-update-x5n92" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.042821 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69577ff67f-kvhqf" event={"ID":"90cdf678-dd6c-4f3b-a675-4803eddcfc44","Type":"ContainerStarted","Data":"e8217d8b6999873bb7898fa52e25d272e12c4c8cfaefad4b4afe6a4bca8b3bab"} Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.043325 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-69577ff67f-kvhqf" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.051947 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.059212 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-11b3-account-create-update-x5n92"] Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.096031 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-69577ff67f-kvhqf" podStartSLOduration=4.096005524 podStartE2EDuration="4.096005524s" podCreationTimestamp="2026-02-20 10:13:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:13:55.094353442 +0000 UTC m=+1126.676825288" watchObservedRunningTime="2026-02-20 10:13:55.096005524 +0000 UTC m=+1126.678477370" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.140838 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e761565e-55de-43bc-b82d-95b776652b5c-operator-scripts\") pod \"cinder-db-create-h5ptn\" (UID: \"e761565e-55de-43bc-b82d-95b776652b5c\") " pod="openstack/cinder-db-create-h5ptn" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.141015 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbr45\" (UniqueName: \"kubernetes.io/projected/21296df9-6e67-4427-959d-8d67bfd1393b-kube-api-access-nbr45\") pod \"cinder-11b3-account-create-update-x5n92\" (UID: 
\"21296df9-6e67-4427-959d-8d67bfd1393b\") " pod="openstack/cinder-11b3-account-create-update-x5n92" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.141059 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/21296df9-6e67-4427-959d-8d67bfd1393b-operator-scripts\") pod \"cinder-11b3-account-create-update-x5n92\" (UID: \"21296df9-6e67-4427-959d-8d67bfd1393b\") " pod="openstack/cinder-11b3-account-create-update-x5n92" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.141088 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7987d\" (UniqueName: \"kubernetes.io/projected/e761565e-55de-43bc-b82d-95b776652b5c-kube-api-access-7987d\") pod \"cinder-db-create-h5ptn\" (UID: \"e761565e-55de-43bc-b82d-95b776652b5c\") " pod="openstack/cinder-db-create-h5ptn" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.142815 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e761565e-55de-43bc-b82d-95b776652b5c-operator-scripts\") pod \"cinder-db-create-h5ptn\" (UID: \"e761565e-55de-43bc-b82d-95b776652b5c\") " pod="openstack/cinder-db-create-h5ptn" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.154089 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89a12a35-60fb-43fc-bd27-d7db10bc1aaa" path="/var/lib/kubelet/pods/89a12a35-60fb-43fc-bd27-d7db10bc1aaa/volumes" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.160947 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7987d\" (UniqueName: \"kubernetes.io/projected/e761565e-55de-43bc-b82d-95b776652b5c-kube-api-access-7987d\") pod \"cinder-db-create-h5ptn\" (UID: \"e761565e-55de-43bc-b82d-95b776652b5c\") " pod="openstack/cinder-db-create-h5ptn" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.242759 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbr45\" (UniqueName: \"kubernetes.io/projected/21296df9-6e67-4427-959d-8d67bfd1393b-kube-api-access-nbr45\") pod \"cinder-11b3-account-create-update-x5n92\" (UID: \"21296df9-6e67-4427-959d-8d67bfd1393b\") " pod="openstack/cinder-11b3-account-create-update-x5n92" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.242849 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/21296df9-6e67-4427-959d-8d67bfd1393b-operator-scripts\") pod \"cinder-11b3-account-create-update-x5n92\" (UID: \"21296df9-6e67-4427-959d-8d67bfd1393b\") " pod="openstack/cinder-11b3-account-create-update-x5n92" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.243943 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/21296df9-6e67-4427-959d-8d67bfd1393b-operator-scripts\") pod \"cinder-11b3-account-create-update-x5n92\" (UID: \"21296df9-6e67-4427-959d-8d67bfd1393b\") " pod="openstack/cinder-11b3-account-create-update-x5n92" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.248921 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-h5ptn" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.272345 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbr45\" (UniqueName: \"kubernetes.io/projected/21296df9-6e67-4427-959d-8d67bfd1393b-kube-api-access-nbr45\") pod \"cinder-11b3-account-create-update-x5n92\" (UID: \"21296df9-6e67-4427-959d-8d67bfd1393b\") " pod="openstack/cinder-11b3-account-create-update-x5n92" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.284068 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-m26vd"] Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.287587 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-m26vd" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.290635 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.291400 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-76ldt" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.292265 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.292444 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.301402 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-m26vd"] Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.329826 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.344796 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d2e7f05-f1f0-4619-ae07-0a7b93ad6408-config-data\") pod \"keystone-db-sync-m26vd\" (UID: \"2d2e7f05-f1f0-4619-ae07-0a7b93ad6408\") " pod="openstack/keystone-db-sync-m26vd" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.344839 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmgqr\" (UniqueName: \"kubernetes.io/projected/2d2e7f05-f1f0-4619-ae07-0a7b93ad6408-kube-api-access-qmgqr\") pod \"keystone-db-sync-m26vd\" (UID: \"2d2e7f05-f1f0-4619-ae07-0a7b93ad6408\") " pod="openstack/keystone-db-sync-m26vd" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.344967 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d2e7f05-f1f0-4619-ae07-0a7b93ad6408-combined-ca-bundle\") pod \"keystone-db-sync-m26vd\" (UID: \"2d2e7f05-f1f0-4619-ae07-0a7b93ad6408\") " pod="openstack/keystone-db-sync-m26vd" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.358165 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-11b3-account-create-update-x5n92" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.359332 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-758kd"] Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.360955 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-758kd" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.411763 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-758kd"] Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.473443 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d2e7f05-f1f0-4619-ae07-0a7b93ad6408-combined-ca-bundle\") pod \"keystone-db-sync-m26vd\" (UID: \"2d2e7f05-f1f0-4619-ae07-0a7b93ad6408\") " pod="openstack/keystone-db-sync-m26vd" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.473844 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d2e7f05-f1f0-4619-ae07-0a7b93ad6408-config-data\") pod \"keystone-db-sync-m26vd\" (UID: \"2d2e7f05-f1f0-4619-ae07-0a7b93ad6408\") " pod="openstack/keystone-db-sync-m26vd" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.473871 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmgqr\" (UniqueName: \"kubernetes.io/projected/2d2e7f05-f1f0-4619-ae07-0a7b93ad6408-kube-api-access-qmgqr\") pod \"keystone-db-sync-m26vd\" (UID: \"2d2e7f05-f1f0-4619-ae07-0a7b93ad6408\") " pod="openstack/keystone-db-sync-m26vd" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.489447 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-c46d-account-create-update-44g6w"] Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.491448 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d2e7f05-f1f0-4619-ae07-0a7b93ad6408-config-data\") pod \"keystone-db-sync-m26vd\" (UID: \"2d2e7f05-f1f0-4619-ae07-0a7b93ad6408\") " pod="openstack/keystone-db-sync-m26vd" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.493848 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d2e7f05-f1f0-4619-ae07-0a7b93ad6408-combined-ca-bundle\") pod \"keystone-db-sync-m26vd\" (UID: \"2d2e7f05-f1f0-4619-ae07-0a7b93ad6408\") " pod="openstack/keystone-db-sync-m26vd" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.503869 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-c46d-account-create-update-44g6w" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.513155 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.526361 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmgqr\" (UniqueName: \"kubernetes.io/projected/2d2e7f05-f1f0-4619-ae07-0a7b93ad6408-kube-api-access-qmgqr\") pod \"keystone-db-sync-m26vd\" (UID: \"2d2e7f05-f1f0-4619-ae07-0a7b93ad6408\") " pod="openstack/keystone-db-sync-m26vd" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.528868 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-c46d-account-create-update-44g6w"] Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.542051 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-4hwp2"] Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.549948 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-4hwp2"] Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.550175 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-4hwp2" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.577314 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/684fc9d7-94f0-418a-b059-e5519e6cd316-operator-scripts\") pod \"neutron-db-create-4hwp2\" (UID: \"684fc9d7-94f0-418a-b059-e5519e6cd316\") " pod="openstack/neutron-db-create-4hwp2" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.577378 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmmgk\" (UniqueName: \"kubernetes.io/projected/684fc9d7-94f0-418a-b059-e5519e6cd316-kube-api-access-lmmgk\") pod \"neutron-db-create-4hwp2\" (UID: \"684fc9d7-94f0-418a-b059-e5519e6cd316\") " pod="openstack/neutron-db-create-4hwp2" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.577497 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxrtr\" (UniqueName: \"kubernetes.io/projected/a30d4c4f-6cfe-4f24-9ee1-d2285edfad3e-kube-api-access-hxrtr\") pod \"barbican-db-create-758kd\" (UID: \"a30d4c4f-6cfe-4f24-9ee1-d2285edfad3e\") " pod="openstack/barbican-db-create-758kd" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.577554 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e2005e0-31d4-408f-8c66-187a6dd37bcd-operator-scripts\") pod \"barbican-c46d-account-create-update-44g6w\" (UID: \"7e2005e0-31d4-408f-8c66-187a6dd37bcd\") " pod="openstack/barbican-c46d-account-create-update-44g6w" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.577582 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bp8bw\" (UniqueName: \"kubernetes.io/projected/7e2005e0-31d4-408f-8c66-187a6dd37bcd-kube-api-access-bp8bw\") pod \"barbican-c46d-account-create-update-44g6w\" (UID: \"7e2005e0-31d4-408f-8c66-187a6dd37bcd\") " pod="openstack/barbican-c46d-account-create-update-44g6w" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.577703 4962 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a30d4c4f-6cfe-4f24-9ee1-d2285edfad3e-operator-scripts\") pod \"barbican-db-create-758kd\" (UID: \"a30d4c4f-6cfe-4f24-9ee1-d2285edfad3e\") " pod="openstack/barbican-db-create-758kd" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.644234 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-695f-account-create-update-22t44"] Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.645452 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-695f-account-create-update-22t44" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.651844 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.670376 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-695f-account-create-update-22t44"] Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.674970 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-m26vd" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.683770 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a30d4c4f-6cfe-4f24-9ee1-d2285edfad3e-operator-scripts\") pod \"barbican-db-create-758kd\" (UID: \"a30d4c4f-6cfe-4f24-9ee1-d2285edfad3e\") " pod="openstack/barbican-db-create-758kd" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.683812 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4feedd65-778f-471c-a2bf-23af2e459685-operator-scripts\") pod \"neutron-695f-account-create-update-22t44\" (UID: \"4feedd65-778f-471c-a2bf-23af2e459685\") " pod="openstack/neutron-695f-account-create-update-22t44" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.683854 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/684fc9d7-94f0-418a-b059-e5519e6cd316-operator-scripts\") pod \"neutron-db-create-4hwp2\" (UID: \"684fc9d7-94f0-418a-b059-e5519e6cd316\") " pod="openstack/neutron-db-create-4hwp2" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.683879 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmmgk\" (UniqueName: \"kubernetes.io/projected/684fc9d7-94f0-418a-b059-e5519e6cd316-kube-api-access-lmmgk\") pod \"neutron-db-create-4hwp2\" (UID: \"684fc9d7-94f0-418a-b059-e5519e6cd316\") " pod="openstack/neutron-db-create-4hwp2" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.683905 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvkbg\" (UniqueName: \"kubernetes.io/projected/4feedd65-778f-471c-a2bf-23af2e459685-kube-api-access-rvkbg\") pod \"neutron-695f-account-create-update-22t44\" (UID: \"4feedd65-778f-471c-a2bf-23af2e459685\") " pod="openstack/neutron-695f-account-create-update-22t44" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.683961 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxrtr\" (UniqueName: \"kubernetes.io/projected/a30d4c4f-6cfe-4f24-9ee1-d2285edfad3e-kube-api-access-hxrtr\") pod \"barbican-db-create-758kd\" (UID: 
\"a30d4c4f-6cfe-4f24-9ee1-d2285edfad3e\") " pod="openstack/barbican-db-create-758kd" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.683992 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bp8bw\" (UniqueName: \"kubernetes.io/projected/7e2005e0-31d4-408f-8c66-187a6dd37bcd-kube-api-access-bp8bw\") pod \"barbican-c46d-account-create-update-44g6w\" (UID: \"7e2005e0-31d4-408f-8c66-187a6dd37bcd\") " pod="openstack/barbican-c46d-account-create-update-44g6w" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.684013 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e2005e0-31d4-408f-8c66-187a6dd37bcd-operator-scripts\") pod \"barbican-c46d-account-create-update-44g6w\" (UID: \"7e2005e0-31d4-408f-8c66-187a6dd37bcd\") " pod="openstack/barbican-c46d-account-create-update-44g6w" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.684724 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e2005e0-31d4-408f-8c66-187a6dd37bcd-operator-scripts\") pod \"barbican-c46d-account-create-update-44g6w\" (UID: \"7e2005e0-31d4-408f-8c66-187a6dd37bcd\") " pod="openstack/barbican-c46d-account-create-update-44g6w" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.685216 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a30d4c4f-6cfe-4f24-9ee1-d2285edfad3e-operator-scripts\") pod \"barbican-db-create-758kd\" (UID: \"a30d4c4f-6cfe-4f24-9ee1-d2285edfad3e\") " pod="openstack/barbican-db-create-758kd" Feb 20 10:13:55 crc kubenswrapper[4962]: I0220 10:13:55.685692 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/684fc9d7-94f0-418a-b059-e5519e6cd316-operator-scripts\") pod \"neutron-db-create-4hwp2\" (UID: \"684fc9d7-94f0-418a-b059-e5519e6cd316\") " pod="openstack/neutron-db-create-4hwp2" Feb 20 10:13:56 crc kubenswrapper[4962]: I0220 10:13:55.716912 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bp8bw\" (UniqueName: \"kubernetes.io/projected/7e2005e0-31d4-408f-8c66-187a6dd37bcd-kube-api-access-bp8bw\") pod \"barbican-c46d-account-create-update-44g6w\" (UID: \"7e2005e0-31d4-408f-8c66-187a6dd37bcd\") " pod="openstack/barbican-c46d-account-create-update-44g6w" Feb 20 10:13:56 crc kubenswrapper[4962]: I0220 10:13:55.724813 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxrtr\" (UniqueName: \"kubernetes.io/projected/a30d4c4f-6cfe-4f24-9ee1-d2285edfad3e-kube-api-access-hxrtr\") pod \"barbican-db-create-758kd\" (UID: \"a30d4c4f-6cfe-4f24-9ee1-d2285edfad3e\") " pod="openstack/barbican-db-create-758kd" Feb 20 10:13:56 crc kubenswrapper[4962]: I0220 10:13:55.733116 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmmgk\" (UniqueName: \"kubernetes.io/projected/684fc9d7-94f0-418a-b059-e5519e6cd316-kube-api-access-lmmgk\") pod \"neutron-db-create-4hwp2\" (UID: \"684fc9d7-94f0-418a-b059-e5519e6cd316\") " pod="openstack/neutron-db-create-4hwp2" Feb 20 10:13:56 crc kubenswrapper[4962]: I0220 10:13:55.765052 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-758kd" Feb 20 10:13:56 crc kubenswrapper[4962]: I0220 10:13:55.785912 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4feedd65-778f-471c-a2bf-23af2e459685-operator-scripts\") pod \"neutron-695f-account-create-update-22t44\" (UID: \"4feedd65-778f-471c-a2bf-23af2e459685\") " pod="openstack/neutron-695f-account-create-update-22t44" Feb 20 10:13:56 crc kubenswrapper[4962]: I0220 10:13:55.785986 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvkbg\" (UniqueName: \"kubernetes.io/projected/4feedd65-778f-471c-a2bf-23af2e459685-kube-api-access-rvkbg\") pod \"neutron-695f-account-create-update-22t44\" (UID: \"4feedd65-778f-471c-a2bf-23af2e459685\") " pod="openstack/neutron-695f-account-create-update-22t44" Feb 20 10:13:56 crc kubenswrapper[4962]: I0220 10:13:55.787035 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4feedd65-778f-471c-a2bf-23af2e459685-operator-scripts\") pod \"neutron-695f-account-create-update-22t44\" (UID: \"4feedd65-778f-471c-a2bf-23af2e459685\") " pod="openstack/neutron-695f-account-create-update-22t44" Feb 20 10:13:56 crc kubenswrapper[4962]: I0220 10:13:55.842136 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-c46d-account-create-update-44g6w" Feb 20 10:13:56 crc kubenswrapper[4962]: I0220 10:13:55.843503 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvkbg\" (UniqueName: \"kubernetes.io/projected/4feedd65-778f-471c-a2bf-23af2e459685-kube-api-access-rvkbg\") pod \"neutron-695f-account-create-update-22t44\" (UID: \"4feedd65-778f-471c-a2bf-23af2e459685\") " pod="openstack/neutron-695f-account-create-update-22t44" Feb 20 10:13:56 crc kubenswrapper[4962]: I0220 10:13:55.884291 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-4hwp2" Feb 20 10:13:56 crc kubenswrapper[4962]: I0220 10:13:55.971168 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-695f-account-create-update-22t44" Feb 20 10:13:56 crc kubenswrapper[4962]: I0220 10:13:56.130856 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-h5ptn"] Feb 20 10:13:56 crc kubenswrapper[4962]: W0220 10:13:56.179696 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode761565e_55de_43bc_b82d_95b776652b5c.slice/crio-dfa515192511c87c5e462757408b13a73befcd3ac84bc3f4e604a1d6f4fb18ef WatchSource:0}: Error finding container dfa515192511c87c5e462757408b13a73befcd3ac84bc3f4e604a1d6f4fb18ef: Status 404 returned error can't find the container with id dfa515192511c87c5e462757408b13a73befcd3ac84bc3f4e604a1d6f4fb18ef Feb 20 10:13:56 crc kubenswrapper[4962]: I0220 10:13:56.936723 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-c46d-account-create-update-44g6w"] Feb 20 10:13:56 crc kubenswrapper[4962]: I0220 10:13:56.954945 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-m26vd"] Feb 20 10:13:57 crc kubenswrapper[4962]: I0220 10:13:57.127531 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-11b3-account-create-update-x5n92"] Feb 20 10:13:57 crc kubenswrapper[4962]: W0220 10:13:57.145809 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod21296df9_6e67_4427_959d_8d67bfd1393b.slice/crio-afeffb41d4626b6fc252962c171fdb95b3654b10fe283dd82ef1ad599b6a0dc3 WatchSource:0}: Error finding container afeffb41d4626b6fc252962c171fdb95b3654b10fe283dd82ef1ad599b6a0dc3: Status 404 returned error can't find the container with id afeffb41d4626b6fc252962c171fdb95b3654b10fe283dd82ef1ad599b6a0dc3 Feb 20 10:13:57 crc kubenswrapper[4962]: I0220 10:13:57.158370 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-695f-account-create-update-22t44"] Feb 20 10:13:57 crc kubenswrapper[4962]: I0220 10:13:57.173051 4962 generic.go:334] "Generic (PLEG): container finished" podID="e761565e-55de-43bc-b82d-95b776652b5c" containerID="53831e942d8d69707dcfe40655e43c5762a4d492f07b1c79ed7f413953ec5f61" exitCode=0 Feb 20 10:13:57 crc kubenswrapper[4962]: I0220 10:13:57.173119 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-h5ptn" event={"ID":"e761565e-55de-43bc-b82d-95b776652b5c","Type":"ContainerDied","Data":"53831e942d8d69707dcfe40655e43c5762a4d492f07b1c79ed7f413953ec5f61"} Feb 20 10:13:57 crc kubenswrapper[4962]: I0220 10:13:57.173195 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-h5ptn" event={"ID":"e761565e-55de-43bc-b82d-95b776652b5c","Type":"ContainerStarted","Data":"dfa515192511c87c5e462757408b13a73befcd3ac84bc3f4e604a1d6f4fb18ef"} Feb 20 10:13:57 crc kubenswrapper[4962]: I0220 10:13:57.180150 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-4hwp2"] Feb 20 10:13:57 crc kubenswrapper[4962]: I0220 10:13:57.201205 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-m26vd" event={"ID":"2d2e7f05-f1f0-4619-ae07-0a7b93ad6408","Type":"ContainerStarted","Data":"d9fb2f47a09b42f7a4e90c5cb0cc07c8e1190c6442bbdd0ca0c2a2429a245afe"} Feb 20 10:13:57 crc kubenswrapper[4962]: I0220 10:13:57.218371 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-c46d-account-create-update-44g6w" 
event={"ID":"7e2005e0-31d4-408f-8c66-187a6dd37bcd","Type":"ContainerStarted","Data":"d1ace52df18655ab33bfdec0b202a45aaa09716c2023e587cd508d5e0ef9db45"} Feb 20 10:13:57 crc kubenswrapper[4962]: I0220 10:13:57.260639 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-758kd"] Feb 20 10:13:58 crc kubenswrapper[4962]: I0220 10:13:58.227287 4962 generic.go:334] "Generic (PLEG): container finished" podID="7e2005e0-31d4-408f-8c66-187a6dd37bcd" containerID="03ab33469ea979640d7188e1c0dc68dd1548a99d601929f7b4e160bee72396f3" exitCode=0 Feb 20 10:13:58 crc kubenswrapper[4962]: I0220 10:13:58.227623 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-c46d-account-create-update-44g6w" event={"ID":"7e2005e0-31d4-408f-8c66-187a6dd37bcd","Type":"ContainerDied","Data":"03ab33469ea979640d7188e1c0dc68dd1548a99d601929f7b4e160bee72396f3"} Feb 20 10:13:58 crc kubenswrapper[4962]: I0220 10:13:58.229417 4962 generic.go:334] "Generic (PLEG): container finished" podID="21296df9-6e67-4427-959d-8d67bfd1393b" containerID="2c027b22cf0ba460d458ecf5143a855bd6cabc995b34bcff27678d1a95ac71b9" exitCode=0 Feb 20 10:13:58 crc kubenswrapper[4962]: I0220 10:13:58.229456 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-11b3-account-create-update-x5n92" event={"ID":"21296df9-6e67-4427-959d-8d67bfd1393b","Type":"ContainerDied","Data":"2c027b22cf0ba460d458ecf5143a855bd6cabc995b34bcff27678d1a95ac71b9"} Feb 20 10:13:58 crc kubenswrapper[4962]: I0220 10:13:58.229472 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-11b3-account-create-update-x5n92" event={"ID":"21296df9-6e67-4427-959d-8d67bfd1393b","Type":"ContainerStarted","Data":"afeffb41d4626b6fc252962c171fdb95b3654b10fe283dd82ef1ad599b6a0dc3"} Feb 20 10:13:58 crc kubenswrapper[4962]: I0220 10:13:58.231356 4962 generic.go:334] "Generic (PLEG): container finished" podID="684fc9d7-94f0-418a-b059-e5519e6cd316" containerID="7fd77ce11ed465ec4237e46a1c362e414960d4a8e3a2e89e44d3a98f1d109ea9" exitCode=0 Feb 20 10:13:58 crc kubenswrapper[4962]: I0220 10:13:58.231392 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-4hwp2" event={"ID":"684fc9d7-94f0-418a-b059-e5519e6cd316","Type":"ContainerDied","Data":"7fd77ce11ed465ec4237e46a1c362e414960d4a8e3a2e89e44d3a98f1d109ea9"} Feb 20 10:13:58 crc kubenswrapper[4962]: I0220 10:13:58.231408 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-4hwp2" event={"ID":"684fc9d7-94f0-418a-b059-e5519e6cd316","Type":"ContainerStarted","Data":"f577b4be85d2e6d0328c909c0a4b5923c1ddc7a57c38be3bb1161b8b028e1173"} Feb 20 10:13:58 crc kubenswrapper[4962]: I0220 10:13:58.232929 4962 generic.go:334] "Generic (PLEG): container finished" podID="4feedd65-778f-471c-a2bf-23af2e459685" containerID="5a9782006ca96cee05b8576db8cf67f09117b6ff20027f1e9a751d12df45c5f2" exitCode=0 Feb 20 10:13:58 crc kubenswrapper[4962]: I0220 10:13:58.232980 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-695f-account-create-update-22t44" event={"ID":"4feedd65-778f-471c-a2bf-23af2e459685","Type":"ContainerDied","Data":"5a9782006ca96cee05b8576db8cf67f09117b6ff20027f1e9a751d12df45c5f2"} Feb 20 10:13:58 crc kubenswrapper[4962]: I0220 10:13:58.232996 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-695f-account-create-update-22t44" 
event={"ID":"4feedd65-778f-471c-a2bf-23af2e459685","Type":"ContainerStarted","Data":"b746d0ff7cb0abb3cce635c6667ca98effdafd59e40d357fc879d7e372ceb588"} Feb 20 10:13:58 crc kubenswrapper[4962]: I0220 10:13:58.236811 4962 generic.go:334] "Generic (PLEG): container finished" podID="a30d4c4f-6cfe-4f24-9ee1-d2285edfad3e" containerID="aa045e6922dfe4d5b86be77916d3a6f56d92ad5d8849a14be83a3fc1d37883cc" exitCode=0 Feb 20 10:13:58 crc kubenswrapper[4962]: I0220 10:13:58.236916 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-758kd" event={"ID":"a30d4c4f-6cfe-4f24-9ee1-d2285edfad3e","Type":"ContainerDied","Data":"aa045e6922dfe4d5b86be77916d3a6f56d92ad5d8849a14be83a3fc1d37883cc"} Feb 20 10:13:58 crc kubenswrapper[4962]: I0220 10:13:58.236995 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-758kd" event={"ID":"a30d4c4f-6cfe-4f24-9ee1-d2285edfad3e","Type":"ContainerStarted","Data":"aac5b33561325fc4fe98a641b749b757545fa78da23bf1dbffc4adf1c4229064"} Feb 20 10:13:58 crc kubenswrapper[4962]: I0220 10:13:58.660702 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-h5ptn" Feb 20 10:13:58 crc kubenswrapper[4962]: I0220 10:13:58.781397 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7987d\" (UniqueName: \"kubernetes.io/projected/e761565e-55de-43bc-b82d-95b776652b5c-kube-api-access-7987d\") pod \"e761565e-55de-43bc-b82d-95b776652b5c\" (UID: \"e761565e-55de-43bc-b82d-95b776652b5c\") " Feb 20 10:13:58 crc kubenswrapper[4962]: I0220 10:13:58.781824 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e761565e-55de-43bc-b82d-95b776652b5c-operator-scripts\") pod \"e761565e-55de-43bc-b82d-95b776652b5c\" (UID: \"e761565e-55de-43bc-b82d-95b776652b5c\") " Feb 20 10:13:58 crc kubenswrapper[4962]: I0220 10:13:58.782652 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e761565e-55de-43bc-b82d-95b776652b5c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e761565e-55de-43bc-b82d-95b776652b5c" (UID: "e761565e-55de-43bc-b82d-95b776652b5c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:13:58 crc kubenswrapper[4962]: I0220 10:13:58.795953 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e761565e-55de-43bc-b82d-95b776652b5c-kube-api-access-7987d" (OuterVolumeSpecName: "kube-api-access-7987d") pod "e761565e-55de-43bc-b82d-95b776652b5c" (UID: "e761565e-55de-43bc-b82d-95b776652b5c"). InnerVolumeSpecName "kube-api-access-7987d". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:13:58 crc kubenswrapper[4962]: I0220 10:13:58.885246 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7987d\" (UniqueName: \"kubernetes.io/projected/e761565e-55de-43bc-b82d-95b776652b5c-kube-api-access-7987d\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:58 crc kubenswrapper[4962]: I0220 10:13:58.885351 4962 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e761565e-55de-43bc-b82d-95b776652b5c-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:13:59 crc kubenswrapper[4962]: I0220 10:13:59.249499 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-h5ptn" Feb 20 10:13:59 crc kubenswrapper[4962]: I0220 10:13:59.249488 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-h5ptn" event={"ID":"e761565e-55de-43bc-b82d-95b776652b5c","Type":"ContainerDied","Data":"dfa515192511c87c5e462757408b13a73befcd3ac84bc3f4e604a1d6f4fb18ef"} Feb 20 10:13:59 crc kubenswrapper[4962]: I0220 10:13:59.249578 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dfa515192511c87c5e462757408b13a73befcd3ac84bc3f4e604a1d6f4fb18ef" Feb 20 10:14:01 crc kubenswrapper[4962]: I0220 10:14:01.830770 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-69577ff67f-kvhqf" Feb 20 10:14:01 crc kubenswrapper[4962]: I0220 10:14:01.915192 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-689df5d84f-qp96t"] Feb 20 10:14:01 crc kubenswrapper[4962]: I0220 10:14:01.915622 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-689df5d84f-qp96t" podUID="ced7b045-00ec-453d-9a56-b13132991e8c" containerName="dnsmasq-dns" containerID="cri-o://025a5380bda7921dcbb477c483405adc20d83c027f38ca77224085c6cba7f4f3" gracePeriod=10 Feb 20 10:14:02 crc kubenswrapper[4962]: I0220 10:14:02.289749 4962 generic.go:334] "Generic (PLEG): container finished" podID="ced7b045-00ec-453d-9a56-b13132991e8c" containerID="025a5380bda7921dcbb477c483405adc20d83c027f38ca77224085c6cba7f4f3" exitCode=0 Feb 20 10:14:02 crc kubenswrapper[4962]: I0220 10:14:02.289800 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-689df5d84f-qp96t" event={"ID":"ced7b045-00ec-453d-9a56-b13132991e8c","Type":"ContainerDied","Data":"025a5380bda7921dcbb477c483405adc20d83c027f38ca77224085c6cba7f4f3"} Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.184023 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-11b3-account-create-update-x5n92" Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.290976 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nbr45\" (UniqueName: \"kubernetes.io/projected/21296df9-6e67-4427-959d-8d67bfd1393b-kube-api-access-nbr45\") pod \"21296df9-6e67-4427-959d-8d67bfd1393b\" (UID: \"21296df9-6e67-4427-959d-8d67bfd1393b\") " Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.291075 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/21296df9-6e67-4427-959d-8d67bfd1393b-operator-scripts\") pod \"21296df9-6e67-4427-959d-8d67bfd1393b\" (UID: \"21296df9-6e67-4427-959d-8d67bfd1393b\") " Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.292236 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21296df9-6e67-4427-959d-8d67bfd1393b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "21296df9-6e67-4427-959d-8d67bfd1393b" (UID: "21296df9-6e67-4427-959d-8d67bfd1393b"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.296055 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21296df9-6e67-4427-959d-8d67bfd1393b-kube-api-access-nbr45" (OuterVolumeSpecName: "kube-api-access-nbr45") pod "21296df9-6e67-4427-959d-8d67bfd1393b" (UID: "21296df9-6e67-4427-959d-8d67bfd1393b"). InnerVolumeSpecName "kube-api-access-nbr45". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.300869 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-c46d-account-create-update-44g6w" event={"ID":"7e2005e0-31d4-408f-8c66-187a6dd37bcd","Type":"ContainerDied","Data":"d1ace52df18655ab33bfdec0b202a45aaa09716c2023e587cd508d5e0ef9db45"} Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.300915 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d1ace52df18655ab33bfdec0b202a45aaa09716c2023e587cd508d5e0ef9db45" Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.303078 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-11b3-account-create-update-x5n92" Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.303084 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-11b3-account-create-update-x5n92" event={"ID":"21296df9-6e67-4427-959d-8d67bfd1393b","Type":"ContainerDied","Data":"afeffb41d4626b6fc252962c171fdb95b3654b10fe283dd82ef1ad599b6a0dc3"} Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.303140 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="afeffb41d4626b6fc252962c171fdb95b3654b10fe283dd82ef1ad599b6a0dc3" Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.306488 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-4hwp2" event={"ID":"684fc9d7-94f0-418a-b059-e5519e6cd316","Type":"ContainerDied","Data":"f577b4be85d2e6d0328c909c0a4b5923c1ddc7a57c38be3bb1161b8b028e1173"} Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.306532 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f577b4be85d2e6d0328c909c0a4b5923c1ddc7a57c38be3bb1161b8b028e1173" Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.311032 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-695f-account-create-update-22t44" event={"ID":"4feedd65-778f-471c-a2bf-23af2e459685","Type":"ContainerDied","Data":"b746d0ff7cb0abb3cce635c6667ca98effdafd59e40d357fc879d7e372ceb588"} Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.311067 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b746d0ff7cb0abb3cce635c6667ca98effdafd59e40d357fc879d7e372ceb588" Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.313684 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-758kd" event={"ID":"a30d4c4f-6cfe-4f24-9ee1-d2285edfad3e","Type":"ContainerDied","Data":"aac5b33561325fc4fe98a641b749b757545fa78da23bf1dbffc4adf1c4229064"} Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.313738 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aac5b33561325fc4fe98a641b749b757545fa78da23bf1dbffc4adf1c4229064" Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.316476 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-c46d-account-create-update-44g6w" Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.356105 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-758kd" Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.385604 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-695f-account-create-update-22t44" Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.398384 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nbr45\" (UniqueName: \"kubernetes.io/projected/21296df9-6e67-4427-959d-8d67bfd1393b-kube-api-access-nbr45\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.398425 4962 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/21296df9-6e67-4427-959d-8d67bfd1393b-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.406342 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-4hwp2" Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.482096 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-689df5d84f-qp96t" Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.499272 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvkbg\" (UniqueName: \"kubernetes.io/projected/4feedd65-778f-471c-a2bf-23af2e459685-kube-api-access-rvkbg\") pod \"4feedd65-778f-471c-a2bf-23af2e459685\" (UID: \"4feedd65-778f-471c-a2bf-23af2e459685\") " Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.499346 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bp8bw\" (UniqueName: \"kubernetes.io/projected/7e2005e0-31d4-408f-8c66-187a6dd37bcd-kube-api-access-bp8bw\") pod \"7e2005e0-31d4-408f-8c66-187a6dd37bcd\" (UID: \"7e2005e0-31d4-408f-8c66-187a6dd37bcd\") " Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.499404 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hxrtr\" (UniqueName: \"kubernetes.io/projected/a30d4c4f-6cfe-4f24-9ee1-d2285edfad3e-kube-api-access-hxrtr\") pod \"a30d4c4f-6cfe-4f24-9ee1-d2285edfad3e\" (UID: \"a30d4c4f-6cfe-4f24-9ee1-d2285edfad3e\") " Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.499472 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4feedd65-778f-471c-a2bf-23af2e459685-operator-scripts\") pod \"4feedd65-778f-471c-a2bf-23af2e459685\" (UID: \"4feedd65-778f-471c-a2bf-23af2e459685\") " Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.499500 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a30d4c4f-6cfe-4f24-9ee1-d2285edfad3e-operator-scripts\") pod \"a30d4c4f-6cfe-4f24-9ee1-d2285edfad3e\" (UID: \"a30d4c4f-6cfe-4f24-9ee1-d2285edfad3e\") " Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.499585 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e2005e0-31d4-408f-8c66-187a6dd37bcd-operator-scripts\") pod \"7e2005e0-31d4-408f-8c66-187a6dd37bcd\" (UID: 
\"7e2005e0-31d4-408f-8c66-187a6dd37bcd\") " Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.501359 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e2005e0-31d4-408f-8c66-187a6dd37bcd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7e2005e0-31d4-408f-8c66-187a6dd37bcd" (UID: "7e2005e0-31d4-408f-8c66-187a6dd37bcd"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.503734 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4feedd65-778f-471c-a2bf-23af2e459685-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4feedd65-778f-471c-a2bf-23af2e459685" (UID: "4feedd65-778f-471c-a2bf-23af2e459685"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.504061 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a30d4c4f-6cfe-4f24-9ee1-d2285edfad3e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a30d4c4f-6cfe-4f24-9ee1-d2285edfad3e" (UID: "a30d4c4f-6cfe-4f24-9ee1-d2285edfad3e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.509331 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a30d4c4f-6cfe-4f24-9ee1-d2285edfad3e-kube-api-access-hxrtr" (OuterVolumeSpecName: "kube-api-access-hxrtr") pod "a30d4c4f-6cfe-4f24-9ee1-d2285edfad3e" (UID: "a30d4c4f-6cfe-4f24-9ee1-d2285edfad3e"). InnerVolumeSpecName "kube-api-access-hxrtr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.510043 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e2005e0-31d4-408f-8c66-187a6dd37bcd-kube-api-access-bp8bw" (OuterVolumeSpecName: "kube-api-access-bp8bw") pod "7e2005e0-31d4-408f-8c66-187a6dd37bcd" (UID: "7e2005e0-31d4-408f-8c66-187a6dd37bcd"). InnerVolumeSpecName "kube-api-access-bp8bw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.521799 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4feedd65-778f-471c-a2bf-23af2e459685-kube-api-access-rvkbg" (OuterVolumeSpecName: "kube-api-access-rvkbg") pod "4feedd65-778f-471c-a2bf-23af2e459685" (UID: "4feedd65-778f-471c-a2bf-23af2e459685"). InnerVolumeSpecName "kube-api-access-rvkbg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.602433 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/684fc9d7-94f0-418a-b059-e5519e6cd316-operator-scripts\") pod \"684fc9d7-94f0-418a-b059-e5519e6cd316\" (UID: \"684fc9d7-94f0-418a-b059-e5519e6cd316\") " Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.602509 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lmmgk\" (UniqueName: \"kubernetes.io/projected/684fc9d7-94f0-418a-b059-e5519e6cd316-kube-api-access-lmmgk\") pod \"684fc9d7-94f0-418a-b059-e5519e6cd316\" (UID: \"684fc9d7-94f0-418a-b059-e5519e6cd316\") " Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.602543 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ced7b045-00ec-453d-9a56-b13132991e8c-dns-svc\") pod \"ced7b045-00ec-453d-9a56-b13132991e8c\" (UID: \"ced7b045-00ec-453d-9a56-b13132991e8c\") " Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.602627 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ced7b045-00ec-453d-9a56-b13132991e8c-config\") pod \"ced7b045-00ec-453d-9a56-b13132991e8c\" (UID: \"ced7b045-00ec-453d-9a56-b13132991e8c\") " Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.602650 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ced7b045-00ec-453d-9a56-b13132991e8c-ovsdbserver-nb\") pod \"ced7b045-00ec-453d-9a56-b13132991e8c\" (UID: \"ced7b045-00ec-453d-9a56-b13132991e8c\") " Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.602691 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ced7b045-00ec-453d-9a56-b13132991e8c-ovsdbserver-sb\") pod \"ced7b045-00ec-453d-9a56-b13132991e8c\" (UID: \"ced7b045-00ec-453d-9a56-b13132991e8c\") " Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.602867 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p7hmg\" (UniqueName: \"kubernetes.io/projected/ced7b045-00ec-453d-9a56-b13132991e8c-kube-api-access-p7hmg\") pod \"ced7b045-00ec-453d-9a56-b13132991e8c\" (UID: \"ced7b045-00ec-453d-9a56-b13132991e8c\") " Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.603360 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvkbg\" (UniqueName: \"kubernetes.io/projected/4feedd65-778f-471c-a2bf-23af2e459685-kube-api-access-rvkbg\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.603389 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bp8bw\" (UniqueName: \"kubernetes.io/projected/7e2005e0-31d4-408f-8c66-187a6dd37bcd-kube-api-access-bp8bw\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.603405 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hxrtr\" (UniqueName: \"kubernetes.io/projected/a30d4c4f-6cfe-4f24-9ee1-d2285edfad3e-kube-api-access-hxrtr\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.603419 4962 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/a30d4c4f-6cfe-4f24-9ee1-d2285edfad3e-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.603451 4962 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4feedd65-778f-471c-a2bf-23af2e459685-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.603465 4962 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e2005e0-31d4-408f-8c66-187a6dd37bcd-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.604280 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/684fc9d7-94f0-418a-b059-e5519e6cd316-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "684fc9d7-94f0-418a-b059-e5519e6cd316" (UID: "684fc9d7-94f0-418a-b059-e5519e6cd316"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.607805 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/684fc9d7-94f0-418a-b059-e5519e6cd316-kube-api-access-lmmgk" (OuterVolumeSpecName: "kube-api-access-lmmgk") pod "684fc9d7-94f0-418a-b059-e5519e6cd316" (UID: "684fc9d7-94f0-418a-b059-e5519e6cd316"). InnerVolumeSpecName "kube-api-access-lmmgk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.611401 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ced7b045-00ec-453d-9a56-b13132991e8c-kube-api-access-p7hmg" (OuterVolumeSpecName: "kube-api-access-p7hmg") pod "ced7b045-00ec-453d-9a56-b13132991e8c" (UID: "ced7b045-00ec-453d-9a56-b13132991e8c"). InnerVolumeSpecName "kube-api-access-p7hmg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.645345 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ced7b045-00ec-453d-9a56-b13132991e8c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ced7b045-00ec-453d-9a56-b13132991e8c" (UID: "ced7b045-00ec-453d-9a56-b13132991e8c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.649337 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ced7b045-00ec-453d-9a56-b13132991e8c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ced7b045-00ec-453d-9a56-b13132991e8c" (UID: "ced7b045-00ec-453d-9a56-b13132991e8c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.655804 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ced7b045-00ec-453d-9a56-b13132991e8c-config" (OuterVolumeSpecName: "config") pod "ced7b045-00ec-453d-9a56-b13132991e8c" (UID: "ced7b045-00ec-453d-9a56-b13132991e8c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.668840 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ced7b045-00ec-453d-9a56-b13132991e8c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ced7b045-00ec-453d-9a56-b13132991e8c" (UID: "ced7b045-00ec-453d-9a56-b13132991e8c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.705480 4962 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ced7b045-00ec-453d-9a56-b13132991e8c-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.705524 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ced7b045-00ec-453d-9a56-b13132991e8c-config\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.705538 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ced7b045-00ec-453d-9a56-b13132991e8c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.705551 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ced7b045-00ec-453d-9a56-b13132991e8c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.705564 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p7hmg\" (UniqueName: \"kubernetes.io/projected/ced7b045-00ec-453d-9a56-b13132991e8c-kube-api-access-p7hmg\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.705611 4962 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/684fc9d7-94f0-418a-b059-e5519e6cd316-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:03 crc kubenswrapper[4962]: I0220 10:14:03.705624 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lmmgk\" (UniqueName: \"kubernetes.io/projected/684fc9d7-94f0-418a-b059-e5519e6cd316-kube-api-access-lmmgk\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:04 crc kubenswrapper[4962]: I0220 10:14:04.326903 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-m26vd" event={"ID":"2d2e7f05-f1f0-4619-ae07-0a7b93ad6408","Type":"ContainerStarted","Data":"d8f2683b2b57472d95ac6a22ba161803aef705799c500b173956e9aa04929fde"} Feb 20 10:14:04 crc kubenswrapper[4962]: I0220 10:14:04.341308 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-758kd" Feb 20 10:14:04 crc kubenswrapper[4962]: I0220 10:14:04.344874 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-689df5d84f-qp96t" Feb 20 10:14:04 crc kubenswrapper[4962]: I0220 10:14:04.345945 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-4hwp2" Feb 20 10:14:04 crc kubenswrapper[4962]: I0220 10:14:04.346612 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-689df5d84f-qp96t" event={"ID":"ced7b045-00ec-453d-9a56-b13132991e8c","Type":"ContainerDied","Data":"e9818b1a3f9f3197b15f5f2de8df4aeac94c0b0051e4206684bcec3fc52e8885"} Feb 20 10:14:04 crc kubenswrapper[4962]: I0220 10:14:04.346710 4962 scope.go:117] "RemoveContainer" containerID="025a5380bda7921dcbb477c483405adc20d83c027f38ca77224085c6cba7f4f3" Feb 20 10:14:04 crc kubenswrapper[4962]: I0220 10:14:04.346988 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-695f-account-create-update-22t44" Feb 20 10:14:04 crc kubenswrapper[4962]: I0220 10:14:04.347211 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-c46d-account-create-update-44g6w" Feb 20 10:14:04 crc kubenswrapper[4962]: I0220 10:14:04.365788 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-m26vd" podStartSLOduration=3.169373699 podStartE2EDuration="9.365754413s" podCreationTimestamp="2026-02-20 10:13:55 +0000 UTC" firstStartedPulling="2026-02-20 10:13:56.976833127 +0000 UTC m=+1128.559304973" lastFinishedPulling="2026-02-20 10:14:03.173213831 +0000 UTC m=+1134.755685687" observedRunningTime="2026-02-20 10:14:04.359316863 +0000 UTC m=+1135.941788749" watchObservedRunningTime="2026-02-20 10:14:04.365754413 +0000 UTC m=+1135.948226299" Feb 20 10:14:04 crc kubenswrapper[4962]: I0220 10:14:04.409909 4962 scope.go:117] "RemoveContainer" containerID="7ec00e9b2989f478e117f8d08060d562a11edc343ce310e24fa477348d6aca1b" Feb 20 10:14:04 crc kubenswrapper[4962]: I0220 10:14:04.463662 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-689df5d84f-qp96t"] Feb 20 10:14:04 crc kubenswrapper[4962]: I0220 10:14:04.469737 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-689df5d84f-qp96t"] Feb 20 10:14:05 crc kubenswrapper[4962]: I0220 10:14:05.161442 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ced7b045-00ec-453d-9a56-b13132991e8c" path="/var/lib/kubelet/pods/ced7b045-00ec-453d-9a56-b13132991e8c/volumes" Feb 20 10:14:06 crc kubenswrapper[4962]: E0220 10:14:06.746794 4962 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2d2e7f05_f1f0_4619_ae07_0a7b93ad6408.slice/crio-d8f2683b2b57472d95ac6a22ba161803aef705799c500b173956e9aa04929fde.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2d2e7f05_f1f0_4619_ae07_0a7b93ad6408.slice/crio-conmon-d8f2683b2b57472d95ac6a22ba161803aef705799c500b173956e9aa04929fde.scope\": RecentStats: unable to find data in memory cache]" Feb 20 10:14:07 crc kubenswrapper[4962]: I0220 10:14:07.378964 4962 generic.go:334] "Generic (PLEG): container finished" podID="2d2e7f05-f1f0-4619-ae07-0a7b93ad6408" containerID="d8f2683b2b57472d95ac6a22ba161803aef705799c500b173956e9aa04929fde" exitCode=0 Feb 20 10:14:07 crc kubenswrapper[4962]: I0220 10:14:07.379035 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-m26vd" 
event={"ID":"2d2e7f05-f1f0-4619-ae07-0a7b93ad6408","Type":"ContainerDied","Data":"d8f2683b2b57472d95ac6a22ba161803aef705799c500b173956e9aa04929fde"} Feb 20 10:14:08 crc kubenswrapper[4962]: I0220 10:14:08.835030 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-m26vd" Feb 20 10:14:08 crc kubenswrapper[4962]: I0220 10:14:08.916995 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d2e7f05-f1f0-4619-ae07-0a7b93ad6408-combined-ca-bundle\") pod \"2d2e7f05-f1f0-4619-ae07-0a7b93ad6408\" (UID: \"2d2e7f05-f1f0-4619-ae07-0a7b93ad6408\") " Feb 20 10:14:08 crc kubenswrapper[4962]: I0220 10:14:08.917191 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d2e7f05-f1f0-4619-ae07-0a7b93ad6408-config-data\") pod \"2d2e7f05-f1f0-4619-ae07-0a7b93ad6408\" (UID: \"2d2e7f05-f1f0-4619-ae07-0a7b93ad6408\") " Feb 20 10:14:08 crc kubenswrapper[4962]: I0220 10:14:08.917238 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qmgqr\" (UniqueName: \"kubernetes.io/projected/2d2e7f05-f1f0-4619-ae07-0a7b93ad6408-kube-api-access-qmgqr\") pod \"2d2e7f05-f1f0-4619-ae07-0a7b93ad6408\" (UID: \"2d2e7f05-f1f0-4619-ae07-0a7b93ad6408\") " Feb 20 10:14:08 crc kubenswrapper[4962]: I0220 10:14:08.936501 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d2e7f05-f1f0-4619-ae07-0a7b93ad6408-kube-api-access-qmgqr" (OuterVolumeSpecName: "kube-api-access-qmgqr") pod "2d2e7f05-f1f0-4619-ae07-0a7b93ad6408" (UID: "2d2e7f05-f1f0-4619-ae07-0a7b93ad6408"). InnerVolumeSpecName "kube-api-access-qmgqr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:14:08 crc kubenswrapper[4962]: I0220 10:14:08.946707 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d2e7f05-f1f0-4619-ae07-0a7b93ad6408-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2d2e7f05-f1f0-4619-ae07-0a7b93ad6408" (UID: "2d2e7f05-f1f0-4619-ae07-0a7b93ad6408"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:14:08 crc kubenswrapper[4962]: I0220 10:14:08.985428 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d2e7f05-f1f0-4619-ae07-0a7b93ad6408-config-data" (OuterVolumeSpecName: "config-data") pod "2d2e7f05-f1f0-4619-ae07-0a7b93ad6408" (UID: "2d2e7f05-f1f0-4619-ae07-0a7b93ad6408"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.019810 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d2e7f05-f1f0-4619-ae07-0a7b93ad6408-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.019849 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d2e7f05-f1f0-4619-ae07-0a7b93ad6408-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.019861 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qmgqr\" (UniqueName: \"kubernetes.io/projected/2d2e7f05-f1f0-4619-ae07-0a7b93ad6408-kube-api-access-qmgqr\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.418248 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-m26vd" event={"ID":"2d2e7f05-f1f0-4619-ae07-0a7b93ad6408","Type":"ContainerDied","Data":"d9fb2f47a09b42f7a4e90c5cb0cc07c8e1190c6442bbdd0ca0c2a2429a245afe"} Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.418987 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d9fb2f47a09b42f7a4e90c5cb0cc07c8e1190c6442bbdd0ca0c2a2429a245afe" Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.419162 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-m26vd" Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.743568 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-84f6cc7f47-vmlll"] Feb 20 10:14:09 crc kubenswrapper[4962]: E0220 10:14:09.744250 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21296df9-6e67-4427-959d-8d67bfd1393b" containerName="mariadb-account-create-update" Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.744313 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="21296df9-6e67-4427-959d-8d67bfd1393b" containerName="mariadb-account-create-update" Feb 20 10:14:09 crc kubenswrapper[4962]: E0220 10:14:09.744373 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="684fc9d7-94f0-418a-b059-e5519e6cd316" containerName="mariadb-database-create" Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.744443 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="684fc9d7-94f0-418a-b059-e5519e6cd316" containerName="mariadb-database-create" Feb 20 10:14:09 crc kubenswrapper[4962]: E0220 10:14:09.744496 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e761565e-55de-43bc-b82d-95b776652b5c" containerName="mariadb-database-create" Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.744540 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="e761565e-55de-43bc-b82d-95b776652b5c" containerName="mariadb-database-create" Feb 20 10:14:09 crc kubenswrapper[4962]: E0220 10:14:09.744607 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d2e7f05-f1f0-4619-ae07-0a7b93ad6408" containerName="keystone-db-sync" Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.744655 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d2e7f05-f1f0-4619-ae07-0a7b93ad6408" containerName="keystone-db-sync" Feb 20 10:14:09 crc kubenswrapper[4962]: E0220 10:14:09.744708 4962 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="4feedd65-778f-471c-a2bf-23af2e459685" containerName="mariadb-account-create-update" Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.744752 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="4feedd65-778f-471c-a2bf-23af2e459685" containerName="mariadb-account-create-update" Feb 20 10:14:09 crc kubenswrapper[4962]: E0220 10:14:09.744803 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ced7b045-00ec-453d-9a56-b13132991e8c" containerName="init" Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.744848 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="ced7b045-00ec-453d-9a56-b13132991e8c" containerName="init" Feb 20 10:14:09 crc kubenswrapper[4962]: E0220 10:14:09.744895 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ced7b045-00ec-453d-9a56-b13132991e8c" containerName="dnsmasq-dns" Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.744938 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="ced7b045-00ec-453d-9a56-b13132991e8c" containerName="dnsmasq-dns" Feb 20 10:14:09 crc kubenswrapper[4962]: E0220 10:14:09.745007 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a30d4c4f-6cfe-4f24-9ee1-d2285edfad3e" containerName="mariadb-database-create" Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.745055 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="a30d4c4f-6cfe-4f24-9ee1-d2285edfad3e" containerName="mariadb-database-create" Feb 20 10:14:09 crc kubenswrapper[4962]: E0220 10:14:09.745125 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e2005e0-31d4-408f-8c66-187a6dd37bcd" containerName="mariadb-account-create-update" Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.745177 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e2005e0-31d4-408f-8c66-187a6dd37bcd" containerName="mariadb-account-create-update" Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.745393 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="4feedd65-778f-471c-a2bf-23af2e459685" containerName="mariadb-account-create-update" Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.745451 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="ced7b045-00ec-453d-9a56-b13132991e8c" containerName="dnsmasq-dns" Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.745507 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="21296df9-6e67-4427-959d-8d67bfd1393b" containerName="mariadb-account-create-update" Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.745560 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="a30d4c4f-6cfe-4f24-9ee1-d2285edfad3e" containerName="mariadb-database-create" Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.745631 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e2005e0-31d4-408f-8c66-187a6dd37bcd" containerName="mariadb-account-create-update" Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.745688 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="684fc9d7-94f0-418a-b059-e5519e6cd316" containerName="mariadb-database-create" Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.745738 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="e761565e-55de-43bc-b82d-95b776652b5c" containerName="mariadb-database-create" Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.745792 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d2e7f05-f1f0-4619-ae07-0a7b93ad6408" 
containerName="keystone-db-sync" Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.746970 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84f6cc7f47-vmlll" Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.791047 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84f6cc7f47-vmlll"] Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.838159 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-fc9c5"] Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.841560 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-fc9c5" Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.844837 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.846151 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a4660bc9-1b06-4b54-b524-6bfd77e6c1f3-ovsdbserver-sb\") pod \"dnsmasq-dns-84f6cc7f47-vmlll\" (UID: \"a4660bc9-1b06-4b54-b524-6bfd77e6c1f3\") " pod="openstack/dnsmasq-dns-84f6cc7f47-vmlll" Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.846184 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a4660bc9-1b06-4b54-b524-6bfd77e6c1f3-ovsdbserver-nb\") pod \"dnsmasq-dns-84f6cc7f47-vmlll\" (UID: \"a4660bc9-1b06-4b54-b524-6bfd77e6c1f3\") " pod="openstack/dnsmasq-dns-84f6cc7f47-vmlll" Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.846206 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t42qd\" (UniqueName: \"kubernetes.io/projected/a4660bc9-1b06-4b54-b524-6bfd77e6c1f3-kube-api-access-t42qd\") pod \"dnsmasq-dns-84f6cc7f47-vmlll\" (UID: \"a4660bc9-1b06-4b54-b524-6bfd77e6c1f3\") " pod="openstack/dnsmasq-dns-84f6cc7f47-vmlll" Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.846229 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4660bc9-1b06-4b54-b524-6bfd77e6c1f3-config\") pod \"dnsmasq-dns-84f6cc7f47-vmlll\" (UID: \"a4660bc9-1b06-4b54-b524-6bfd77e6c1f3\") " pod="openstack/dnsmasq-dns-84f6cc7f47-vmlll" Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.846326 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a4660bc9-1b06-4b54-b524-6bfd77e6c1f3-dns-swift-storage-0\") pod \"dnsmasq-dns-84f6cc7f47-vmlll\" (UID: \"a4660bc9-1b06-4b54-b524-6bfd77e6c1f3\") " pod="openstack/dnsmasq-dns-84f6cc7f47-vmlll" Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.846394 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a4660bc9-1b06-4b54-b524-6bfd77e6c1f3-dns-svc\") pod \"dnsmasq-dns-84f6cc7f47-vmlll\" (UID: \"a4660bc9-1b06-4b54-b524-6bfd77e6c1f3\") " pod="openstack/dnsmasq-dns-84f6cc7f47-vmlll" Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.847738 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.847784 4962 
reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.847752 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.847971 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-76ldt" Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.882472 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-fc9c5"] Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.949004 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zchrh\" (UniqueName: \"kubernetes.io/projected/8741789b-8f62-4fc9-b811-b48d1f72658b-kube-api-access-zchrh\") pod \"keystone-bootstrap-fc9c5\" (UID: \"8741789b-8f62-4fc9-b811-b48d1f72658b\") " pod="openstack/keystone-bootstrap-fc9c5" Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.949083 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8741789b-8f62-4fc9-b811-b48d1f72658b-combined-ca-bundle\") pod \"keystone-bootstrap-fc9c5\" (UID: \"8741789b-8f62-4fc9-b811-b48d1f72658b\") " pod="openstack/keystone-bootstrap-fc9c5" Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.949142 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a4660bc9-1b06-4b54-b524-6bfd77e6c1f3-dns-svc\") pod \"dnsmasq-dns-84f6cc7f47-vmlll\" (UID: \"a4660bc9-1b06-4b54-b524-6bfd77e6c1f3\") " pod="openstack/dnsmasq-dns-84f6cc7f47-vmlll" Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.949201 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a4660bc9-1b06-4b54-b524-6bfd77e6c1f3-ovsdbserver-sb\") pod \"dnsmasq-dns-84f6cc7f47-vmlll\" (UID: \"a4660bc9-1b06-4b54-b524-6bfd77e6c1f3\") " pod="openstack/dnsmasq-dns-84f6cc7f47-vmlll" Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.949238 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a4660bc9-1b06-4b54-b524-6bfd77e6c1f3-ovsdbserver-nb\") pod \"dnsmasq-dns-84f6cc7f47-vmlll\" (UID: \"a4660bc9-1b06-4b54-b524-6bfd77e6c1f3\") " pod="openstack/dnsmasq-dns-84f6cc7f47-vmlll" Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.949276 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t42qd\" (UniqueName: \"kubernetes.io/projected/a4660bc9-1b06-4b54-b524-6bfd77e6c1f3-kube-api-access-t42qd\") pod \"dnsmasq-dns-84f6cc7f47-vmlll\" (UID: \"a4660bc9-1b06-4b54-b524-6bfd77e6c1f3\") " pod="openstack/dnsmasq-dns-84f6cc7f47-vmlll" Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.949328 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4660bc9-1b06-4b54-b524-6bfd77e6c1f3-config\") pod \"dnsmasq-dns-84f6cc7f47-vmlll\" (UID: \"a4660bc9-1b06-4b54-b524-6bfd77e6c1f3\") " pod="openstack/dnsmasq-dns-84f6cc7f47-vmlll" Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.949394 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/8741789b-8f62-4fc9-b811-b48d1f72658b-fernet-keys\") pod \"keystone-bootstrap-fc9c5\" (UID: \"8741789b-8f62-4fc9-b811-b48d1f72658b\") " pod="openstack/keystone-bootstrap-fc9c5" Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.949474 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8741789b-8f62-4fc9-b811-b48d1f72658b-credential-keys\") pod \"keystone-bootstrap-fc9c5\" (UID: \"8741789b-8f62-4fc9-b811-b48d1f72658b\") " pod="openstack/keystone-bootstrap-fc9c5" Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.949560 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8741789b-8f62-4fc9-b811-b48d1f72658b-scripts\") pod \"keystone-bootstrap-fc9c5\" (UID: \"8741789b-8f62-4fc9-b811-b48d1f72658b\") " pod="openstack/keystone-bootstrap-fc9c5" Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.949806 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8741789b-8f62-4fc9-b811-b48d1f72658b-config-data\") pod \"keystone-bootstrap-fc9c5\" (UID: \"8741789b-8f62-4fc9-b811-b48d1f72658b\") " pod="openstack/keystone-bootstrap-fc9c5" Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.949884 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a4660bc9-1b06-4b54-b524-6bfd77e6c1f3-dns-swift-storage-0\") pod \"dnsmasq-dns-84f6cc7f47-vmlll\" (UID: \"a4660bc9-1b06-4b54-b524-6bfd77e6c1f3\") " pod="openstack/dnsmasq-dns-84f6cc7f47-vmlll" Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.951673 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a4660bc9-1b06-4b54-b524-6bfd77e6c1f3-dns-swift-storage-0\") pod \"dnsmasq-dns-84f6cc7f47-vmlll\" (UID: \"a4660bc9-1b06-4b54-b524-6bfd77e6c1f3\") " pod="openstack/dnsmasq-dns-84f6cc7f47-vmlll" Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.954478 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4660bc9-1b06-4b54-b524-6bfd77e6c1f3-config\") pod \"dnsmasq-dns-84f6cc7f47-vmlll\" (UID: \"a4660bc9-1b06-4b54-b524-6bfd77e6c1f3\") " pod="openstack/dnsmasq-dns-84f6cc7f47-vmlll" Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.958219 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a4660bc9-1b06-4b54-b524-6bfd77e6c1f3-ovsdbserver-sb\") pod \"dnsmasq-dns-84f6cc7f47-vmlll\" (UID: \"a4660bc9-1b06-4b54-b524-6bfd77e6c1f3\") " pod="openstack/dnsmasq-dns-84f6cc7f47-vmlll" Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.958622 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a4660bc9-1b06-4b54-b524-6bfd77e6c1f3-ovsdbserver-nb\") pod \"dnsmasq-dns-84f6cc7f47-vmlll\" (UID: \"a4660bc9-1b06-4b54-b524-6bfd77e6c1f3\") " pod="openstack/dnsmasq-dns-84f6cc7f47-vmlll" Feb 20 10:14:09 crc kubenswrapper[4962]: I0220 10:14:09.958760 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a4660bc9-1b06-4b54-b524-6bfd77e6c1f3-dns-svc\") pod 
\"dnsmasq-dns-84f6cc7f47-vmlll\" (UID: \"a4660bc9-1b06-4b54-b524-6bfd77e6c1f3\") " pod="openstack/dnsmasq-dns-84f6cc7f47-vmlll" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.032583 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t42qd\" (UniqueName: \"kubernetes.io/projected/a4660bc9-1b06-4b54-b524-6bfd77e6c1f3-kube-api-access-t42qd\") pod \"dnsmasq-dns-84f6cc7f47-vmlll\" (UID: \"a4660bc9-1b06-4b54-b524-6bfd77e6c1f3\") " pod="openstack/dnsmasq-dns-84f6cc7f47-vmlll" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.060256 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zchrh\" (UniqueName: \"kubernetes.io/projected/8741789b-8f62-4fc9-b811-b48d1f72658b-kube-api-access-zchrh\") pod \"keystone-bootstrap-fc9c5\" (UID: \"8741789b-8f62-4fc9-b811-b48d1f72658b\") " pod="openstack/keystone-bootstrap-fc9c5" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.060522 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8741789b-8f62-4fc9-b811-b48d1f72658b-combined-ca-bundle\") pod \"keystone-bootstrap-fc9c5\" (UID: \"8741789b-8f62-4fc9-b811-b48d1f72658b\") " pod="openstack/keystone-bootstrap-fc9c5" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.060653 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8741789b-8f62-4fc9-b811-b48d1f72658b-fernet-keys\") pod \"keystone-bootstrap-fc9c5\" (UID: \"8741789b-8f62-4fc9-b811-b48d1f72658b\") " pod="openstack/keystone-bootstrap-fc9c5" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.060760 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8741789b-8f62-4fc9-b811-b48d1f72658b-credential-keys\") pod \"keystone-bootstrap-fc9c5\" (UID: \"8741789b-8f62-4fc9-b811-b48d1f72658b\") " pod="openstack/keystone-bootstrap-fc9c5" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.061050 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8741789b-8f62-4fc9-b811-b48d1f72658b-scripts\") pod \"keystone-bootstrap-fc9c5\" (UID: \"8741789b-8f62-4fc9-b811-b48d1f72658b\") " pod="openstack/keystone-bootstrap-fc9c5" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.061126 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8741789b-8f62-4fc9-b811-b48d1f72658b-config-data\") pod \"keystone-bootstrap-fc9c5\" (UID: \"8741789b-8f62-4fc9-b811-b48d1f72658b\") " pod="openstack/keystone-bootstrap-fc9c5" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.085768 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8741789b-8f62-4fc9-b811-b48d1f72658b-credential-keys\") pod \"keystone-bootstrap-fc9c5\" (UID: \"8741789b-8f62-4fc9-b811-b48d1f72658b\") " pod="openstack/keystone-bootstrap-fc9c5" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.086512 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8741789b-8f62-4fc9-b811-b48d1f72658b-config-data\") pod \"keystone-bootstrap-fc9c5\" (UID: \"8741789b-8f62-4fc9-b811-b48d1f72658b\") " pod="openstack/keystone-bootstrap-fc9c5" Feb 20 10:14:10 crc 
kubenswrapper[4962]: I0220 10:14:10.089102 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8741789b-8f62-4fc9-b811-b48d1f72658b-combined-ca-bundle\") pod \"keystone-bootstrap-fc9c5\" (UID: \"8741789b-8f62-4fc9-b811-b48d1f72658b\") " pod="openstack/keystone-bootstrap-fc9c5" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.094861 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8741789b-8f62-4fc9-b811-b48d1f72658b-scripts\") pod \"keystone-bootstrap-fc9c5\" (UID: \"8741789b-8f62-4fc9-b811-b48d1f72658b\") " pod="openstack/keystone-bootstrap-fc9c5" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.106450 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8741789b-8f62-4fc9-b811-b48d1f72658b-fernet-keys\") pod \"keystone-bootstrap-fc9c5\" (UID: \"8741789b-8f62-4fc9-b811-b48d1f72658b\") " pod="openstack/keystone-bootstrap-fc9c5" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.121996 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zchrh\" (UniqueName: \"kubernetes.io/projected/8741789b-8f62-4fc9-b811-b48d1f72658b-kube-api-access-zchrh\") pod \"keystone-bootstrap-fc9c5\" (UID: \"8741789b-8f62-4fc9-b811-b48d1f72658b\") " pod="openstack/keystone-bootstrap-fc9c5" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.144022 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84f6cc7f47-vmlll" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.214954 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-s4qgr"] Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.216078 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-fc9c5" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.216348 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-s4qgr" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.226640 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-87fmw" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.226955 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.227162 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.241656 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-v7sjh"] Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.243163 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-v7sjh" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.253014 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.253447 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-bnxb6" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.260099 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.266692 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-s4qgr"] Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.302473 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-v7sjh"] Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.370161 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14c237ea-eb42-49d4-90db-ee57e3b560e3-config-data\") pod \"cinder-db-sync-s4qgr\" (UID: \"14c237ea-eb42-49d4-90db-ee57e3b560e3\") " pod="openstack/cinder-db-sync-s4qgr" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.370321 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14c237ea-eb42-49d4-90db-ee57e3b560e3-combined-ca-bundle\") pod \"cinder-db-sync-s4qgr\" (UID: \"14c237ea-eb42-49d4-90db-ee57e3b560e3\") " pod="openstack/cinder-db-sync-s4qgr" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.370396 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6b114dbd-1f72-42c9-97c1-43795d1cf1ea-config\") pod \"neutron-db-sync-v7sjh\" (UID: \"6b114dbd-1f72-42c9-97c1-43795d1cf1ea\") " pod="openstack/neutron-db-sync-v7sjh" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.370485 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jjsq\" (UniqueName: \"kubernetes.io/projected/6b114dbd-1f72-42c9-97c1-43795d1cf1ea-kube-api-access-9jjsq\") pod \"neutron-db-sync-v7sjh\" (UID: \"6b114dbd-1f72-42c9-97c1-43795d1cf1ea\") " pod="openstack/neutron-db-sync-v7sjh" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.370557 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14c237ea-eb42-49d4-90db-ee57e3b560e3-scripts\") pod \"cinder-db-sync-s4qgr\" (UID: \"14c237ea-eb42-49d4-90db-ee57e3b560e3\") " pod="openstack/cinder-db-sync-s4qgr" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.370668 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/14c237ea-eb42-49d4-90db-ee57e3b560e3-db-sync-config-data\") pod \"cinder-db-sync-s4qgr\" (UID: \"14c237ea-eb42-49d4-90db-ee57e3b560e3\") " pod="openstack/cinder-db-sync-s4qgr" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.370761 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frwwh\" (UniqueName: \"kubernetes.io/projected/14c237ea-eb42-49d4-90db-ee57e3b560e3-kube-api-access-frwwh\") pod \"cinder-db-sync-s4qgr\" (UID: 
\"14c237ea-eb42-49d4-90db-ee57e3b560e3\") " pod="openstack/cinder-db-sync-s4qgr" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.370832 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b114dbd-1f72-42c9-97c1-43795d1cf1ea-combined-ca-bundle\") pod \"neutron-db-sync-v7sjh\" (UID: \"6b114dbd-1f72-42c9-97c1-43795d1cf1ea\") " pod="openstack/neutron-db-sync-v7sjh" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.378737 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/14c237ea-eb42-49d4-90db-ee57e3b560e3-etc-machine-id\") pod \"cinder-db-sync-s4qgr\" (UID: \"14c237ea-eb42-49d4-90db-ee57e3b560e3\") " pod="openstack/cinder-db-sync-s4qgr" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.442069 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-mk67n"] Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.443455 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-mk67n" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.455314 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-mk67n"] Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.462950 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-gtm5t" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.463095 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-smcqr"] Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.464422 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-smcqr" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.470511 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84f6cc7f47-vmlll"] Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.485906 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.486119 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.486390 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-bt79l" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.486578 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.487729 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b114dbd-1f72-42c9-97c1-43795d1cf1ea-combined-ca-bundle\") pod \"neutron-db-sync-v7sjh\" (UID: \"6b114dbd-1f72-42c9-97c1-43795d1cf1ea\") " pod="openstack/neutron-db-sync-v7sjh" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.488552 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/14c237ea-eb42-49d4-90db-ee57e3b560e3-etc-machine-id\") pod \"cinder-db-sync-s4qgr\" (UID: \"14c237ea-eb42-49d4-90db-ee57e3b560e3\") " pod="openstack/cinder-db-sync-s4qgr" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.488668 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14c237ea-eb42-49d4-90db-ee57e3b560e3-config-data\") pod \"cinder-db-sync-s4qgr\" (UID: \"14c237ea-eb42-49d4-90db-ee57e3b560e3\") " pod="openstack/cinder-db-sync-s4qgr" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.488763 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14c237ea-eb42-49d4-90db-ee57e3b560e3-combined-ca-bundle\") pod \"cinder-db-sync-s4qgr\" (UID: \"14c237ea-eb42-49d4-90db-ee57e3b560e3\") " pod="openstack/cinder-db-sync-s4qgr" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.488835 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6b114dbd-1f72-42c9-97c1-43795d1cf1ea-config\") pod \"neutron-db-sync-v7sjh\" (UID: \"6b114dbd-1f72-42c9-97c1-43795d1cf1ea\") " pod="openstack/neutron-db-sync-v7sjh" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.488918 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jjsq\" (UniqueName: \"kubernetes.io/projected/6b114dbd-1f72-42c9-97c1-43795d1cf1ea-kube-api-access-9jjsq\") pod \"neutron-db-sync-v7sjh\" (UID: \"6b114dbd-1f72-42c9-97c1-43795d1cf1ea\") " pod="openstack/neutron-db-sync-v7sjh" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.488989 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14c237ea-eb42-49d4-90db-ee57e3b560e3-scripts\") pod \"cinder-db-sync-s4qgr\" (UID: \"14c237ea-eb42-49d4-90db-ee57e3b560e3\") " pod="openstack/cinder-db-sync-s4qgr" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 
10:14:10.496732 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/14c237ea-eb42-49d4-90db-ee57e3b560e3-db-sync-config-data\") pod \"cinder-db-sync-s4qgr\" (UID: \"14c237ea-eb42-49d4-90db-ee57e3b560e3\") " pod="openstack/cinder-db-sync-s4qgr" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.499169 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frwwh\" (UniqueName: \"kubernetes.io/projected/14c237ea-eb42-49d4-90db-ee57e3b560e3-kube-api-access-frwwh\") pod \"cinder-db-sync-s4qgr\" (UID: \"14c237ea-eb42-49d4-90db-ee57e3b560e3\") " pod="openstack/cinder-db-sync-s4qgr" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.498180 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/14c237ea-eb42-49d4-90db-ee57e3b560e3-etc-machine-id\") pod \"cinder-db-sync-s4qgr\" (UID: \"14c237ea-eb42-49d4-90db-ee57e3b560e3\") " pod="openstack/cinder-db-sync-s4qgr" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.498224 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-smcqr"] Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.495563 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b114dbd-1f72-42c9-97c1-43795d1cf1ea-combined-ca-bundle\") pod \"neutron-db-sync-v7sjh\" (UID: \"6b114dbd-1f72-42c9-97c1-43795d1cf1ea\") " pod="openstack/neutron-db-sync-v7sjh" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.514882 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/6b114dbd-1f72-42c9-97c1-43795d1cf1ea-config\") pod \"neutron-db-sync-v7sjh\" (UID: \"6b114dbd-1f72-42c9-97c1-43795d1cf1ea\") " pod="openstack/neutron-db-sync-v7sjh" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.517500 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14c237ea-eb42-49d4-90db-ee57e3b560e3-config-data\") pod \"cinder-db-sync-s4qgr\" (UID: \"14c237ea-eb42-49d4-90db-ee57e3b560e3\") " pod="openstack/cinder-db-sync-s4qgr" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.524974 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/14c237ea-eb42-49d4-90db-ee57e3b560e3-db-sync-config-data\") pod \"cinder-db-sync-s4qgr\" (UID: \"14c237ea-eb42-49d4-90db-ee57e3b560e3\") " pod="openstack/cinder-db-sync-s4qgr" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.525423 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14c237ea-eb42-49d4-90db-ee57e3b560e3-scripts\") pod \"cinder-db-sync-s4qgr\" (UID: \"14c237ea-eb42-49d4-90db-ee57e3b560e3\") " pod="openstack/cinder-db-sync-s4qgr" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.537256 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frwwh\" (UniqueName: \"kubernetes.io/projected/14c237ea-eb42-49d4-90db-ee57e3b560e3-kube-api-access-frwwh\") pod \"cinder-db-sync-s4qgr\" (UID: \"14c237ea-eb42-49d4-90db-ee57e3b560e3\") " pod="openstack/cinder-db-sync-s4qgr" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.538359 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14c237ea-eb42-49d4-90db-ee57e3b560e3-combined-ca-bundle\") pod \"cinder-db-sync-s4qgr\" (UID: \"14c237ea-eb42-49d4-90db-ee57e3b560e3\") " pod="openstack/cinder-db-sync-s4qgr" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.545736 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.546493 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jjsq\" (UniqueName: \"kubernetes.io/projected/6b114dbd-1f72-42c9-97c1-43795d1cf1ea-kube-api-access-9jjsq\") pod \"neutron-db-sync-v7sjh\" (UID: \"6b114dbd-1f72-42c9-97c1-43795d1cf1ea\") " pod="openstack/neutron-db-sync-v7sjh" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.570043 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.575271 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.575557 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.609753 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpbgw\" (UniqueName: \"kubernetes.io/projected/97e25820-62eb-4ad9-92ad-471c2f0f7ed4-kube-api-access-mpbgw\") pod \"placement-db-sync-mk67n\" (UID: \"97e25820-62eb-4ad9-92ad-471c2f0f7ed4\") " pod="openstack/placement-db-sync-mk67n" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.609887 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97e25820-62eb-4ad9-92ad-471c2f0f7ed4-combined-ca-bundle\") pod \"placement-db-sync-mk67n\" (UID: \"97e25820-62eb-4ad9-92ad-471c2f0f7ed4\") " pod="openstack/placement-db-sync-mk67n" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.609946 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97e25820-62eb-4ad9-92ad-471c2f0f7ed4-config-data\") pod \"placement-db-sync-mk67n\" (UID: \"97e25820-62eb-4ad9-92ad-471c2f0f7ed4\") " pod="openstack/placement-db-sync-mk67n" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.610025 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97e25820-62eb-4ad9-92ad-471c2f0f7ed4-logs\") pod \"placement-db-sync-mk67n\" (UID: \"97e25820-62eb-4ad9-92ad-471c2f0f7ed4\") " pod="openstack/placement-db-sync-mk67n" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.610141 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d970dac6-1948-42dd-b5d9-c5df1b04e30d-db-sync-config-data\") pod \"barbican-db-sync-smcqr\" (UID: \"d970dac6-1948-42dd-b5d9-c5df1b04e30d\") " pod="openstack/barbican-db-sync-smcqr" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.610242 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d970dac6-1948-42dd-b5d9-c5df1b04e30d-combined-ca-bundle\") pod \"barbican-db-sync-smcqr\" 
(UID: \"d970dac6-1948-42dd-b5d9-c5df1b04e30d\") " pod="openstack/barbican-db-sync-smcqr" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.610309 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97e25820-62eb-4ad9-92ad-471c2f0f7ed4-scripts\") pod \"placement-db-sync-mk67n\" (UID: \"97e25820-62eb-4ad9-92ad-471c2f0f7ed4\") " pod="openstack/placement-db-sync-mk67n" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.610422 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpd4w\" (UniqueName: \"kubernetes.io/projected/d970dac6-1948-42dd-b5d9-c5df1b04e30d-kube-api-access-wpd4w\") pod \"barbican-db-sync-smcqr\" (UID: \"d970dac6-1948-42dd-b5d9-c5df1b04e30d\") " pod="openstack/barbican-db-sync-smcqr" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.613390 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-68bc8f6695-d6bm6"] Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.635668 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68bc8f6695-d6bm6" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.646632 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-s4qgr" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.646955 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.672059 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-68bc8f6695-d6bm6"] Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.718724 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f95ae2eb-8d20-4549-896d-e6991bfd1e06-config-data\") pod \"ceilometer-0\" (UID: \"f95ae2eb-8d20-4549-896d-e6991bfd1e06\") " pod="openstack/ceilometer-0" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.718798 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f95ae2eb-8d20-4549-896d-e6991bfd1e06-scripts\") pod \"ceilometer-0\" (UID: \"f95ae2eb-8d20-4549-896d-e6991bfd1e06\") " pod="openstack/ceilometer-0" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.718859 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpd4w\" (UniqueName: \"kubernetes.io/projected/d970dac6-1948-42dd-b5d9-c5df1b04e30d-kube-api-access-wpd4w\") pod \"barbican-db-sync-smcqr\" (UID: \"d970dac6-1948-42dd-b5d9-c5df1b04e30d\") " pod="openstack/barbican-db-sync-smcqr" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.718885 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdggc\" (UniqueName: \"kubernetes.io/projected/f95ae2eb-8d20-4549-896d-e6991bfd1e06-kube-api-access-bdggc\") pod \"ceilometer-0\" (UID: \"f95ae2eb-8d20-4549-896d-e6991bfd1e06\") " pod="openstack/ceilometer-0" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.719026 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpbgw\" (UniqueName: \"kubernetes.io/projected/97e25820-62eb-4ad9-92ad-471c2f0f7ed4-kube-api-access-mpbgw\") pod \"placement-db-sync-mk67n\" (UID: 
\"97e25820-62eb-4ad9-92ad-471c2f0f7ed4\") " pod="openstack/placement-db-sync-mk67n" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.719728 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-v7sjh" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.720730 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f95ae2eb-8d20-4549-896d-e6991bfd1e06-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f95ae2eb-8d20-4549-896d-e6991bfd1e06\") " pod="openstack/ceilometer-0" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.720782 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f95ae2eb-8d20-4549-896d-e6991bfd1e06-run-httpd\") pod \"ceilometer-0\" (UID: \"f95ae2eb-8d20-4549-896d-e6991bfd1e06\") " pod="openstack/ceilometer-0" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.720825 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97e25820-62eb-4ad9-92ad-471c2f0f7ed4-combined-ca-bundle\") pod \"placement-db-sync-mk67n\" (UID: \"97e25820-62eb-4ad9-92ad-471c2f0f7ed4\") " pod="openstack/placement-db-sync-mk67n" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.720857 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97e25820-62eb-4ad9-92ad-471c2f0f7ed4-config-data\") pod \"placement-db-sync-mk67n\" (UID: \"97e25820-62eb-4ad9-92ad-471c2f0f7ed4\") " pod="openstack/placement-db-sync-mk67n" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.720895 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97e25820-62eb-4ad9-92ad-471c2f0f7ed4-logs\") pod \"placement-db-sync-mk67n\" (UID: \"97e25820-62eb-4ad9-92ad-471c2f0f7ed4\") " pod="openstack/placement-db-sync-mk67n" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.720986 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f95ae2eb-8d20-4549-896d-e6991bfd1e06-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f95ae2eb-8d20-4549-896d-e6991bfd1e06\") " pod="openstack/ceilometer-0" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.721010 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d970dac6-1948-42dd-b5d9-c5df1b04e30d-db-sync-config-data\") pod \"barbican-db-sync-smcqr\" (UID: \"d970dac6-1948-42dd-b5d9-c5df1b04e30d\") " pod="openstack/barbican-db-sync-smcqr" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.721034 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f95ae2eb-8d20-4549-896d-e6991bfd1e06-log-httpd\") pod \"ceilometer-0\" (UID: \"f95ae2eb-8d20-4549-896d-e6991bfd1e06\") " pod="openstack/ceilometer-0" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.721063 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d970dac6-1948-42dd-b5d9-c5df1b04e30d-combined-ca-bundle\") pod \"barbican-db-sync-smcqr\" (UID: 
\"d970dac6-1948-42dd-b5d9-c5df1b04e30d\") " pod="openstack/barbican-db-sync-smcqr" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.721111 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97e25820-62eb-4ad9-92ad-471c2f0f7ed4-scripts\") pod \"placement-db-sync-mk67n\" (UID: \"97e25820-62eb-4ad9-92ad-471c2f0f7ed4\") " pod="openstack/placement-db-sync-mk67n" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.721887 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97e25820-62eb-4ad9-92ad-471c2f0f7ed4-logs\") pod \"placement-db-sync-mk67n\" (UID: \"97e25820-62eb-4ad9-92ad-471c2f0f7ed4\") " pod="openstack/placement-db-sync-mk67n" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.732893 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d970dac6-1948-42dd-b5d9-c5df1b04e30d-combined-ca-bundle\") pod \"barbican-db-sync-smcqr\" (UID: \"d970dac6-1948-42dd-b5d9-c5df1b04e30d\") " pod="openstack/barbican-db-sync-smcqr" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.733936 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d970dac6-1948-42dd-b5d9-c5df1b04e30d-db-sync-config-data\") pod \"barbican-db-sync-smcqr\" (UID: \"d970dac6-1948-42dd-b5d9-c5df1b04e30d\") " pod="openstack/barbican-db-sync-smcqr" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.734115 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97e25820-62eb-4ad9-92ad-471c2f0f7ed4-config-data\") pod \"placement-db-sync-mk67n\" (UID: \"97e25820-62eb-4ad9-92ad-471c2f0f7ed4\") " pod="openstack/placement-db-sync-mk67n" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.740749 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97e25820-62eb-4ad9-92ad-471c2f0f7ed4-scripts\") pod \"placement-db-sync-mk67n\" (UID: \"97e25820-62eb-4ad9-92ad-471c2f0f7ed4\") " pod="openstack/placement-db-sync-mk67n" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.743484 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpd4w\" (UniqueName: \"kubernetes.io/projected/d970dac6-1948-42dd-b5d9-c5df1b04e30d-kube-api-access-wpd4w\") pod \"barbican-db-sync-smcqr\" (UID: \"d970dac6-1948-42dd-b5d9-c5df1b04e30d\") " pod="openstack/barbican-db-sync-smcqr" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.743750 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpbgw\" (UniqueName: \"kubernetes.io/projected/97e25820-62eb-4ad9-92ad-471c2f0f7ed4-kube-api-access-mpbgw\") pod \"placement-db-sync-mk67n\" (UID: \"97e25820-62eb-4ad9-92ad-471c2f0f7ed4\") " pod="openstack/placement-db-sync-mk67n" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.752708 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97e25820-62eb-4ad9-92ad-471c2f0f7ed4-combined-ca-bundle\") pod \"placement-db-sync-mk67n\" (UID: \"97e25820-62eb-4ad9-92ad-471c2f0f7ed4\") " pod="openstack/placement-db-sync-mk67n" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.824140 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/857e020f-54d5-4980-90a2-f19d6f8b5008-dns-swift-storage-0\") pod \"dnsmasq-dns-68bc8f6695-d6bm6\" (UID: \"857e020f-54d5-4980-90a2-f19d6f8b5008\") " pod="openstack/dnsmasq-dns-68bc8f6695-d6bm6" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.824278 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f95ae2eb-8d20-4549-896d-e6991bfd1e06-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f95ae2eb-8d20-4549-896d-e6991bfd1e06\") " pod="openstack/ceilometer-0" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.824312 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f95ae2eb-8d20-4549-896d-e6991bfd1e06-log-httpd\") pod \"ceilometer-0\" (UID: \"f95ae2eb-8d20-4549-896d-e6991bfd1e06\") " pod="openstack/ceilometer-0" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.824348 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/857e020f-54d5-4980-90a2-f19d6f8b5008-dns-svc\") pod \"dnsmasq-dns-68bc8f6695-d6bm6\" (UID: \"857e020f-54d5-4980-90a2-f19d6f8b5008\") " pod="openstack/dnsmasq-dns-68bc8f6695-d6bm6" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.824383 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/857e020f-54d5-4980-90a2-f19d6f8b5008-ovsdbserver-sb\") pod \"dnsmasq-dns-68bc8f6695-d6bm6\" (UID: \"857e020f-54d5-4980-90a2-f19d6f8b5008\") " pod="openstack/dnsmasq-dns-68bc8f6695-d6bm6" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.824402 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f95ae2eb-8d20-4549-896d-e6991bfd1e06-config-data\") pod \"ceilometer-0\" (UID: \"f95ae2eb-8d20-4549-896d-e6991bfd1e06\") " pod="openstack/ceilometer-0" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.824425 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f95ae2eb-8d20-4549-896d-e6991bfd1e06-scripts\") pod \"ceilometer-0\" (UID: \"f95ae2eb-8d20-4549-896d-e6991bfd1e06\") " pod="openstack/ceilometer-0" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.824450 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdggc\" (UniqueName: \"kubernetes.io/projected/f95ae2eb-8d20-4549-896d-e6991bfd1e06-kube-api-access-bdggc\") pod \"ceilometer-0\" (UID: \"f95ae2eb-8d20-4549-896d-e6991bfd1e06\") " pod="openstack/ceilometer-0" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.824483 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5knl\" (UniqueName: \"kubernetes.io/projected/857e020f-54d5-4980-90a2-f19d6f8b5008-kube-api-access-f5knl\") pod \"dnsmasq-dns-68bc8f6695-d6bm6\" (UID: \"857e020f-54d5-4980-90a2-f19d6f8b5008\") " pod="openstack/dnsmasq-dns-68bc8f6695-d6bm6" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.824521 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f95ae2eb-8d20-4549-896d-e6991bfd1e06-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"f95ae2eb-8d20-4549-896d-e6991bfd1e06\") " pod="openstack/ceilometer-0" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.824537 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/857e020f-54d5-4980-90a2-f19d6f8b5008-config\") pod \"dnsmasq-dns-68bc8f6695-d6bm6\" (UID: \"857e020f-54d5-4980-90a2-f19d6f8b5008\") " pod="openstack/dnsmasq-dns-68bc8f6695-d6bm6" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.824554 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f95ae2eb-8d20-4549-896d-e6991bfd1e06-run-httpd\") pod \"ceilometer-0\" (UID: \"f95ae2eb-8d20-4549-896d-e6991bfd1e06\") " pod="openstack/ceilometer-0" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.824576 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/857e020f-54d5-4980-90a2-f19d6f8b5008-ovsdbserver-nb\") pod \"dnsmasq-dns-68bc8f6695-d6bm6\" (UID: \"857e020f-54d5-4980-90a2-f19d6f8b5008\") " pod="openstack/dnsmasq-dns-68bc8f6695-d6bm6" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.830843 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f95ae2eb-8d20-4549-896d-e6991bfd1e06-log-httpd\") pod \"ceilometer-0\" (UID: \"f95ae2eb-8d20-4549-896d-e6991bfd1e06\") " pod="openstack/ceilometer-0" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.831015 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f95ae2eb-8d20-4549-896d-e6991bfd1e06-scripts\") pod \"ceilometer-0\" (UID: \"f95ae2eb-8d20-4549-896d-e6991bfd1e06\") " pod="openstack/ceilometer-0" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.831249 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-mk67n" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.831481 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f95ae2eb-8d20-4549-896d-e6991bfd1e06-run-httpd\") pod \"ceilometer-0\" (UID: \"f95ae2eb-8d20-4549-896d-e6991bfd1e06\") " pod="openstack/ceilometer-0" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.838608 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f95ae2eb-8d20-4549-896d-e6991bfd1e06-config-data\") pod \"ceilometer-0\" (UID: \"f95ae2eb-8d20-4549-896d-e6991bfd1e06\") " pod="openstack/ceilometer-0" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.839408 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f95ae2eb-8d20-4549-896d-e6991bfd1e06-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f95ae2eb-8d20-4549-896d-e6991bfd1e06\") " pod="openstack/ceilometer-0" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.839696 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f95ae2eb-8d20-4549-896d-e6991bfd1e06-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f95ae2eb-8d20-4549-896d-e6991bfd1e06\") " pod="openstack/ceilometer-0" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.860416 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdggc\" (UniqueName: \"kubernetes.io/projected/f95ae2eb-8d20-4549-896d-e6991bfd1e06-kube-api-access-bdggc\") pod \"ceilometer-0\" (UID: \"f95ae2eb-8d20-4549-896d-e6991bfd1e06\") " pod="openstack/ceilometer-0" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.860659 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-smcqr" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.917338 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84f6cc7f47-vmlll"] Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.928684 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/857e020f-54d5-4980-90a2-f19d6f8b5008-dns-svc\") pod \"dnsmasq-dns-68bc8f6695-d6bm6\" (UID: \"857e020f-54d5-4980-90a2-f19d6f8b5008\") " pod="openstack/dnsmasq-dns-68bc8f6695-d6bm6" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.928749 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/857e020f-54d5-4980-90a2-f19d6f8b5008-ovsdbserver-sb\") pod \"dnsmasq-dns-68bc8f6695-d6bm6\" (UID: \"857e020f-54d5-4980-90a2-f19d6f8b5008\") " pod="openstack/dnsmasq-dns-68bc8f6695-d6bm6" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.928800 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5knl\" (UniqueName: \"kubernetes.io/projected/857e020f-54d5-4980-90a2-f19d6f8b5008-kube-api-access-f5knl\") pod \"dnsmasq-dns-68bc8f6695-d6bm6\" (UID: \"857e020f-54d5-4980-90a2-f19d6f8b5008\") " pod="openstack/dnsmasq-dns-68bc8f6695-d6bm6" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.928838 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/857e020f-54d5-4980-90a2-f19d6f8b5008-config\") pod \"dnsmasq-dns-68bc8f6695-d6bm6\" (UID: \"857e020f-54d5-4980-90a2-f19d6f8b5008\") " pod="openstack/dnsmasq-dns-68bc8f6695-d6bm6" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.928865 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/857e020f-54d5-4980-90a2-f19d6f8b5008-ovsdbserver-nb\") pod \"dnsmasq-dns-68bc8f6695-d6bm6\" (UID: \"857e020f-54d5-4980-90a2-f19d6f8b5008\") " pod="openstack/dnsmasq-dns-68bc8f6695-d6bm6" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.928889 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/857e020f-54d5-4980-90a2-f19d6f8b5008-dns-swift-storage-0\") pod \"dnsmasq-dns-68bc8f6695-d6bm6\" (UID: \"857e020f-54d5-4980-90a2-f19d6f8b5008\") " pod="openstack/dnsmasq-dns-68bc8f6695-d6bm6" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.929077 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.929727 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/857e020f-54d5-4980-90a2-f19d6f8b5008-dns-swift-storage-0\") pod \"dnsmasq-dns-68bc8f6695-d6bm6\" (UID: \"857e020f-54d5-4980-90a2-f19d6f8b5008\") " pod="openstack/dnsmasq-dns-68bc8f6695-d6bm6" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.930259 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/857e020f-54d5-4980-90a2-f19d6f8b5008-dns-svc\") pod \"dnsmasq-dns-68bc8f6695-d6bm6\" (UID: \"857e020f-54d5-4980-90a2-f19d6f8b5008\") " pod="openstack/dnsmasq-dns-68bc8f6695-d6bm6" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.930832 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/857e020f-54d5-4980-90a2-f19d6f8b5008-config\") pod \"dnsmasq-dns-68bc8f6695-d6bm6\" (UID: \"857e020f-54d5-4980-90a2-f19d6f8b5008\") " pod="openstack/dnsmasq-dns-68bc8f6695-d6bm6" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.931479 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/857e020f-54d5-4980-90a2-f19d6f8b5008-ovsdbserver-nb\") pod \"dnsmasq-dns-68bc8f6695-d6bm6\" (UID: \"857e020f-54d5-4980-90a2-f19d6f8b5008\") " pod="openstack/dnsmasq-dns-68bc8f6695-d6bm6" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.933719 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/857e020f-54d5-4980-90a2-f19d6f8b5008-ovsdbserver-sb\") pod \"dnsmasq-dns-68bc8f6695-d6bm6\" (UID: \"857e020f-54d5-4980-90a2-f19d6f8b5008\") " pod="openstack/dnsmasq-dns-68bc8f6695-d6bm6" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.954063 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5knl\" (UniqueName: \"kubernetes.io/projected/857e020f-54d5-4980-90a2-f19d6f8b5008-kube-api-access-f5knl\") pod \"dnsmasq-dns-68bc8f6695-d6bm6\" (UID: \"857e020f-54d5-4980-90a2-f19d6f8b5008\") " pod="openstack/dnsmasq-dns-68bc8f6695-d6bm6" Feb 20 10:14:10 crc kubenswrapper[4962]: I0220 10:14:10.968108 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68bc8f6695-d6bm6" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.014692 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.026541 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.026843 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.031142 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-lzhn7" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.031383 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.031523 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.031795 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.061107 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-fc9c5"] Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.139169 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hkdb\" (UniqueName: \"kubernetes.io/projected/5f79e11c-6024-4774-9ed7-6d08e5b63442-kube-api-access-2hkdb\") pod \"glance-default-internal-api-0\" (UID: \"5f79e11c-6024-4774-9ed7-6d08e5b63442\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.139247 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5f79e11c-6024-4774-9ed7-6d08e5b63442-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5f79e11c-6024-4774-9ed7-6d08e5b63442\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.139312 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f79e11c-6024-4774-9ed7-6d08e5b63442-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5f79e11c-6024-4774-9ed7-6d08e5b63442\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.139342 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f79e11c-6024-4774-9ed7-6d08e5b63442-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5f79e11c-6024-4774-9ed7-6d08e5b63442\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.139401 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"5f79e11c-6024-4774-9ed7-6d08e5b63442\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.139484 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f79e11c-6024-4774-9ed7-6d08e5b63442-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5f79e11c-6024-4774-9ed7-6d08e5b63442\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.139558 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/5f79e11c-6024-4774-9ed7-6d08e5b63442-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"5f79e11c-6024-4774-9ed7-6d08e5b63442\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.139585 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f79e11c-6024-4774-9ed7-6d08e5b63442-logs\") pod \"glance-default-internal-api-0\" (UID: \"5f79e11c-6024-4774-9ed7-6d08e5b63442\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.157928 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.167809 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.167996 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.171290 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.171608 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.244094 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f79e11c-6024-4774-9ed7-6d08e5b63442-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"5f79e11c-6024-4774-9ed7-6d08e5b63442\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.244163 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f79e11c-6024-4774-9ed7-6d08e5b63442-logs\") pod \"glance-default-internal-api-0\" (UID: \"5f79e11c-6024-4774-9ed7-6d08e5b63442\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.244250 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hkdb\" (UniqueName: \"kubernetes.io/projected/5f79e11c-6024-4774-9ed7-6d08e5b63442-kube-api-access-2hkdb\") pod \"glance-default-internal-api-0\" (UID: \"5f79e11c-6024-4774-9ed7-6d08e5b63442\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.244278 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5f79e11c-6024-4774-9ed7-6d08e5b63442-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5f79e11c-6024-4774-9ed7-6d08e5b63442\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.244324 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f79e11c-6024-4774-9ed7-6d08e5b63442-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5f79e11c-6024-4774-9ed7-6d08e5b63442\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.244368 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/5f79e11c-6024-4774-9ed7-6d08e5b63442-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5f79e11c-6024-4774-9ed7-6d08e5b63442\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.244398 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"5f79e11c-6024-4774-9ed7-6d08e5b63442\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.244473 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f79e11c-6024-4774-9ed7-6d08e5b63442-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5f79e11c-6024-4774-9ed7-6d08e5b63442\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.247562 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f79e11c-6024-4774-9ed7-6d08e5b63442-logs\") pod \"glance-default-internal-api-0\" (UID: \"5f79e11c-6024-4774-9ed7-6d08e5b63442\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.248650 4962 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"5f79e11c-6024-4774-9ed7-6d08e5b63442\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-internal-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.249813 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-s4qgr"] Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.256155 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5f79e11c-6024-4774-9ed7-6d08e5b63442-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5f79e11c-6024-4774-9ed7-6d08e5b63442\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.257873 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f79e11c-6024-4774-9ed7-6d08e5b63442-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5f79e11c-6024-4774-9ed7-6d08e5b63442\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.259454 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f79e11c-6024-4774-9ed7-6d08e5b63442-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5f79e11c-6024-4774-9ed7-6d08e5b63442\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.274112 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f79e11c-6024-4774-9ed7-6d08e5b63442-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"5f79e11c-6024-4774-9ed7-6d08e5b63442\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.295646 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-2hkdb\" (UniqueName: \"kubernetes.io/projected/5f79e11c-6024-4774-9ed7-6d08e5b63442-kube-api-access-2hkdb\") pod \"glance-default-internal-api-0\" (UID: \"5f79e11c-6024-4774-9ed7-6d08e5b63442\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.299281 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f79e11c-6024-4774-9ed7-6d08e5b63442-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5f79e11c-6024-4774-9ed7-6d08e5b63442\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.308298 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"5f79e11c-6024-4774-9ed7-6d08e5b63442\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.346960 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e102633-92b3-4a5f-952b-9b3d5d5c8642-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"5e102633-92b3-4a5f-952b-9b3d5d5c8642\") " pod="openstack/glance-default-external-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.347044 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dv9jw\" (UniqueName: \"kubernetes.io/projected/5e102633-92b3-4a5f-952b-9b3d5d5c8642-kube-api-access-dv9jw\") pod \"glance-default-external-api-0\" (UID: \"5e102633-92b3-4a5f-952b-9b3d5d5c8642\") " pod="openstack/glance-default-external-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.347078 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e102633-92b3-4a5f-952b-9b3d5d5c8642-config-data\") pod \"glance-default-external-api-0\" (UID: \"5e102633-92b3-4a5f-952b-9b3d5d5c8642\") " pod="openstack/glance-default-external-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.347100 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5e102633-92b3-4a5f-952b-9b3d5d5c8642-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"5e102633-92b3-4a5f-952b-9b3d5d5c8642\") " pod="openstack/glance-default-external-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.347142 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e102633-92b3-4a5f-952b-9b3d5d5c8642-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"5e102633-92b3-4a5f-952b-9b3d5d5c8642\") " pod="openstack/glance-default-external-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.347165 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e102633-92b3-4a5f-952b-9b3d5d5c8642-logs\") pod \"glance-default-external-api-0\" (UID: \"5e102633-92b3-4a5f-952b-9b3d5d5c8642\") " pod="openstack/glance-default-external-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.347195 4962 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"5e102633-92b3-4a5f-952b-9b3d5d5c8642\") " pod="openstack/glance-default-external-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.347232 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e102633-92b3-4a5f-952b-9b3d5d5c8642-scripts\") pod \"glance-default-external-api-0\" (UID: \"5e102633-92b3-4a5f-952b-9b3d5d5c8642\") " pod="openstack/glance-default-external-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.378403 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.449555 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e102633-92b3-4a5f-952b-9b3d5d5c8642-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"5e102633-92b3-4a5f-952b-9b3d5d5c8642\") " pod="openstack/glance-default-external-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.449695 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dv9jw\" (UniqueName: \"kubernetes.io/projected/5e102633-92b3-4a5f-952b-9b3d5d5c8642-kube-api-access-dv9jw\") pod \"glance-default-external-api-0\" (UID: \"5e102633-92b3-4a5f-952b-9b3d5d5c8642\") " pod="openstack/glance-default-external-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.449741 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e102633-92b3-4a5f-952b-9b3d5d5c8642-config-data\") pod \"glance-default-external-api-0\" (UID: \"5e102633-92b3-4a5f-952b-9b3d5d5c8642\") " pod="openstack/glance-default-external-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.449782 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5e102633-92b3-4a5f-952b-9b3d5d5c8642-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"5e102633-92b3-4a5f-952b-9b3d5d5c8642\") " pod="openstack/glance-default-external-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.449837 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e102633-92b3-4a5f-952b-9b3d5d5c8642-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"5e102633-92b3-4a5f-952b-9b3d5d5c8642\") " pod="openstack/glance-default-external-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.449872 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e102633-92b3-4a5f-952b-9b3d5d5c8642-logs\") pod \"glance-default-external-api-0\" (UID: \"5e102633-92b3-4a5f-952b-9b3d5d5c8642\") " pod="openstack/glance-default-external-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.449913 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"5e102633-92b3-4a5f-952b-9b3d5d5c8642\") " 
pod="openstack/glance-default-external-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.449961 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e102633-92b3-4a5f-952b-9b3d5d5c8642-scripts\") pod \"glance-default-external-api-0\" (UID: \"5e102633-92b3-4a5f-952b-9b3d5d5c8642\") " pod="openstack/glance-default-external-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.451288 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5e102633-92b3-4a5f-952b-9b3d5d5c8642-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"5e102633-92b3-4a5f-952b-9b3d5d5c8642\") " pod="openstack/glance-default-external-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.451821 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e102633-92b3-4a5f-952b-9b3d5d5c8642-logs\") pod \"glance-default-external-api-0\" (UID: \"5e102633-92b3-4a5f-952b-9b3d5d5c8642\") " pod="openstack/glance-default-external-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.459557 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e102633-92b3-4a5f-952b-9b3d5d5c8642-scripts\") pod \"glance-default-external-api-0\" (UID: \"5e102633-92b3-4a5f-952b-9b3d5d5c8642\") " pod="openstack/glance-default-external-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.461920 4962 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"5e102633-92b3-4a5f-952b-9b3d5d5c8642\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-external-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.472639 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e102633-92b3-4a5f-952b-9b3d5d5c8642-config-data\") pod \"glance-default-external-api-0\" (UID: \"5e102633-92b3-4a5f-952b-9b3d5d5c8642\") " pod="openstack/glance-default-external-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.462054 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e102633-92b3-4a5f-952b-9b3d5d5c8642-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"5e102633-92b3-4a5f-952b-9b3d5d5c8642\") " pod="openstack/glance-default-external-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.481560 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-v7sjh"] Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.483415 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e102633-92b3-4a5f-952b-9b3d5d5c8642-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"5e102633-92b3-4a5f-952b-9b3d5d5c8642\") " pod="openstack/glance-default-external-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.511618 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dv9jw\" (UniqueName: \"kubernetes.io/projected/5e102633-92b3-4a5f-952b-9b3d5d5c8642-kube-api-access-dv9jw\") pod \"glance-default-external-api-0\" (UID: 
\"5e102633-92b3-4a5f-952b-9b3d5d5c8642\") " pod="openstack/glance-default-external-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.512745 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-fc9c5" event={"ID":"8741789b-8f62-4fc9-b811-b48d1f72658b","Type":"ContainerStarted","Data":"45fbf95945cbd43ce30cabe021a4e7f4f8190da7cdb4b82be6352fe237f87d78"} Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.516746 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-s4qgr" event={"ID":"14c237ea-eb42-49d4-90db-ee57e3b560e3","Type":"ContainerStarted","Data":"56b25d7c906db1005eebceb9a0a6f02f1965ac71cc6b3cb440b4767a03118405"} Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.525898 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84f6cc7f47-vmlll" event={"ID":"a4660bc9-1b06-4b54-b524-6bfd77e6c1f3","Type":"ContainerStarted","Data":"306dc819f0dbd42e6bb0a1d32df3e770a0d5883fd79bec0c6e92d720cd24ed11"} Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.528724 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"5e102633-92b3-4a5f-952b-9b3d5d5c8642\") " pod="openstack/glance-default-external-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.613651 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-mk67n"] Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.658128 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-smcqr"] Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.686718 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 20 10:14:11 crc kubenswrapper[4962]: W0220 10:14:11.693020 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf95ae2eb_8d20_4549_896d_e6991bfd1e06.slice/crio-15a399f778c9e29fd2309841da0d1240c31760f2d4754f8a287a27c0d443a8fb WatchSource:0}: Error finding container 15a399f778c9e29fd2309841da0d1240c31760f2d4754f8a287a27c0d443a8fb: Status 404 returned error can't find the container with id 15a399f778c9e29fd2309841da0d1240c31760f2d4754f8a287a27c0d443a8fb Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.699546 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.801744 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-68bc8f6695-d6bm6"] Feb 20 10:14:11 crc kubenswrapper[4962]: I0220 10:14:11.973663 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 20 10:14:12 crc kubenswrapper[4962]: I0220 10:14:12.309120 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 10:14:12 crc kubenswrapper[4962]: W0220 10:14:12.336839 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e102633_92b3_4a5f_952b_9b3d5d5c8642.slice/crio-115b83326f93f543740443d2e3a49f92750a0b5b654fc8aa7d5dccbed06506d9 WatchSource:0}: Error finding container 115b83326f93f543740443d2e3a49f92750a0b5b654fc8aa7d5dccbed06506d9: Status 404 returned error can't find the container with id 115b83326f93f543740443d2e3a49f92750a0b5b654fc8aa7d5dccbed06506d9 Feb 20 10:14:12 crc kubenswrapper[4962]: I0220 10:14:12.551724 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-mk67n" event={"ID":"97e25820-62eb-4ad9-92ad-471c2f0f7ed4","Type":"ContainerStarted","Data":"15539448337a5b961d9b0ef7e9cec1129487956e6df6174e8cd859d99a2fb5ff"} Feb 20 10:14:12 crc kubenswrapper[4962]: I0220 10:14:12.555060 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5f79e11c-6024-4774-9ed7-6d08e5b63442","Type":"ContainerStarted","Data":"30186548c67af6b38295049a03fc70d8716a830b2fea8ffe32d0d440d68c2923"} Feb 20 10:14:12 crc kubenswrapper[4962]: I0220 10:14:12.557861 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-v7sjh" event={"ID":"6b114dbd-1f72-42c9-97c1-43795d1cf1ea","Type":"ContainerStarted","Data":"c4884098169c655124365602e35fd187fa28c946c8e4d3fb080909fa29ad7ae0"} Feb 20 10:14:12 crc kubenswrapper[4962]: I0220 10:14:12.557888 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-v7sjh" event={"ID":"6b114dbd-1f72-42c9-97c1-43795d1cf1ea","Type":"ContainerStarted","Data":"dce69904baa73ecb44c9d47c1bcf4bebcf4df75f9166693b47df46309236c541"} Feb 20 10:14:12 crc kubenswrapper[4962]: I0220 10:14:12.563645 4962 generic.go:334] "Generic (PLEG): container finished" podID="a4660bc9-1b06-4b54-b524-6bfd77e6c1f3" containerID="0f06d40b0139e2fd2c899292daba5486f90405f7482adbe25ea92101fbe15c2e" exitCode=0 Feb 20 10:14:12 crc kubenswrapper[4962]: I0220 10:14:12.563720 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84f6cc7f47-vmlll" event={"ID":"a4660bc9-1b06-4b54-b524-6bfd77e6c1f3","Type":"ContainerDied","Data":"0f06d40b0139e2fd2c899292daba5486f90405f7482adbe25ea92101fbe15c2e"} Feb 20 10:14:12 crc kubenswrapper[4962]: I0220 10:14:12.567776 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-fc9c5" event={"ID":"8741789b-8f62-4fc9-b811-b48d1f72658b","Type":"ContainerStarted","Data":"06ae0aace60b853c3274af8b59ad6fe8fb46d990b1106c40d7696cbaaa47e13b"} Feb 20 10:14:12 crc kubenswrapper[4962]: I0220 10:14:12.570469 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f95ae2eb-8d20-4549-896d-e6991bfd1e06","Type":"ContainerStarted","Data":"15a399f778c9e29fd2309841da0d1240c31760f2d4754f8a287a27c0d443a8fb"} Feb 20 10:14:12 crc 
kubenswrapper[4962]: I0220 10:14:12.578666 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-smcqr" event={"ID":"d970dac6-1948-42dd-b5d9-c5df1b04e30d","Type":"ContainerStarted","Data":"e9a2fb5aa6c019bc1ceb09e0e16ebaf81860fd8433b5fa16ea5575cfba68806b"} Feb 20 10:14:12 crc kubenswrapper[4962]: I0220 10:14:12.583109 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-v7sjh" podStartSLOduration=2.5830821090000002 podStartE2EDuration="2.583082109s" podCreationTimestamp="2026-02-20 10:14:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:14:12.574802391 +0000 UTC m=+1144.157274227" watchObservedRunningTime="2026-02-20 10:14:12.583082109 +0000 UTC m=+1144.165553955" Feb 20 10:14:12 crc kubenswrapper[4962]: I0220 10:14:12.583319 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5e102633-92b3-4a5f-952b-9b3d5d5c8642","Type":"ContainerStarted","Data":"115b83326f93f543740443d2e3a49f92750a0b5b654fc8aa7d5dccbed06506d9"} Feb 20 10:14:12 crc kubenswrapper[4962]: I0220 10:14:12.605527 4962 generic.go:334] "Generic (PLEG): container finished" podID="857e020f-54d5-4980-90a2-f19d6f8b5008" containerID="2ed06c9914b443038fe5a8020b56e2a2f1aa8bba18873866ee6a64f32e0d9f5e" exitCode=0 Feb 20 10:14:12 crc kubenswrapper[4962]: I0220 10:14:12.605586 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68bc8f6695-d6bm6" event={"ID":"857e020f-54d5-4980-90a2-f19d6f8b5008","Type":"ContainerDied","Data":"2ed06c9914b443038fe5a8020b56e2a2f1aa8bba18873866ee6a64f32e0d9f5e"} Feb 20 10:14:12 crc kubenswrapper[4962]: I0220 10:14:12.605639 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68bc8f6695-d6bm6" event={"ID":"857e020f-54d5-4980-90a2-f19d6f8b5008","Type":"ContainerStarted","Data":"05f3668ddf59db31b6f76b94605c89fd97fdc7f2e57b881b4ff06bffb9a82723"} Feb 20 10:14:12 crc kubenswrapper[4962]: I0220 10:14:12.616860 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-fc9c5" podStartSLOduration=3.616837328 podStartE2EDuration="3.616837328s" podCreationTimestamp="2026-02-20 10:14:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:14:12.601256364 +0000 UTC m=+1144.183728210" watchObservedRunningTime="2026-02-20 10:14:12.616837328 +0000 UTC m=+1144.199309174" Feb 20 10:14:13 crc kubenswrapper[4962]: I0220 10:14:13.201762 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84f6cc7f47-vmlll" Feb 20 10:14:13 crc kubenswrapper[4962]: I0220 10:14:13.337870 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t42qd\" (UniqueName: \"kubernetes.io/projected/a4660bc9-1b06-4b54-b524-6bfd77e6c1f3-kube-api-access-t42qd\") pod \"a4660bc9-1b06-4b54-b524-6bfd77e6c1f3\" (UID: \"a4660bc9-1b06-4b54-b524-6bfd77e6c1f3\") " Feb 20 10:14:13 crc kubenswrapper[4962]: I0220 10:14:13.337939 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a4660bc9-1b06-4b54-b524-6bfd77e6c1f3-dns-svc\") pod \"a4660bc9-1b06-4b54-b524-6bfd77e6c1f3\" (UID: \"a4660bc9-1b06-4b54-b524-6bfd77e6c1f3\") " Feb 20 10:14:13 crc kubenswrapper[4962]: I0220 10:14:13.338041 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a4660bc9-1b06-4b54-b524-6bfd77e6c1f3-dns-swift-storage-0\") pod \"a4660bc9-1b06-4b54-b524-6bfd77e6c1f3\" (UID: \"a4660bc9-1b06-4b54-b524-6bfd77e6c1f3\") " Feb 20 10:14:13 crc kubenswrapper[4962]: I0220 10:14:13.338159 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a4660bc9-1b06-4b54-b524-6bfd77e6c1f3-ovsdbserver-sb\") pod \"a4660bc9-1b06-4b54-b524-6bfd77e6c1f3\" (UID: \"a4660bc9-1b06-4b54-b524-6bfd77e6c1f3\") " Feb 20 10:14:13 crc kubenswrapper[4962]: I0220 10:14:13.338252 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4660bc9-1b06-4b54-b524-6bfd77e6c1f3-config\") pod \"a4660bc9-1b06-4b54-b524-6bfd77e6c1f3\" (UID: \"a4660bc9-1b06-4b54-b524-6bfd77e6c1f3\") " Feb 20 10:14:13 crc kubenswrapper[4962]: I0220 10:14:13.338404 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a4660bc9-1b06-4b54-b524-6bfd77e6c1f3-ovsdbserver-nb\") pod \"a4660bc9-1b06-4b54-b524-6bfd77e6c1f3\" (UID: \"a4660bc9-1b06-4b54-b524-6bfd77e6c1f3\") " Feb 20 10:14:13 crc kubenswrapper[4962]: I0220 10:14:13.355371 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4660bc9-1b06-4b54-b524-6bfd77e6c1f3-kube-api-access-t42qd" (OuterVolumeSpecName: "kube-api-access-t42qd") pod "a4660bc9-1b06-4b54-b524-6bfd77e6c1f3" (UID: "a4660bc9-1b06-4b54-b524-6bfd77e6c1f3"). InnerVolumeSpecName "kube-api-access-t42qd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:14:13 crc kubenswrapper[4962]: I0220 10:14:13.399130 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 10:14:13 crc kubenswrapper[4962]: I0220 10:14:13.412431 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4660bc9-1b06-4b54-b524-6bfd77e6c1f3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a4660bc9-1b06-4b54-b524-6bfd77e6c1f3" (UID: "a4660bc9-1b06-4b54-b524-6bfd77e6c1f3"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:14:13 crc kubenswrapper[4962]: I0220 10:14:13.416552 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4660bc9-1b06-4b54-b524-6bfd77e6c1f3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a4660bc9-1b06-4b54-b524-6bfd77e6c1f3" (UID: "a4660bc9-1b06-4b54-b524-6bfd77e6c1f3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:14:13 crc kubenswrapper[4962]: I0220 10:14:13.423961 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4660bc9-1b06-4b54-b524-6bfd77e6c1f3-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a4660bc9-1b06-4b54-b524-6bfd77e6c1f3" (UID: "a4660bc9-1b06-4b54-b524-6bfd77e6c1f3"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:14:13 crc kubenswrapper[4962]: I0220 10:14:13.430442 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4660bc9-1b06-4b54-b524-6bfd77e6c1f3-config" (OuterVolumeSpecName: "config") pod "a4660bc9-1b06-4b54-b524-6bfd77e6c1f3" (UID: "a4660bc9-1b06-4b54-b524-6bfd77e6c1f3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:14:13 crc kubenswrapper[4962]: I0220 10:14:13.432898 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4660bc9-1b06-4b54-b524-6bfd77e6c1f3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a4660bc9-1b06-4b54-b524-6bfd77e6c1f3" (UID: "a4660bc9-1b06-4b54-b524-6bfd77e6c1f3"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:14:13 crc kubenswrapper[4962]: I0220 10:14:13.441609 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a4660bc9-1b06-4b54-b524-6bfd77e6c1f3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:13 crc kubenswrapper[4962]: I0220 10:14:13.441840 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4660bc9-1b06-4b54-b524-6bfd77e6c1f3-config\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:13 crc kubenswrapper[4962]: I0220 10:14:13.441906 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a4660bc9-1b06-4b54-b524-6bfd77e6c1f3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:13 crc kubenswrapper[4962]: I0220 10:14:13.441959 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t42qd\" (UniqueName: \"kubernetes.io/projected/a4660bc9-1b06-4b54-b524-6bfd77e6c1f3-kube-api-access-t42qd\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:13 crc kubenswrapper[4962]: I0220 10:14:13.442014 4962 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a4660bc9-1b06-4b54-b524-6bfd77e6c1f3-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:13 crc kubenswrapper[4962]: I0220 10:14:13.442065 4962 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a4660bc9-1b06-4b54-b524-6bfd77e6c1f3-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:13 crc kubenswrapper[4962]: I0220 10:14:13.491946 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 20 
10:14:13 crc kubenswrapper[4962]: I0220 10:14:13.510304 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 20 10:14:13 crc kubenswrapper[4962]: I0220 10:14:13.644774 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68bc8f6695-d6bm6" event={"ID":"857e020f-54d5-4980-90a2-f19d6f8b5008","Type":"ContainerStarted","Data":"9cd5bd763453d1cd2f676217b16b284051d742fb526b5a0bebd50656b842e234"} Feb 20 10:14:13 crc kubenswrapper[4962]: I0220 10:14:13.646070 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-68bc8f6695-d6bm6" Feb 20 10:14:13 crc kubenswrapper[4962]: I0220 10:14:13.655095 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5f79e11c-6024-4774-9ed7-6d08e5b63442","Type":"ContainerStarted","Data":"e1e172d0057e84b49e9b441fa013beb45918149a2c5015cdbce1606c11ce1b14"} Feb 20 10:14:13 crc kubenswrapper[4962]: I0220 10:14:13.672995 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84f6cc7f47-vmlll" Feb 20 10:14:13 crc kubenswrapper[4962]: I0220 10:14:13.673702 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84f6cc7f47-vmlll" event={"ID":"a4660bc9-1b06-4b54-b524-6bfd77e6c1f3","Type":"ContainerDied","Data":"306dc819f0dbd42e6bb0a1d32df3e770a0d5883fd79bec0c6e92d720cd24ed11"} Feb 20 10:14:13 crc kubenswrapper[4962]: I0220 10:14:13.673781 4962 scope.go:117] "RemoveContainer" containerID="0f06d40b0139e2fd2c899292daba5486f90405f7482adbe25ea92101fbe15c2e" Feb 20 10:14:13 crc kubenswrapper[4962]: I0220 10:14:13.688376 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5e102633-92b3-4a5f-952b-9b3d5d5c8642","Type":"ContainerStarted","Data":"30e065f1b9b01eb8f421805f853435dbc45f2b1b158fe7f5032f2d843493ba72"} Feb 20 10:14:13 crc kubenswrapper[4962]: I0220 10:14:13.699946 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-68bc8f6695-d6bm6" podStartSLOduration=3.699909776 podStartE2EDuration="3.699909776s" podCreationTimestamp="2026-02-20 10:14:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:14:13.680605066 +0000 UTC m=+1145.263076912" watchObservedRunningTime="2026-02-20 10:14:13.699909776 +0000 UTC m=+1145.282381622" Feb 20 10:14:13 crc kubenswrapper[4962]: I0220 10:14:13.763085 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84f6cc7f47-vmlll"] Feb 20 10:14:13 crc kubenswrapper[4962]: I0220 10:14:13.773447 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-84f6cc7f47-vmlll"] Feb 20 10:14:14 crc kubenswrapper[4962]: I0220 10:14:14.700218 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5f79e11c-6024-4774-9ed7-6d08e5b63442","Type":"ContainerStarted","Data":"6554ff74166258fd2f7adf967daf521f835c9f3c82b3de1399a5e583de7355aa"} Feb 20 10:14:14 crc kubenswrapper[4962]: I0220 10:14:14.700404 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="5f79e11c-6024-4774-9ed7-6d08e5b63442" containerName="glance-log" containerID="cri-o://e1e172d0057e84b49e9b441fa013beb45918149a2c5015cdbce1606c11ce1b14" gracePeriod=30 Feb 20 10:14:14 
crc kubenswrapper[4962]: I0220 10:14:14.700460 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="5f79e11c-6024-4774-9ed7-6d08e5b63442" containerName="glance-httpd" containerID="cri-o://6554ff74166258fd2f7adf967daf521f835c9f3c82b3de1399a5e583de7355aa" gracePeriod=30 Feb 20 10:14:14 crc kubenswrapper[4962]: I0220 10:14:14.724837 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.724817835 podStartE2EDuration="5.724817835s" podCreationTimestamp="2026-02-20 10:14:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:14:14.722695719 +0000 UTC m=+1146.305167565" watchObservedRunningTime="2026-02-20 10:14:14.724817835 +0000 UTC m=+1146.307289681" Feb 20 10:14:15 crc kubenswrapper[4962]: I0220 10:14:15.169933 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4660bc9-1b06-4b54-b524-6bfd77e6c1f3" path="/var/lib/kubelet/pods/a4660bc9-1b06-4b54-b524-6bfd77e6c1f3/volumes" Feb 20 10:14:15 crc kubenswrapper[4962]: I0220 10:14:15.717945 4962 generic.go:334] "Generic (PLEG): container finished" podID="5f79e11c-6024-4774-9ed7-6d08e5b63442" containerID="6554ff74166258fd2f7adf967daf521f835c9f3c82b3de1399a5e583de7355aa" exitCode=0 Feb 20 10:14:15 crc kubenswrapper[4962]: I0220 10:14:15.718302 4962 generic.go:334] "Generic (PLEG): container finished" podID="5f79e11c-6024-4774-9ed7-6d08e5b63442" containerID="e1e172d0057e84b49e9b441fa013beb45918149a2c5015cdbce1606c11ce1b14" exitCode=143 Feb 20 10:14:15 crc kubenswrapper[4962]: I0220 10:14:15.718019 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5f79e11c-6024-4774-9ed7-6d08e5b63442","Type":"ContainerDied","Data":"6554ff74166258fd2f7adf967daf521f835c9f3c82b3de1399a5e583de7355aa"} Feb 20 10:14:15 crc kubenswrapper[4962]: I0220 10:14:15.718408 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5f79e11c-6024-4774-9ed7-6d08e5b63442","Type":"ContainerDied","Data":"e1e172d0057e84b49e9b441fa013beb45918149a2c5015cdbce1606c11ce1b14"} Feb 20 10:14:15 crc kubenswrapper[4962]: I0220 10:14:15.723041 4962 generic.go:334] "Generic (PLEG): container finished" podID="8741789b-8f62-4fc9-b811-b48d1f72658b" containerID="06ae0aace60b853c3274af8b59ad6fe8fb46d990b1106c40d7696cbaaa47e13b" exitCode=0 Feb 20 10:14:15 crc kubenswrapper[4962]: I0220 10:14:15.723190 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-fc9c5" event={"ID":"8741789b-8f62-4fc9-b811-b48d1f72658b","Type":"ContainerDied","Data":"06ae0aace60b853c3274af8b59ad6fe8fb46d990b1106c40d7696cbaaa47e13b"} Feb 20 10:14:15 crc kubenswrapper[4962]: I0220 10:14:15.726216 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5e102633-92b3-4a5f-952b-9b3d5d5c8642","Type":"ContainerStarted","Data":"2123a048dc318acb29d67e4388556f6e325d9c8f02443300d2653d8afdf953b1"} Feb 20 10:14:15 crc kubenswrapper[4962]: I0220 10:14:15.726356 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="5e102633-92b3-4a5f-952b-9b3d5d5c8642" containerName="glance-log" containerID="cri-o://30e065f1b9b01eb8f421805f853435dbc45f2b1b158fe7f5032f2d843493ba72" gracePeriod=30 Feb 20 
10:14:15 crc kubenswrapper[4962]: I0220 10:14:15.726484 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="5e102633-92b3-4a5f-952b-9b3d5d5c8642" containerName="glance-httpd" containerID="cri-o://2123a048dc318acb29d67e4388556f6e325d9c8f02443300d2653d8afdf953b1" gracePeriod=30 Feb 20 10:14:15 crc kubenswrapper[4962]: I0220 10:14:15.776922 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.776894659 podStartE2EDuration="5.776894659s" podCreationTimestamp="2026-02-20 10:14:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:14:15.767351203 +0000 UTC m=+1147.349823049" watchObservedRunningTime="2026-02-20 10:14:15.776894659 +0000 UTC m=+1147.359366505" Feb 20 10:14:16 crc kubenswrapper[4962]: I0220 10:14:16.744139 4962 generic.go:334] "Generic (PLEG): container finished" podID="5e102633-92b3-4a5f-952b-9b3d5d5c8642" containerID="2123a048dc318acb29d67e4388556f6e325d9c8f02443300d2653d8afdf953b1" exitCode=0 Feb 20 10:14:16 crc kubenswrapper[4962]: I0220 10:14:16.744229 4962 generic.go:334] "Generic (PLEG): container finished" podID="5e102633-92b3-4a5f-952b-9b3d5d5c8642" containerID="30e065f1b9b01eb8f421805f853435dbc45f2b1b158fe7f5032f2d843493ba72" exitCode=143 Feb 20 10:14:16 crc kubenswrapper[4962]: I0220 10:14:16.744246 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5e102633-92b3-4a5f-952b-9b3d5d5c8642","Type":"ContainerDied","Data":"2123a048dc318acb29d67e4388556f6e325d9c8f02443300d2653d8afdf953b1"} Feb 20 10:14:16 crc kubenswrapper[4962]: I0220 10:14:16.744337 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5e102633-92b3-4a5f-952b-9b3d5d5c8642","Type":"ContainerDied","Data":"30e065f1b9b01eb8f421805f853435dbc45f2b1b158fe7f5032f2d843493ba72"} Feb 20 10:14:17 crc kubenswrapper[4962]: I0220 10:14:17.374552 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-fc9c5" Feb 20 10:14:17 crc kubenswrapper[4962]: I0220 10:14:17.450839 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8741789b-8f62-4fc9-b811-b48d1f72658b-scripts\") pod \"8741789b-8f62-4fc9-b811-b48d1f72658b\" (UID: \"8741789b-8f62-4fc9-b811-b48d1f72658b\") " Feb 20 10:14:17 crc kubenswrapper[4962]: I0220 10:14:17.450902 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8741789b-8f62-4fc9-b811-b48d1f72658b-combined-ca-bundle\") pod \"8741789b-8f62-4fc9-b811-b48d1f72658b\" (UID: \"8741789b-8f62-4fc9-b811-b48d1f72658b\") " Feb 20 10:14:17 crc kubenswrapper[4962]: I0220 10:14:17.450952 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8741789b-8f62-4fc9-b811-b48d1f72658b-config-data\") pod \"8741789b-8f62-4fc9-b811-b48d1f72658b\" (UID: \"8741789b-8f62-4fc9-b811-b48d1f72658b\") " Feb 20 10:14:17 crc kubenswrapper[4962]: I0220 10:14:17.450984 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8741789b-8f62-4fc9-b811-b48d1f72658b-credential-keys\") pod \"8741789b-8f62-4fc9-b811-b48d1f72658b\" (UID: \"8741789b-8f62-4fc9-b811-b48d1f72658b\") " Feb 20 10:14:17 crc kubenswrapper[4962]: I0220 10:14:17.451035 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8741789b-8f62-4fc9-b811-b48d1f72658b-fernet-keys\") pod \"8741789b-8f62-4fc9-b811-b48d1f72658b\" (UID: \"8741789b-8f62-4fc9-b811-b48d1f72658b\") " Feb 20 10:14:17 crc kubenswrapper[4962]: I0220 10:14:17.451104 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zchrh\" (UniqueName: \"kubernetes.io/projected/8741789b-8f62-4fc9-b811-b48d1f72658b-kube-api-access-zchrh\") pod \"8741789b-8f62-4fc9-b811-b48d1f72658b\" (UID: \"8741789b-8f62-4fc9-b811-b48d1f72658b\") " Feb 20 10:14:17 crc kubenswrapper[4962]: I0220 10:14:17.460676 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8741789b-8f62-4fc9-b811-b48d1f72658b-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "8741789b-8f62-4fc9-b811-b48d1f72658b" (UID: "8741789b-8f62-4fc9-b811-b48d1f72658b"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:14:17 crc kubenswrapper[4962]: I0220 10:14:17.465002 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8741789b-8f62-4fc9-b811-b48d1f72658b-kube-api-access-zchrh" (OuterVolumeSpecName: "kube-api-access-zchrh") pod "8741789b-8f62-4fc9-b811-b48d1f72658b" (UID: "8741789b-8f62-4fc9-b811-b48d1f72658b"). InnerVolumeSpecName "kube-api-access-zchrh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:14:17 crc kubenswrapper[4962]: I0220 10:14:17.474118 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8741789b-8f62-4fc9-b811-b48d1f72658b-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "8741789b-8f62-4fc9-b811-b48d1f72658b" (UID: "8741789b-8f62-4fc9-b811-b48d1f72658b"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:14:17 crc kubenswrapper[4962]: I0220 10:14:17.484121 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8741789b-8f62-4fc9-b811-b48d1f72658b-scripts" (OuterVolumeSpecName: "scripts") pod "8741789b-8f62-4fc9-b811-b48d1f72658b" (UID: "8741789b-8f62-4fc9-b811-b48d1f72658b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:14:17 crc kubenswrapper[4962]: I0220 10:14:17.490066 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8741789b-8f62-4fc9-b811-b48d1f72658b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8741789b-8f62-4fc9-b811-b48d1f72658b" (UID: "8741789b-8f62-4fc9-b811-b48d1f72658b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:14:17 crc kubenswrapper[4962]: I0220 10:14:17.508449 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8741789b-8f62-4fc9-b811-b48d1f72658b-config-data" (OuterVolumeSpecName: "config-data") pod "8741789b-8f62-4fc9-b811-b48d1f72658b" (UID: "8741789b-8f62-4fc9-b811-b48d1f72658b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:14:17 crc kubenswrapper[4962]: I0220 10:14:17.553723 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8741789b-8f62-4fc9-b811-b48d1f72658b-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:17 crc kubenswrapper[4962]: I0220 10:14:17.553768 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8741789b-8f62-4fc9-b811-b48d1f72658b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:17 crc kubenswrapper[4962]: I0220 10:14:17.553781 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8741789b-8f62-4fc9-b811-b48d1f72658b-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:17 crc kubenswrapper[4962]: I0220 10:14:17.553791 4962 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8741789b-8f62-4fc9-b811-b48d1f72658b-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:17 crc kubenswrapper[4962]: I0220 10:14:17.554181 4962 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8741789b-8f62-4fc9-b811-b48d1f72658b-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:17 crc kubenswrapper[4962]: I0220 10:14:17.554197 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zchrh\" (UniqueName: \"kubernetes.io/projected/8741789b-8f62-4fc9-b811-b48d1f72658b-kube-api-access-zchrh\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:17 crc kubenswrapper[4962]: I0220 10:14:17.764530 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-fc9c5" event={"ID":"8741789b-8f62-4fc9-b811-b48d1f72658b","Type":"ContainerDied","Data":"45fbf95945cbd43ce30cabe021a4e7f4f8190da7cdb4b82be6352fe237f87d78"} Feb 20 10:14:17 crc kubenswrapper[4962]: I0220 10:14:17.764582 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="45fbf95945cbd43ce30cabe021a4e7f4f8190da7cdb4b82be6352fe237f87d78" Feb 20 10:14:17 crc kubenswrapper[4962]: I0220 10:14:17.764679 4962 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-fc9c5" Feb 20 10:14:17 crc kubenswrapper[4962]: I0220 10:14:17.823674 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-fc9c5"] Feb 20 10:14:17 crc kubenswrapper[4962]: I0220 10:14:17.833877 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-fc9c5"] Feb 20 10:14:17 crc kubenswrapper[4962]: I0220 10:14:17.917940 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-r4hdf"] Feb 20 10:14:17 crc kubenswrapper[4962]: E0220 10:14:17.919006 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4660bc9-1b06-4b54-b524-6bfd77e6c1f3" containerName="init" Feb 20 10:14:17 crc kubenswrapper[4962]: I0220 10:14:17.919116 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4660bc9-1b06-4b54-b524-6bfd77e6c1f3" containerName="init" Feb 20 10:14:17 crc kubenswrapper[4962]: E0220 10:14:17.919224 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8741789b-8f62-4fc9-b811-b48d1f72658b" containerName="keystone-bootstrap" Feb 20 10:14:17 crc kubenswrapper[4962]: I0220 10:14:17.919301 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="8741789b-8f62-4fc9-b811-b48d1f72658b" containerName="keystone-bootstrap" Feb 20 10:14:17 crc kubenswrapper[4962]: I0220 10:14:17.919953 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="8741789b-8f62-4fc9-b811-b48d1f72658b" containerName="keystone-bootstrap" Feb 20 10:14:17 crc kubenswrapper[4962]: I0220 10:14:17.920173 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4660bc9-1b06-4b54-b524-6bfd77e6c1f3" containerName="init" Feb 20 10:14:17 crc kubenswrapper[4962]: I0220 10:14:17.921230 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-r4hdf" Feb 20 10:14:17 crc kubenswrapper[4962]: I0220 10:14:17.925214 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 20 10:14:17 crc kubenswrapper[4962]: I0220 10:14:17.926101 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 20 10:14:17 crc kubenswrapper[4962]: I0220 10:14:17.926702 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 20 10:14:17 crc kubenswrapper[4962]: I0220 10:14:17.926964 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-76ldt" Feb 20 10:14:17 crc kubenswrapper[4962]: I0220 10:14:17.927280 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 20 10:14:17 crc kubenswrapper[4962]: I0220 10:14:17.930838 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-r4hdf"] Feb 20 10:14:17 crc kubenswrapper[4962]: I0220 10:14:17.962622 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37eccece-549c-4b2f-b066-481b216d7ece-scripts\") pod \"keystone-bootstrap-r4hdf\" (UID: \"37eccece-549c-4b2f-b066-481b216d7ece\") " pod="openstack/keystone-bootstrap-r4hdf" Feb 20 10:14:17 crc kubenswrapper[4962]: I0220 10:14:17.962692 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37eccece-549c-4b2f-b066-481b216d7ece-config-data\") pod \"keystone-bootstrap-r4hdf\" (UID: \"37eccece-549c-4b2f-b066-481b216d7ece\") " pod="openstack/keystone-bootstrap-r4hdf" Feb 20 10:14:17 crc kubenswrapper[4962]: I0220 10:14:17.962727 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h688g\" (UniqueName: \"kubernetes.io/projected/37eccece-549c-4b2f-b066-481b216d7ece-kube-api-access-h688g\") pod \"keystone-bootstrap-r4hdf\" (UID: \"37eccece-549c-4b2f-b066-481b216d7ece\") " pod="openstack/keystone-bootstrap-r4hdf" Feb 20 10:14:17 crc kubenswrapper[4962]: I0220 10:14:17.962753 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/37eccece-549c-4b2f-b066-481b216d7ece-credential-keys\") pod \"keystone-bootstrap-r4hdf\" (UID: \"37eccece-549c-4b2f-b066-481b216d7ece\") " pod="openstack/keystone-bootstrap-r4hdf" Feb 20 10:14:17 crc kubenswrapper[4962]: I0220 10:14:17.962834 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37eccece-549c-4b2f-b066-481b216d7ece-combined-ca-bundle\") pod \"keystone-bootstrap-r4hdf\" (UID: \"37eccece-549c-4b2f-b066-481b216d7ece\") " pod="openstack/keystone-bootstrap-r4hdf" Feb 20 10:14:17 crc kubenswrapper[4962]: I0220 10:14:17.962902 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/37eccece-549c-4b2f-b066-481b216d7ece-fernet-keys\") pod \"keystone-bootstrap-r4hdf\" (UID: \"37eccece-549c-4b2f-b066-481b216d7ece\") " pod="openstack/keystone-bootstrap-r4hdf" Feb 20 10:14:18 crc kubenswrapper[4962]: I0220 10:14:18.064154 4962 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37eccece-549c-4b2f-b066-481b216d7ece-scripts\") pod \"keystone-bootstrap-r4hdf\" (UID: \"37eccece-549c-4b2f-b066-481b216d7ece\") " pod="openstack/keystone-bootstrap-r4hdf" Feb 20 10:14:18 crc kubenswrapper[4962]: I0220 10:14:18.064235 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37eccece-549c-4b2f-b066-481b216d7ece-config-data\") pod \"keystone-bootstrap-r4hdf\" (UID: \"37eccece-549c-4b2f-b066-481b216d7ece\") " pod="openstack/keystone-bootstrap-r4hdf" Feb 20 10:14:18 crc kubenswrapper[4962]: I0220 10:14:18.064263 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h688g\" (UniqueName: \"kubernetes.io/projected/37eccece-549c-4b2f-b066-481b216d7ece-kube-api-access-h688g\") pod \"keystone-bootstrap-r4hdf\" (UID: \"37eccece-549c-4b2f-b066-481b216d7ece\") " pod="openstack/keystone-bootstrap-r4hdf" Feb 20 10:14:18 crc kubenswrapper[4962]: I0220 10:14:18.064284 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/37eccece-549c-4b2f-b066-481b216d7ece-credential-keys\") pod \"keystone-bootstrap-r4hdf\" (UID: \"37eccece-549c-4b2f-b066-481b216d7ece\") " pod="openstack/keystone-bootstrap-r4hdf" Feb 20 10:14:18 crc kubenswrapper[4962]: I0220 10:14:18.064332 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37eccece-549c-4b2f-b066-481b216d7ece-combined-ca-bundle\") pod \"keystone-bootstrap-r4hdf\" (UID: \"37eccece-549c-4b2f-b066-481b216d7ece\") " pod="openstack/keystone-bootstrap-r4hdf" Feb 20 10:14:18 crc kubenswrapper[4962]: I0220 10:14:18.064393 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/37eccece-549c-4b2f-b066-481b216d7ece-fernet-keys\") pod \"keystone-bootstrap-r4hdf\" (UID: \"37eccece-549c-4b2f-b066-481b216d7ece\") " pod="openstack/keystone-bootstrap-r4hdf" Feb 20 10:14:18 crc kubenswrapper[4962]: I0220 10:14:18.069004 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/37eccece-549c-4b2f-b066-481b216d7ece-fernet-keys\") pod \"keystone-bootstrap-r4hdf\" (UID: \"37eccece-549c-4b2f-b066-481b216d7ece\") " pod="openstack/keystone-bootstrap-r4hdf" Feb 20 10:14:18 crc kubenswrapper[4962]: I0220 10:14:18.069342 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37eccece-549c-4b2f-b066-481b216d7ece-combined-ca-bundle\") pod \"keystone-bootstrap-r4hdf\" (UID: \"37eccece-549c-4b2f-b066-481b216d7ece\") " pod="openstack/keystone-bootstrap-r4hdf" Feb 20 10:14:18 crc kubenswrapper[4962]: I0220 10:14:18.069689 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37eccece-549c-4b2f-b066-481b216d7ece-scripts\") pod \"keystone-bootstrap-r4hdf\" (UID: \"37eccece-549c-4b2f-b066-481b216d7ece\") " pod="openstack/keystone-bootstrap-r4hdf" Feb 20 10:14:18 crc kubenswrapper[4962]: I0220 10:14:18.078372 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/37eccece-549c-4b2f-b066-481b216d7ece-credential-keys\") pod \"keystone-bootstrap-r4hdf\" (UID: 
\"37eccece-549c-4b2f-b066-481b216d7ece\") " pod="openstack/keystone-bootstrap-r4hdf" Feb 20 10:14:18 crc kubenswrapper[4962]: I0220 10:14:18.078700 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37eccece-549c-4b2f-b066-481b216d7ece-config-data\") pod \"keystone-bootstrap-r4hdf\" (UID: \"37eccece-549c-4b2f-b066-481b216d7ece\") " pod="openstack/keystone-bootstrap-r4hdf" Feb 20 10:14:18 crc kubenswrapper[4962]: I0220 10:14:18.085829 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h688g\" (UniqueName: \"kubernetes.io/projected/37eccece-549c-4b2f-b066-481b216d7ece-kube-api-access-h688g\") pod \"keystone-bootstrap-r4hdf\" (UID: \"37eccece-549c-4b2f-b066-481b216d7ece\") " pod="openstack/keystone-bootstrap-r4hdf" Feb 20 10:14:18 crc kubenswrapper[4962]: I0220 10:14:18.242990 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-r4hdf" Feb 20 10:14:19 crc kubenswrapper[4962]: I0220 10:14:19.154358 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8741789b-8f62-4fc9-b811-b48d1f72658b" path="/var/lib/kubelet/pods/8741789b-8f62-4fc9-b811-b48d1f72658b/volumes" Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.333994 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.465705 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f79e11c-6024-4774-9ed7-6d08e5b63442-config-data\") pod \"5f79e11c-6024-4774-9ed7-6d08e5b63442\" (UID: \"5f79e11c-6024-4774-9ed7-6d08e5b63442\") " Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.465815 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2hkdb\" (UniqueName: \"kubernetes.io/projected/5f79e11c-6024-4774-9ed7-6d08e5b63442-kube-api-access-2hkdb\") pod \"5f79e11c-6024-4774-9ed7-6d08e5b63442\" (UID: \"5f79e11c-6024-4774-9ed7-6d08e5b63442\") " Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.465897 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f79e11c-6024-4774-9ed7-6d08e5b63442-combined-ca-bundle\") pod \"5f79e11c-6024-4774-9ed7-6d08e5b63442\" (UID: \"5f79e11c-6024-4774-9ed7-6d08e5b63442\") " Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.466057 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f79e11c-6024-4774-9ed7-6d08e5b63442-internal-tls-certs\") pod \"5f79e11c-6024-4774-9ed7-6d08e5b63442\" (UID: \"5f79e11c-6024-4774-9ed7-6d08e5b63442\") " Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.468267 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f79e11c-6024-4774-9ed7-6d08e5b63442-scripts\") pod \"5f79e11c-6024-4774-9ed7-6d08e5b63442\" (UID: \"5f79e11c-6024-4774-9ed7-6d08e5b63442\") " Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.468336 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5f79e11c-6024-4774-9ed7-6d08e5b63442-httpd-run\") pod \"5f79e11c-6024-4774-9ed7-6d08e5b63442\" (UID: 
\"5f79e11c-6024-4774-9ed7-6d08e5b63442\") " Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.468520 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f79e11c-6024-4774-9ed7-6d08e5b63442-logs\") pod \"5f79e11c-6024-4774-9ed7-6d08e5b63442\" (UID: \"5f79e11c-6024-4774-9ed7-6d08e5b63442\") " Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.468577 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"5f79e11c-6024-4774-9ed7-6d08e5b63442\" (UID: \"5f79e11c-6024-4774-9ed7-6d08e5b63442\") " Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.469457 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f79e11c-6024-4774-9ed7-6d08e5b63442-logs" (OuterVolumeSpecName: "logs") pod "5f79e11c-6024-4774-9ed7-6d08e5b63442" (UID: "5f79e11c-6024-4774-9ed7-6d08e5b63442"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.469578 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f79e11c-6024-4774-9ed7-6d08e5b63442-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "5f79e11c-6024-4774-9ed7-6d08e5b63442" (UID: "5f79e11c-6024-4774-9ed7-6d08e5b63442"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.477793 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f79e11c-6024-4774-9ed7-6d08e5b63442-kube-api-access-2hkdb" (OuterVolumeSpecName: "kube-api-access-2hkdb") pod "5f79e11c-6024-4774-9ed7-6d08e5b63442" (UID: "5f79e11c-6024-4774-9ed7-6d08e5b63442"). InnerVolumeSpecName "kube-api-access-2hkdb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.477895 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "5f79e11c-6024-4774-9ed7-6d08e5b63442" (UID: "5f79e11c-6024-4774-9ed7-6d08e5b63442"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.491739 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f79e11c-6024-4774-9ed7-6d08e5b63442-scripts" (OuterVolumeSpecName: "scripts") pod "5f79e11c-6024-4774-9ed7-6d08e5b63442" (UID: "5f79e11c-6024-4774-9ed7-6d08e5b63442"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.518992 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f79e11c-6024-4774-9ed7-6d08e5b63442-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5f79e11c-6024-4774-9ed7-6d08e5b63442" (UID: "5f79e11c-6024-4774-9ed7-6d08e5b63442"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.536756 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f79e11c-6024-4774-9ed7-6d08e5b63442-config-data" (OuterVolumeSpecName: "config-data") pod "5f79e11c-6024-4774-9ed7-6d08e5b63442" (UID: "5f79e11c-6024-4774-9ed7-6d08e5b63442"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.552467 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f79e11c-6024-4774-9ed7-6d08e5b63442-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "5f79e11c-6024-4774-9ed7-6d08e5b63442" (UID: "5f79e11c-6024-4774-9ed7-6d08e5b63442"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.572288 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f79e11c-6024-4774-9ed7-6d08e5b63442-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.572348 4962 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5f79e11c-6024-4774-9ed7-6d08e5b63442-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.572366 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f79e11c-6024-4774-9ed7-6d08e5b63442-logs\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.572421 4962 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.572435 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f79e11c-6024-4774-9ed7-6d08e5b63442-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.572448 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2hkdb\" (UniqueName: \"kubernetes.io/projected/5f79e11c-6024-4774-9ed7-6d08e5b63442-kube-api-access-2hkdb\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.572461 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f79e11c-6024-4774-9ed7-6d08e5b63442-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.572472 4962 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f79e11c-6024-4774-9ed7-6d08e5b63442-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.601557 4962 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.676778 4962 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 
10:14:20.798640 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5f79e11c-6024-4774-9ed7-6d08e5b63442","Type":"ContainerDied","Data":"30186548c67af6b38295049a03fc70d8716a830b2fea8ffe32d0d440d68c2923"} Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.798921 4962 scope.go:117] "RemoveContainer" containerID="6554ff74166258fd2f7adf967daf521f835c9f3c82b3de1399a5e583de7355aa" Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.798986 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.855764 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.870995 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.882879 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 20 10:14:20 crc kubenswrapper[4962]: E0220 10:14:20.883441 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f79e11c-6024-4774-9ed7-6d08e5b63442" containerName="glance-log" Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.883460 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f79e11c-6024-4774-9ed7-6d08e5b63442" containerName="glance-log" Feb 20 10:14:20 crc kubenswrapper[4962]: E0220 10:14:20.883497 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f79e11c-6024-4774-9ed7-6d08e5b63442" containerName="glance-httpd" Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.883507 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f79e11c-6024-4774-9ed7-6d08e5b63442" containerName="glance-httpd" Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.883719 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f79e11c-6024-4774-9ed7-6d08e5b63442" containerName="glance-log" Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.883744 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f79e11c-6024-4774-9ed7-6d08e5b63442" containerName="glance-httpd" Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.885221 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.890201 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.890634 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.892424 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.970762 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-68bc8f6695-d6bm6" Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.983012 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5d49a0b0-18e1-4701-9a94-5ff22700ffdf-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5d49a0b0-18e1-4701-9a94-5ff22700ffdf\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.983060 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d49a0b0-18e1-4701-9a94-5ff22700ffdf-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5d49a0b0-18e1-4701-9a94-5ff22700ffdf\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.983103 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d49a0b0-18e1-4701-9a94-5ff22700ffdf-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"5d49a0b0-18e1-4701-9a94-5ff22700ffdf\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.983144 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"5d49a0b0-18e1-4701-9a94-5ff22700ffdf\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.983161 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smt92\" (UniqueName: \"kubernetes.io/projected/5d49a0b0-18e1-4701-9a94-5ff22700ffdf-kube-api-access-smt92\") pod \"glance-default-internal-api-0\" (UID: \"5d49a0b0-18e1-4701-9a94-5ff22700ffdf\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.983189 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d49a0b0-18e1-4701-9a94-5ff22700ffdf-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5d49a0b0-18e1-4701-9a94-5ff22700ffdf\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.983263 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d49a0b0-18e1-4701-9a94-5ff22700ffdf-logs\") pod \"glance-default-internal-api-0\" (UID: 
\"5d49a0b0-18e1-4701-9a94-5ff22700ffdf\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:14:20 crc kubenswrapper[4962]: I0220 10:14:20.983364 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d49a0b0-18e1-4701-9a94-5ff22700ffdf-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5d49a0b0-18e1-4701-9a94-5ff22700ffdf\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:14:21 crc kubenswrapper[4962]: I0220 10:14:21.040578 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-69577ff67f-kvhqf"] Feb 20 10:14:21 crc kubenswrapper[4962]: I0220 10:14:21.040919 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-69577ff67f-kvhqf" podUID="90cdf678-dd6c-4f3b-a675-4803eddcfc44" containerName="dnsmasq-dns" containerID="cri-o://e8217d8b6999873bb7898fa52e25d272e12c4c8cfaefad4b4afe6a4bca8b3bab" gracePeriod=10 Feb 20 10:14:21 crc kubenswrapper[4962]: I0220 10:14:21.085821 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d49a0b0-18e1-4701-9a94-5ff22700ffdf-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5d49a0b0-18e1-4701-9a94-5ff22700ffdf\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:14:21 crc kubenswrapper[4962]: I0220 10:14:21.086030 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d49a0b0-18e1-4701-9a94-5ff22700ffdf-logs\") pod \"glance-default-internal-api-0\" (UID: \"5d49a0b0-18e1-4701-9a94-5ff22700ffdf\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:14:21 crc kubenswrapper[4962]: I0220 10:14:21.086080 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d49a0b0-18e1-4701-9a94-5ff22700ffdf-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5d49a0b0-18e1-4701-9a94-5ff22700ffdf\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:14:21 crc kubenswrapper[4962]: I0220 10:14:21.086114 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5d49a0b0-18e1-4701-9a94-5ff22700ffdf-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5d49a0b0-18e1-4701-9a94-5ff22700ffdf\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:14:21 crc kubenswrapper[4962]: I0220 10:14:21.086172 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d49a0b0-18e1-4701-9a94-5ff22700ffdf-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5d49a0b0-18e1-4701-9a94-5ff22700ffdf\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:14:21 crc kubenswrapper[4962]: I0220 10:14:21.086257 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d49a0b0-18e1-4701-9a94-5ff22700ffdf-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"5d49a0b0-18e1-4701-9a94-5ff22700ffdf\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:14:21 crc kubenswrapper[4962]: I0220 10:14:21.086295 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod 
\"glance-default-internal-api-0\" (UID: \"5d49a0b0-18e1-4701-9a94-5ff22700ffdf\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:14:21 crc kubenswrapper[4962]: I0220 10:14:21.086323 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smt92\" (UniqueName: \"kubernetes.io/projected/5d49a0b0-18e1-4701-9a94-5ff22700ffdf-kube-api-access-smt92\") pod \"glance-default-internal-api-0\" (UID: \"5d49a0b0-18e1-4701-9a94-5ff22700ffdf\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:14:21 crc kubenswrapper[4962]: I0220 10:14:21.087298 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5d49a0b0-18e1-4701-9a94-5ff22700ffdf-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5d49a0b0-18e1-4701-9a94-5ff22700ffdf\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:14:21 crc kubenswrapper[4962]: I0220 10:14:21.087514 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d49a0b0-18e1-4701-9a94-5ff22700ffdf-logs\") pod \"glance-default-internal-api-0\" (UID: \"5d49a0b0-18e1-4701-9a94-5ff22700ffdf\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:14:21 crc kubenswrapper[4962]: I0220 10:14:21.088101 4962 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"5d49a0b0-18e1-4701-9a94-5ff22700ffdf\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-internal-api-0" Feb 20 10:14:21 crc kubenswrapper[4962]: I0220 10:14:21.095321 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d49a0b0-18e1-4701-9a94-5ff22700ffdf-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5d49a0b0-18e1-4701-9a94-5ff22700ffdf\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:14:21 crc kubenswrapper[4962]: I0220 10:14:21.095329 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d49a0b0-18e1-4701-9a94-5ff22700ffdf-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5d49a0b0-18e1-4701-9a94-5ff22700ffdf\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:14:21 crc kubenswrapper[4962]: I0220 10:14:21.096823 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d49a0b0-18e1-4701-9a94-5ff22700ffdf-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5d49a0b0-18e1-4701-9a94-5ff22700ffdf\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:14:21 crc kubenswrapper[4962]: I0220 10:14:21.099525 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d49a0b0-18e1-4701-9a94-5ff22700ffdf-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"5d49a0b0-18e1-4701-9a94-5ff22700ffdf\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:14:21 crc kubenswrapper[4962]: I0220 10:14:21.108286 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smt92\" (UniqueName: \"kubernetes.io/projected/5d49a0b0-18e1-4701-9a94-5ff22700ffdf-kube-api-access-smt92\") pod \"glance-default-internal-api-0\" (UID: \"5d49a0b0-18e1-4701-9a94-5ff22700ffdf\") " 
pod="openstack/glance-default-internal-api-0" Feb 20 10:14:21 crc kubenswrapper[4962]: I0220 10:14:21.148864 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"5d49a0b0-18e1-4701-9a94-5ff22700ffdf\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:14:21 crc kubenswrapper[4962]: I0220 10:14:21.167028 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f79e11c-6024-4774-9ed7-6d08e5b63442" path="/var/lib/kubelet/pods/5f79e11c-6024-4774-9ed7-6d08e5b63442/volumes" Feb 20 10:14:21 crc kubenswrapper[4962]: I0220 10:14:21.215350 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 20 10:14:21 crc kubenswrapper[4962]: I0220 10:14:21.829974 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-69577ff67f-kvhqf" podUID="90cdf678-dd6c-4f3b-a675-4803eddcfc44" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.129:5353: connect: connection refused" Feb 20 10:14:21 crc kubenswrapper[4962]: I0220 10:14:21.837749 4962 generic.go:334] "Generic (PLEG): container finished" podID="90cdf678-dd6c-4f3b-a675-4803eddcfc44" containerID="e8217d8b6999873bb7898fa52e25d272e12c4c8cfaefad4b4afe6a4bca8b3bab" exitCode=0 Feb 20 10:14:21 crc kubenswrapper[4962]: I0220 10:14:21.837813 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69577ff67f-kvhqf" event={"ID":"90cdf678-dd6c-4f3b-a675-4803eddcfc44","Type":"ContainerDied","Data":"e8217d8b6999873bb7898fa52e25d272e12c4c8cfaefad4b4afe6a4bca8b3bab"} Feb 20 10:14:23 crc kubenswrapper[4962]: I0220 10:14:23.461156 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 20 10:14:23 crc kubenswrapper[4962]: I0220 10:14:23.546864 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"5e102633-92b3-4a5f-952b-9b3d5d5c8642\" (UID: \"5e102633-92b3-4a5f-952b-9b3d5d5c8642\") " Feb 20 10:14:23 crc kubenswrapper[4962]: I0220 10:14:23.548271 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e102633-92b3-4a5f-952b-9b3d5d5c8642-scripts\") pod \"5e102633-92b3-4a5f-952b-9b3d5d5c8642\" (UID: \"5e102633-92b3-4a5f-952b-9b3d5d5c8642\") " Feb 20 10:14:23 crc kubenswrapper[4962]: I0220 10:14:23.548322 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5e102633-92b3-4a5f-952b-9b3d5d5c8642-httpd-run\") pod \"5e102633-92b3-4a5f-952b-9b3d5d5c8642\" (UID: \"5e102633-92b3-4a5f-952b-9b3d5d5c8642\") " Feb 20 10:14:23 crc kubenswrapper[4962]: I0220 10:14:23.548456 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dv9jw\" (UniqueName: \"kubernetes.io/projected/5e102633-92b3-4a5f-952b-9b3d5d5c8642-kube-api-access-dv9jw\") pod \"5e102633-92b3-4a5f-952b-9b3d5d5c8642\" (UID: \"5e102633-92b3-4a5f-952b-9b3d5d5c8642\") " Feb 20 10:14:23 crc kubenswrapper[4962]: I0220 10:14:23.548776 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e102633-92b3-4a5f-952b-9b3d5d5c8642-config-data\") pod \"5e102633-92b3-4a5f-952b-9b3d5d5c8642\" (UID: \"5e102633-92b3-4a5f-952b-9b3d5d5c8642\") " Feb 20 10:14:23 crc kubenswrapper[4962]: I0220 10:14:23.549077 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e102633-92b3-4a5f-952b-9b3d5d5c8642-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "5e102633-92b3-4a5f-952b-9b3d5d5c8642" (UID: "5e102633-92b3-4a5f-952b-9b3d5d5c8642"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:14:23 crc kubenswrapper[4962]: I0220 10:14:23.549345 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e102633-92b3-4a5f-952b-9b3d5d5c8642-logs\") pod \"5e102633-92b3-4a5f-952b-9b3d5d5c8642\" (UID: \"5e102633-92b3-4a5f-952b-9b3d5d5c8642\") " Feb 20 10:14:23 crc kubenswrapper[4962]: I0220 10:14:23.549420 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e102633-92b3-4a5f-952b-9b3d5d5c8642-public-tls-certs\") pod \"5e102633-92b3-4a5f-952b-9b3d5d5c8642\" (UID: \"5e102633-92b3-4a5f-952b-9b3d5d5c8642\") " Feb 20 10:14:23 crc kubenswrapper[4962]: I0220 10:14:23.549457 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e102633-92b3-4a5f-952b-9b3d5d5c8642-combined-ca-bundle\") pod \"5e102633-92b3-4a5f-952b-9b3d5d5c8642\" (UID: \"5e102633-92b3-4a5f-952b-9b3d5d5c8642\") " Feb 20 10:14:23 crc kubenswrapper[4962]: I0220 10:14:23.549686 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e102633-92b3-4a5f-952b-9b3d5d5c8642-logs" (OuterVolumeSpecName: "logs") pod "5e102633-92b3-4a5f-952b-9b3d5d5c8642" (UID: "5e102633-92b3-4a5f-952b-9b3d5d5c8642"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:14:23 crc kubenswrapper[4962]: I0220 10:14:23.550454 4962 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5e102633-92b3-4a5f-952b-9b3d5d5c8642-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:23 crc kubenswrapper[4962]: I0220 10:14:23.550488 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e102633-92b3-4a5f-952b-9b3d5d5c8642-logs\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:23 crc kubenswrapper[4962]: I0220 10:14:23.556898 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "5e102633-92b3-4a5f-952b-9b3d5d5c8642" (UID: "5e102633-92b3-4a5f-952b-9b3d5d5c8642"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 20 10:14:23 crc kubenswrapper[4962]: I0220 10:14:23.568592 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e102633-92b3-4a5f-952b-9b3d5d5c8642-kube-api-access-dv9jw" (OuterVolumeSpecName: "kube-api-access-dv9jw") pod "5e102633-92b3-4a5f-952b-9b3d5d5c8642" (UID: "5e102633-92b3-4a5f-952b-9b3d5d5c8642"). InnerVolumeSpecName "kube-api-access-dv9jw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:14:23 crc kubenswrapper[4962]: I0220 10:14:23.569344 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e102633-92b3-4a5f-952b-9b3d5d5c8642-scripts" (OuterVolumeSpecName: "scripts") pod "5e102633-92b3-4a5f-952b-9b3d5d5c8642" (UID: "5e102633-92b3-4a5f-952b-9b3d5d5c8642"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:14:23 crc kubenswrapper[4962]: I0220 10:14:23.606002 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e102633-92b3-4a5f-952b-9b3d5d5c8642-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5e102633-92b3-4a5f-952b-9b3d5d5c8642" (UID: "5e102633-92b3-4a5f-952b-9b3d5d5c8642"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:14:23 crc kubenswrapper[4962]: I0220 10:14:23.621114 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e102633-92b3-4a5f-952b-9b3d5d5c8642-config-data" (OuterVolumeSpecName: "config-data") pod "5e102633-92b3-4a5f-952b-9b3d5d5c8642" (UID: "5e102633-92b3-4a5f-952b-9b3d5d5c8642"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:14:23 crc kubenswrapper[4962]: I0220 10:14:23.632303 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e102633-92b3-4a5f-952b-9b3d5d5c8642-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "5e102633-92b3-4a5f-952b-9b3d5d5c8642" (UID: "5e102633-92b3-4a5f-952b-9b3d5d5c8642"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:14:23 crc kubenswrapper[4962]: I0220 10:14:23.652970 4962 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e102633-92b3-4a5f-952b-9b3d5d5c8642-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:23 crc kubenswrapper[4962]: I0220 10:14:23.653013 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e102633-92b3-4a5f-952b-9b3d5d5c8642-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:23 crc kubenswrapper[4962]: I0220 10:14:23.653071 4962 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Feb 20 10:14:23 crc kubenswrapper[4962]: I0220 10:14:23.653086 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e102633-92b3-4a5f-952b-9b3d5d5c8642-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:23 crc kubenswrapper[4962]: I0220 10:14:23.653106 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dv9jw\" (UniqueName: \"kubernetes.io/projected/5e102633-92b3-4a5f-952b-9b3d5d5c8642-kube-api-access-dv9jw\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:23 crc kubenswrapper[4962]: I0220 10:14:23.653129 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e102633-92b3-4a5f-952b-9b3d5d5c8642-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:23 crc kubenswrapper[4962]: I0220 10:14:23.674551 4962 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Feb 20 10:14:23 crc kubenswrapper[4962]: I0220 10:14:23.755133 4962 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:23 crc kubenswrapper[4962]: I0220 10:14:23.873449 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-external-api-0" event={"ID":"5e102633-92b3-4a5f-952b-9b3d5d5c8642","Type":"ContainerDied","Data":"115b83326f93f543740443d2e3a49f92750a0b5b654fc8aa7d5dccbed06506d9"} Feb 20 10:14:23 crc kubenswrapper[4962]: I0220 10:14:23.873830 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 20 10:14:23 crc kubenswrapper[4962]: I0220 10:14:23.917153 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 10:14:23 crc kubenswrapper[4962]: I0220 10:14:23.925520 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 10:14:23 crc kubenswrapper[4962]: I0220 10:14:23.942139 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 10:14:23 crc kubenswrapper[4962]: E0220 10:14:23.942641 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e102633-92b3-4a5f-952b-9b3d5d5c8642" containerName="glance-httpd" Feb 20 10:14:23 crc kubenswrapper[4962]: I0220 10:14:23.942664 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e102633-92b3-4a5f-952b-9b3d5d5c8642" containerName="glance-httpd" Feb 20 10:14:23 crc kubenswrapper[4962]: E0220 10:14:23.942692 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e102633-92b3-4a5f-952b-9b3d5d5c8642" containerName="glance-log" Feb 20 10:14:23 crc kubenswrapper[4962]: I0220 10:14:23.942699 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e102633-92b3-4a5f-952b-9b3d5d5c8642" containerName="glance-log" Feb 20 10:14:23 crc kubenswrapper[4962]: I0220 10:14:23.942884 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e102633-92b3-4a5f-952b-9b3d5d5c8642" containerName="glance-log" Feb 20 10:14:23 crc kubenswrapper[4962]: I0220 10:14:23.942913 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e102633-92b3-4a5f-952b-9b3d5d5c8642" containerName="glance-httpd" Feb 20 10:14:23 crc kubenswrapper[4962]: I0220 10:14:23.944167 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 20 10:14:23 crc kubenswrapper[4962]: I0220 10:14:23.947789 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 20 10:14:23 crc kubenswrapper[4962]: I0220 10:14:23.948010 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 20 10:14:23 crc kubenswrapper[4962]: I0220 10:14:23.970021 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 10:14:24 crc kubenswrapper[4962]: I0220 10:14:24.062257 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eb8bed08-cb47-42cb-a192-2545a14e4c4b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"eb8bed08-cb47-42cb-a192-2545a14e4c4b\") " pod="openstack/glance-default-external-api-0" Feb 20 10:14:24 crc kubenswrapper[4962]: I0220 10:14:24.062334 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb8bed08-cb47-42cb-a192-2545a14e4c4b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"eb8bed08-cb47-42cb-a192-2545a14e4c4b\") " pod="openstack/glance-default-external-api-0" Feb 20 10:14:24 crc kubenswrapper[4962]: I0220 10:14:24.062378 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"eb8bed08-cb47-42cb-a192-2545a14e4c4b\") " pod="openstack/glance-default-external-api-0" Feb 20 10:14:24 crc kubenswrapper[4962]: I0220 10:14:24.062396 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqshj\" (UniqueName: \"kubernetes.io/projected/eb8bed08-cb47-42cb-a192-2545a14e4c4b-kube-api-access-xqshj\") pod \"glance-default-external-api-0\" (UID: \"eb8bed08-cb47-42cb-a192-2545a14e4c4b\") " pod="openstack/glance-default-external-api-0" Feb 20 10:14:24 crc kubenswrapper[4962]: I0220 10:14:24.062438 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb8bed08-cb47-42cb-a192-2545a14e4c4b-logs\") pod \"glance-default-external-api-0\" (UID: \"eb8bed08-cb47-42cb-a192-2545a14e4c4b\") " pod="openstack/glance-default-external-api-0" Feb 20 10:14:24 crc kubenswrapper[4962]: I0220 10:14:24.062459 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb8bed08-cb47-42cb-a192-2545a14e4c4b-config-data\") pod \"glance-default-external-api-0\" (UID: \"eb8bed08-cb47-42cb-a192-2545a14e4c4b\") " pod="openstack/glance-default-external-api-0" Feb 20 10:14:24 crc kubenswrapper[4962]: I0220 10:14:24.062533 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb8bed08-cb47-42cb-a192-2545a14e4c4b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"eb8bed08-cb47-42cb-a192-2545a14e4c4b\") " pod="openstack/glance-default-external-api-0" Feb 20 10:14:24 crc kubenswrapper[4962]: I0220 10:14:24.062559 4962 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb8bed08-cb47-42cb-a192-2545a14e4c4b-scripts\") pod \"glance-default-external-api-0\" (UID: \"eb8bed08-cb47-42cb-a192-2545a14e4c4b\") " pod="openstack/glance-default-external-api-0" Feb 20 10:14:24 crc kubenswrapper[4962]: I0220 10:14:24.164970 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eb8bed08-cb47-42cb-a192-2545a14e4c4b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"eb8bed08-cb47-42cb-a192-2545a14e4c4b\") " pod="openstack/glance-default-external-api-0" Feb 20 10:14:24 crc kubenswrapper[4962]: I0220 10:14:24.165083 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb8bed08-cb47-42cb-a192-2545a14e4c4b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"eb8bed08-cb47-42cb-a192-2545a14e4c4b\") " pod="openstack/glance-default-external-api-0" Feb 20 10:14:24 crc kubenswrapper[4962]: I0220 10:14:24.165117 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"eb8bed08-cb47-42cb-a192-2545a14e4c4b\") " pod="openstack/glance-default-external-api-0" Feb 20 10:14:24 crc kubenswrapper[4962]: I0220 10:14:24.165440 4962 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"eb8bed08-cb47-42cb-a192-2545a14e4c4b\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-external-api-0" Feb 20 10:14:24 crc kubenswrapper[4962]: I0220 10:14:24.165804 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eb8bed08-cb47-42cb-a192-2545a14e4c4b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"eb8bed08-cb47-42cb-a192-2545a14e4c4b\") " pod="openstack/glance-default-external-api-0" Feb 20 10:14:24 crc kubenswrapper[4962]: I0220 10:14:24.165971 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqshj\" (UniqueName: \"kubernetes.io/projected/eb8bed08-cb47-42cb-a192-2545a14e4c4b-kube-api-access-xqshj\") pod \"glance-default-external-api-0\" (UID: \"eb8bed08-cb47-42cb-a192-2545a14e4c4b\") " pod="openstack/glance-default-external-api-0" Feb 20 10:14:24 crc kubenswrapper[4962]: I0220 10:14:24.166020 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb8bed08-cb47-42cb-a192-2545a14e4c4b-logs\") pod \"glance-default-external-api-0\" (UID: \"eb8bed08-cb47-42cb-a192-2545a14e4c4b\") " pod="openstack/glance-default-external-api-0" Feb 20 10:14:24 crc kubenswrapper[4962]: I0220 10:14:24.166040 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb8bed08-cb47-42cb-a192-2545a14e4c4b-config-data\") pod \"glance-default-external-api-0\" (UID: \"eb8bed08-cb47-42cb-a192-2545a14e4c4b\") " pod="openstack/glance-default-external-api-0" Feb 20 10:14:24 crc kubenswrapper[4962]: I0220 10:14:24.166474 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/eb8bed08-cb47-42cb-a192-2545a14e4c4b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"eb8bed08-cb47-42cb-a192-2545a14e4c4b\") " pod="openstack/glance-default-external-api-0" Feb 20 10:14:24 crc kubenswrapper[4962]: I0220 10:14:24.166761 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb8bed08-cb47-42cb-a192-2545a14e4c4b-logs\") pod \"glance-default-external-api-0\" (UID: \"eb8bed08-cb47-42cb-a192-2545a14e4c4b\") " pod="openstack/glance-default-external-api-0" Feb 20 10:14:24 crc kubenswrapper[4962]: I0220 10:14:24.166922 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb8bed08-cb47-42cb-a192-2545a14e4c4b-scripts\") pod \"glance-default-external-api-0\" (UID: \"eb8bed08-cb47-42cb-a192-2545a14e4c4b\") " pod="openstack/glance-default-external-api-0" Feb 20 10:14:24 crc kubenswrapper[4962]: I0220 10:14:24.171322 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb8bed08-cb47-42cb-a192-2545a14e4c4b-config-data\") pod \"glance-default-external-api-0\" (UID: \"eb8bed08-cb47-42cb-a192-2545a14e4c4b\") " pod="openstack/glance-default-external-api-0" Feb 20 10:14:24 crc kubenswrapper[4962]: I0220 10:14:24.173271 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb8bed08-cb47-42cb-a192-2545a14e4c4b-scripts\") pod \"glance-default-external-api-0\" (UID: \"eb8bed08-cb47-42cb-a192-2545a14e4c4b\") " pod="openstack/glance-default-external-api-0" Feb 20 10:14:24 crc kubenswrapper[4962]: I0220 10:14:24.175441 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb8bed08-cb47-42cb-a192-2545a14e4c4b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"eb8bed08-cb47-42cb-a192-2545a14e4c4b\") " pod="openstack/glance-default-external-api-0" Feb 20 10:14:24 crc kubenswrapper[4962]: I0220 10:14:24.177345 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb8bed08-cb47-42cb-a192-2545a14e4c4b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"eb8bed08-cb47-42cb-a192-2545a14e4c4b\") " pod="openstack/glance-default-external-api-0" Feb 20 10:14:24 crc kubenswrapper[4962]: I0220 10:14:24.183043 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqshj\" (UniqueName: \"kubernetes.io/projected/eb8bed08-cb47-42cb-a192-2545a14e4c4b-kube-api-access-xqshj\") pod \"glance-default-external-api-0\" (UID: \"eb8bed08-cb47-42cb-a192-2545a14e4c4b\") " pod="openstack/glance-default-external-api-0" Feb 20 10:14:24 crc kubenswrapper[4962]: I0220 10:14:24.197712 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"eb8bed08-cb47-42cb-a192-2545a14e4c4b\") " pod="openstack/glance-default-external-api-0" Feb 20 10:14:24 crc kubenswrapper[4962]: I0220 10:14:24.270826 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 20 10:14:25 crc kubenswrapper[4962]: I0220 10:14:25.161452 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e102633-92b3-4a5f-952b-9b3d5d5c8642" path="/var/lib/kubelet/pods/5e102633-92b3-4a5f-952b-9b3d5d5c8642/volumes" Feb 20 10:14:26 crc kubenswrapper[4962]: I0220 10:14:26.831396 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-69577ff67f-kvhqf" podUID="90cdf678-dd6c-4f3b-a675-4803eddcfc44" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.129:5353: connect: connection refused" Feb 20 10:14:29 crc kubenswrapper[4962]: I0220 10:14:29.953619 4962 generic.go:334] "Generic (PLEG): container finished" podID="6b114dbd-1f72-42c9-97c1-43795d1cf1ea" containerID="c4884098169c655124365602e35fd187fa28c946c8e4d3fb080909fa29ad7ae0" exitCode=0 Feb 20 10:14:29 crc kubenswrapper[4962]: I0220 10:14:29.953720 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-v7sjh" event={"ID":"6b114dbd-1f72-42c9-97c1-43795d1cf1ea","Type":"ContainerDied","Data":"c4884098169c655124365602e35fd187fa28c946c8e4d3fb080909fa29ad7ae0"} Feb 20 10:14:31 crc kubenswrapper[4962]: I0220 10:14:31.829662 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-69577ff67f-kvhqf" podUID="90cdf678-dd6c-4f3b-a675-4803eddcfc44" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.129:5353: connect: connection refused" Feb 20 10:14:31 crc kubenswrapper[4962]: I0220 10:14:31.830349 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-69577ff67f-kvhqf" Feb 20 10:14:32 crc kubenswrapper[4962]: E0220 10:14:32.787793 4962 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:a5f8855b2ed00a661ac827cc3908e540ed2327354ac5a1d39491f4507237b4ec" Feb 20 10:14:32 crc kubenswrapper[4962]: E0220 10:14:32.788569 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:a5f8855b2ed00a661ac827cc3908e540ed2327354ac5a1d39491f4507237b4ec,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wpd4w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-smcqr_openstack(d970dac6-1948-42dd-b5d9-c5df1b04e30d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 20 10:14:32 crc kubenswrapper[4962]: E0220 10:14:32.790029 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-smcqr" podUID="d970dac6-1948-42dd-b5d9-c5df1b04e30d" Feb 20 10:14:33 crc kubenswrapper[4962]: E0220 10:14:33.018811 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:a5f8855b2ed00a661ac827cc3908e540ed2327354ac5a1d39491f4507237b4ec\\\"\"" pod="openstack/barbican-db-sync-smcqr" podUID="d970dac6-1948-42dd-b5d9-c5df1b04e30d" Feb 20 10:14:33 crc kubenswrapper[4962]: E0220 10:14:33.314773 4962 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central@sha256:5ac8ede62671a3b3695cf29bd3a6f124f27c93d1730f9030cc3daa05034d4af4" Feb 20 10:14:33 crc kubenswrapper[4962]: E0220 10:14:33.315004 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central@sha256:5ac8ede62671a3b3695cf29bd3a6f124f27c93d1730f9030cc3daa05034d4af4,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n55dh556h5c8h587h7bh669h54bh66fh5bdh679h5bdh68dhcdh85h5dfh694h5cfh54h58ch646h77h5b9h5bh6bhb4h5cfh666h559h687h555h68ch5dcq,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bdggc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 /var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(f95ae2eb-8d20-4549-896d-e6991bfd1e06): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 20 10:14:33 crc kubenswrapper[4962]: I0220 10:14:33.525995 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-v7sjh" Feb 20 10:14:33 crc kubenswrapper[4962]: I0220 10:14:33.576711 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b114dbd-1f72-42c9-97c1-43795d1cf1ea-combined-ca-bundle\") pod \"6b114dbd-1f72-42c9-97c1-43795d1cf1ea\" (UID: \"6b114dbd-1f72-42c9-97c1-43795d1cf1ea\") " Feb 20 10:14:33 crc kubenswrapper[4962]: I0220 10:14:33.576925 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jjsq\" (UniqueName: \"kubernetes.io/projected/6b114dbd-1f72-42c9-97c1-43795d1cf1ea-kube-api-access-9jjsq\") pod \"6b114dbd-1f72-42c9-97c1-43795d1cf1ea\" (UID: \"6b114dbd-1f72-42c9-97c1-43795d1cf1ea\") " Feb 20 10:14:33 crc kubenswrapper[4962]: I0220 10:14:33.576969 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6b114dbd-1f72-42c9-97c1-43795d1cf1ea-config\") pod \"6b114dbd-1f72-42c9-97c1-43795d1cf1ea\" (UID: \"6b114dbd-1f72-42c9-97c1-43795d1cf1ea\") " Feb 20 10:14:33 crc kubenswrapper[4962]: I0220 10:14:33.591223 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b114dbd-1f72-42c9-97c1-43795d1cf1ea-kube-api-access-9jjsq" (OuterVolumeSpecName: "kube-api-access-9jjsq") pod "6b114dbd-1f72-42c9-97c1-43795d1cf1ea" (UID: "6b114dbd-1f72-42c9-97c1-43795d1cf1ea"). InnerVolumeSpecName "kube-api-access-9jjsq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:14:33 crc kubenswrapper[4962]: I0220 10:14:33.609302 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b114dbd-1f72-42c9-97c1-43795d1cf1ea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6b114dbd-1f72-42c9-97c1-43795d1cf1ea" (UID: "6b114dbd-1f72-42c9-97c1-43795d1cf1ea"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:14:33 crc kubenswrapper[4962]: I0220 10:14:33.614410 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b114dbd-1f72-42c9-97c1-43795d1cf1ea-config" (OuterVolumeSpecName: "config") pod "6b114dbd-1f72-42c9-97c1-43795d1cf1ea" (UID: "6b114dbd-1f72-42c9-97c1-43795d1cf1ea"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:14:33 crc kubenswrapper[4962]: I0220 10:14:33.678305 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b114dbd-1f72-42c9-97c1-43795d1cf1ea-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:33 crc kubenswrapper[4962]: I0220 10:14:33.678354 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jjsq\" (UniqueName: \"kubernetes.io/projected/6b114dbd-1f72-42c9-97c1-43795d1cf1ea-kube-api-access-9jjsq\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:33 crc kubenswrapper[4962]: I0220 10:14:33.678370 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/6b114dbd-1f72-42c9-97c1-43795d1cf1ea-config\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:34 crc kubenswrapper[4962]: I0220 10:14:34.021430 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-v7sjh" event={"ID":"6b114dbd-1f72-42c9-97c1-43795d1cf1ea","Type":"ContainerDied","Data":"dce69904baa73ecb44c9d47c1bcf4bebcf4df75f9166693b47df46309236c541"} Feb 20 10:14:34 crc kubenswrapper[4962]: I0220 10:14:34.021542 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dce69904baa73ecb44c9d47c1bcf4bebcf4df75f9166693b47df46309236c541" Feb 20 10:14:34 crc kubenswrapper[4962]: I0220 10:14:34.021692 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-v7sjh" Feb 20 10:14:34 crc kubenswrapper[4962]: I0220 10:14:34.862287 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-77d55b9c69-9hhv4"] Feb 20 10:14:34 crc kubenswrapper[4962]: E0220 10:14:34.863102 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b114dbd-1f72-42c9-97c1-43795d1cf1ea" containerName="neutron-db-sync" Feb 20 10:14:34 crc kubenswrapper[4962]: I0220 10:14:34.863123 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b114dbd-1f72-42c9-97c1-43795d1cf1ea" containerName="neutron-db-sync" Feb 20 10:14:34 crc kubenswrapper[4962]: I0220 10:14:34.863331 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b114dbd-1f72-42c9-97c1-43795d1cf1ea" containerName="neutron-db-sync" Feb 20 10:14:34 crc kubenswrapper[4962]: I0220 10:14:34.864327 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77d55b9c69-9hhv4" Feb 20 10:14:34 crc kubenswrapper[4962]: I0220 10:14:34.907993 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54637a9a-7f3e-439e-adf0-ba5b33a539d3-config\") pod \"dnsmasq-dns-77d55b9c69-9hhv4\" (UID: \"54637a9a-7f3e-439e-adf0-ba5b33a539d3\") " pod="openstack/dnsmasq-dns-77d55b9c69-9hhv4" Feb 20 10:14:34 crc kubenswrapper[4962]: I0220 10:14:34.908055 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/54637a9a-7f3e-439e-adf0-ba5b33a539d3-dns-swift-storage-0\") pod \"dnsmasq-dns-77d55b9c69-9hhv4\" (UID: \"54637a9a-7f3e-439e-adf0-ba5b33a539d3\") " pod="openstack/dnsmasq-dns-77d55b9c69-9hhv4" Feb 20 10:14:34 crc kubenswrapper[4962]: I0220 10:14:34.908086 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/54637a9a-7f3e-439e-adf0-ba5b33a539d3-dns-svc\") pod \"dnsmasq-dns-77d55b9c69-9hhv4\" (UID: \"54637a9a-7f3e-439e-adf0-ba5b33a539d3\") " pod="openstack/dnsmasq-dns-77d55b9c69-9hhv4" Feb 20 10:14:34 crc kubenswrapper[4962]: I0220 10:14:34.908117 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/54637a9a-7f3e-439e-adf0-ba5b33a539d3-ovsdbserver-sb\") pod \"dnsmasq-dns-77d55b9c69-9hhv4\" (UID: \"54637a9a-7f3e-439e-adf0-ba5b33a539d3\") " pod="openstack/dnsmasq-dns-77d55b9c69-9hhv4" Feb 20 10:14:34 crc kubenswrapper[4962]: I0220 10:14:34.908134 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/54637a9a-7f3e-439e-adf0-ba5b33a539d3-ovsdbserver-nb\") pod \"dnsmasq-dns-77d55b9c69-9hhv4\" (UID: \"54637a9a-7f3e-439e-adf0-ba5b33a539d3\") " pod="openstack/dnsmasq-dns-77d55b9c69-9hhv4" Feb 20 10:14:34 crc kubenswrapper[4962]: I0220 10:14:34.908151 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w864r\" (UniqueName: \"kubernetes.io/projected/54637a9a-7f3e-439e-adf0-ba5b33a539d3-kube-api-access-w864r\") pod \"dnsmasq-dns-77d55b9c69-9hhv4\" (UID: \"54637a9a-7f3e-439e-adf0-ba5b33a539d3\") " pod="openstack/dnsmasq-dns-77d55b9c69-9hhv4" Feb 20 10:14:34 crc kubenswrapper[4962]: I0220 10:14:34.919242 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-ffdf447d4-qtmvr"] Feb 20 10:14:34 crc kubenswrapper[4962]: I0220 10:14:34.920812 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-ffdf447d4-qtmvr" Feb 20 10:14:34 crc kubenswrapper[4962]: I0220 10:14:34.927769 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 20 10:14:34 crc kubenswrapper[4962]: I0220 10:14:34.928208 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-bnxb6" Feb 20 10:14:34 crc kubenswrapper[4962]: I0220 10:14:34.928484 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Feb 20 10:14:34 crc kubenswrapper[4962]: I0220 10:14:34.930622 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 20 10:14:34 crc kubenswrapper[4962]: I0220 10:14:34.952576 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77d55b9c69-9hhv4"] Feb 20 10:14:34 crc kubenswrapper[4962]: I0220 10:14:34.962193 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-ffdf447d4-qtmvr"] Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.009075 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0d2f2cbd-e871-48b8-acf1-b84c9c2abb58-config\") pod \"neutron-ffdf447d4-qtmvr\" (UID: \"0d2f2cbd-e871-48b8-acf1-b84c9c2abb58\") " pod="openstack/neutron-ffdf447d4-qtmvr" Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.009143 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54637a9a-7f3e-439e-adf0-ba5b33a539d3-config\") pod \"dnsmasq-dns-77d55b9c69-9hhv4\" (UID: \"54637a9a-7f3e-439e-adf0-ba5b33a539d3\") " pod="openstack/dnsmasq-dns-77d55b9c69-9hhv4" Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.009170 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/54637a9a-7f3e-439e-adf0-ba5b33a539d3-dns-swift-storage-0\") pod \"dnsmasq-dns-77d55b9c69-9hhv4\" (UID: \"54637a9a-7f3e-439e-adf0-ba5b33a539d3\") " pod="openstack/dnsmasq-dns-77d55b9c69-9hhv4" Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.009193 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d2f2cbd-e871-48b8-acf1-b84c9c2abb58-combined-ca-bundle\") pod \"neutron-ffdf447d4-qtmvr\" (UID: \"0d2f2cbd-e871-48b8-acf1-b84c9c2abb58\") " pod="openstack/neutron-ffdf447d4-qtmvr" Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.009224 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/54637a9a-7f3e-439e-adf0-ba5b33a539d3-dns-svc\") pod \"dnsmasq-dns-77d55b9c69-9hhv4\" (UID: \"54637a9a-7f3e-439e-adf0-ba5b33a539d3\") " pod="openstack/dnsmasq-dns-77d55b9c69-9hhv4" Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.009255 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/54637a9a-7f3e-439e-adf0-ba5b33a539d3-ovsdbserver-sb\") pod \"dnsmasq-dns-77d55b9c69-9hhv4\" (UID: \"54637a9a-7f3e-439e-adf0-ba5b33a539d3\") " pod="openstack/dnsmasq-dns-77d55b9c69-9hhv4" Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.009277 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/54637a9a-7f3e-439e-adf0-ba5b33a539d3-ovsdbserver-nb\") pod \"dnsmasq-dns-77d55b9c69-9hhv4\" (UID: \"54637a9a-7f3e-439e-adf0-ba5b33a539d3\") " pod="openstack/dnsmasq-dns-77d55b9c69-9hhv4" Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.009302 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w864r\" (UniqueName: \"kubernetes.io/projected/54637a9a-7f3e-439e-adf0-ba5b33a539d3-kube-api-access-w864r\") pod \"dnsmasq-dns-77d55b9c69-9hhv4\" (UID: \"54637a9a-7f3e-439e-adf0-ba5b33a539d3\") " pod="openstack/dnsmasq-dns-77d55b9c69-9hhv4" Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.009326 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q54gm\" (UniqueName: \"kubernetes.io/projected/0d2f2cbd-e871-48b8-acf1-b84c9c2abb58-kube-api-access-q54gm\") pod \"neutron-ffdf447d4-qtmvr\" (UID: \"0d2f2cbd-e871-48b8-acf1-b84c9c2abb58\") " pod="openstack/neutron-ffdf447d4-qtmvr" Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.009352 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d2f2cbd-e871-48b8-acf1-b84c9c2abb58-ovndb-tls-certs\") pod \"neutron-ffdf447d4-qtmvr\" (UID: \"0d2f2cbd-e871-48b8-acf1-b84c9c2abb58\") " pod="openstack/neutron-ffdf447d4-qtmvr" Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.009380 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0d2f2cbd-e871-48b8-acf1-b84c9c2abb58-httpd-config\") pod \"neutron-ffdf447d4-qtmvr\" (UID: \"0d2f2cbd-e871-48b8-acf1-b84c9c2abb58\") " pod="openstack/neutron-ffdf447d4-qtmvr" Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.010237 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54637a9a-7f3e-439e-adf0-ba5b33a539d3-config\") pod \"dnsmasq-dns-77d55b9c69-9hhv4\" (UID: \"54637a9a-7f3e-439e-adf0-ba5b33a539d3\") " pod="openstack/dnsmasq-dns-77d55b9c69-9hhv4" Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.010387 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/54637a9a-7f3e-439e-adf0-ba5b33a539d3-dns-swift-storage-0\") pod \"dnsmasq-dns-77d55b9c69-9hhv4\" (UID: \"54637a9a-7f3e-439e-adf0-ba5b33a539d3\") " pod="openstack/dnsmasq-dns-77d55b9c69-9hhv4" Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.012163 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/54637a9a-7f3e-439e-adf0-ba5b33a539d3-ovsdbserver-sb\") pod \"dnsmasq-dns-77d55b9c69-9hhv4\" (UID: \"54637a9a-7f3e-439e-adf0-ba5b33a539d3\") " pod="openstack/dnsmasq-dns-77d55b9c69-9hhv4" Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.012222 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/54637a9a-7f3e-439e-adf0-ba5b33a539d3-dns-svc\") pod \"dnsmasq-dns-77d55b9c69-9hhv4\" (UID: \"54637a9a-7f3e-439e-adf0-ba5b33a539d3\") " pod="openstack/dnsmasq-dns-77d55b9c69-9hhv4" Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.012931 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/54637a9a-7f3e-439e-adf0-ba5b33a539d3-ovsdbserver-nb\") pod \"dnsmasq-dns-77d55b9c69-9hhv4\" (UID: \"54637a9a-7f3e-439e-adf0-ba5b33a539d3\") " pod="openstack/dnsmasq-dns-77d55b9c69-9hhv4" Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.048293 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w864r\" (UniqueName: \"kubernetes.io/projected/54637a9a-7f3e-439e-adf0-ba5b33a539d3-kube-api-access-w864r\") pod \"dnsmasq-dns-77d55b9c69-9hhv4\" (UID: \"54637a9a-7f3e-439e-adf0-ba5b33a539d3\") " pod="openstack/dnsmasq-dns-77d55b9c69-9hhv4" Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.111928 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q54gm\" (UniqueName: \"kubernetes.io/projected/0d2f2cbd-e871-48b8-acf1-b84c9c2abb58-kube-api-access-q54gm\") pod \"neutron-ffdf447d4-qtmvr\" (UID: \"0d2f2cbd-e871-48b8-acf1-b84c9c2abb58\") " pod="openstack/neutron-ffdf447d4-qtmvr" Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.111996 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d2f2cbd-e871-48b8-acf1-b84c9c2abb58-ovndb-tls-certs\") pod \"neutron-ffdf447d4-qtmvr\" (UID: \"0d2f2cbd-e871-48b8-acf1-b84c9c2abb58\") " pod="openstack/neutron-ffdf447d4-qtmvr" Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.112032 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0d2f2cbd-e871-48b8-acf1-b84c9c2abb58-httpd-config\") pod \"neutron-ffdf447d4-qtmvr\" (UID: \"0d2f2cbd-e871-48b8-acf1-b84c9c2abb58\") " pod="openstack/neutron-ffdf447d4-qtmvr" Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.112158 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0d2f2cbd-e871-48b8-acf1-b84c9c2abb58-config\") pod \"neutron-ffdf447d4-qtmvr\" (UID: \"0d2f2cbd-e871-48b8-acf1-b84c9c2abb58\") " pod="openstack/neutron-ffdf447d4-qtmvr" Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.112207 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d2f2cbd-e871-48b8-acf1-b84c9c2abb58-combined-ca-bundle\") pod \"neutron-ffdf447d4-qtmvr\" (UID: \"0d2f2cbd-e871-48b8-acf1-b84c9c2abb58\") " pod="openstack/neutron-ffdf447d4-qtmvr" Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.118394 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d2f2cbd-e871-48b8-acf1-b84c9c2abb58-combined-ca-bundle\") pod \"neutron-ffdf447d4-qtmvr\" (UID: \"0d2f2cbd-e871-48b8-acf1-b84c9c2abb58\") " pod="openstack/neutron-ffdf447d4-qtmvr" Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.118646 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/0d2f2cbd-e871-48b8-acf1-b84c9c2abb58-config\") pod \"neutron-ffdf447d4-qtmvr\" (UID: \"0d2f2cbd-e871-48b8-acf1-b84c9c2abb58\") " pod="openstack/neutron-ffdf447d4-qtmvr" Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.119025 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0d2f2cbd-e871-48b8-acf1-b84c9c2abb58-httpd-config\") pod \"neutron-ffdf447d4-qtmvr\" (UID: \"0d2f2cbd-e871-48b8-acf1-b84c9c2abb58\") " 
pod="openstack/neutron-ffdf447d4-qtmvr" Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.129780 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d2f2cbd-e871-48b8-acf1-b84c9c2abb58-ovndb-tls-certs\") pod \"neutron-ffdf447d4-qtmvr\" (UID: \"0d2f2cbd-e871-48b8-acf1-b84c9c2abb58\") " pod="openstack/neutron-ffdf447d4-qtmvr" Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.130386 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q54gm\" (UniqueName: \"kubernetes.io/projected/0d2f2cbd-e871-48b8-acf1-b84c9c2abb58-kube-api-access-q54gm\") pod \"neutron-ffdf447d4-qtmvr\" (UID: \"0d2f2cbd-e871-48b8-acf1-b84c9c2abb58\") " pod="openstack/neutron-ffdf447d4-qtmvr" Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.200549 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77d55b9c69-9hhv4" Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.253971 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-ffdf447d4-qtmvr" Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.436697 4962 scope.go:117] "RemoveContainer" containerID="e1e172d0057e84b49e9b441fa013beb45918149a2c5015cdbce1606c11ce1b14" Feb 20 10:14:35 crc kubenswrapper[4962]: E0220 10:14:35.458460 4962 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:3fa6e687aa002b92fedbfe2c1ccaa2906b399c58d17bf9ecece2c4cd69a0210b" Feb 20 10:14:35 crc kubenswrapper[4962]: E0220 10:14:35.458765 4962 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:3fa6e687aa002b92fedbfe2c1ccaa2906b399c58d17bf9ecece2c4cd69a0210b,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-frwwh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-s4qgr_openstack(14c237ea-eb42-49d4-90db-ee57e3b560e3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 20 10:14:35 crc kubenswrapper[4962]: E0220 10:14:35.460526 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-s4qgr" podUID="14c237ea-eb42-49d4-90db-ee57e3b560e3" Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.558932 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-69577ff67f-kvhqf" Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.588225 4962 scope.go:117] "RemoveContainer" containerID="2123a048dc318acb29d67e4388556f6e325d9c8f02443300d2653d8afdf953b1" Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.678920 4962 scope.go:117] "RemoveContainer" containerID="30e065f1b9b01eb8f421805f853435dbc45f2b1b158fe7f5032f2d843493ba72" Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.727467 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/90cdf678-dd6c-4f3b-a675-4803eddcfc44-ovsdbserver-sb\") pod \"90cdf678-dd6c-4f3b-a675-4803eddcfc44\" (UID: \"90cdf678-dd6c-4f3b-a675-4803eddcfc44\") " Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.727544 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wfvtg\" (UniqueName: \"kubernetes.io/projected/90cdf678-dd6c-4f3b-a675-4803eddcfc44-kube-api-access-wfvtg\") pod \"90cdf678-dd6c-4f3b-a675-4803eddcfc44\" (UID: \"90cdf678-dd6c-4f3b-a675-4803eddcfc44\") " Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.727584 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/90cdf678-dd6c-4f3b-a675-4803eddcfc44-dns-swift-storage-0\") pod \"90cdf678-dd6c-4f3b-a675-4803eddcfc44\" (UID: \"90cdf678-dd6c-4f3b-a675-4803eddcfc44\") " Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.727619 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90cdf678-dd6c-4f3b-a675-4803eddcfc44-config\") pod \"90cdf678-dd6c-4f3b-a675-4803eddcfc44\" (UID: \"90cdf678-dd6c-4f3b-a675-4803eddcfc44\") " Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.727686 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/90cdf678-dd6c-4f3b-a675-4803eddcfc44-ovsdbserver-nb\") pod \"90cdf678-dd6c-4f3b-a675-4803eddcfc44\" (UID: \"90cdf678-dd6c-4f3b-a675-4803eddcfc44\") " Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.727802 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90cdf678-dd6c-4f3b-a675-4803eddcfc44-dns-svc\") pod \"90cdf678-dd6c-4f3b-a675-4803eddcfc44\" (UID: \"90cdf678-dd6c-4f3b-a675-4803eddcfc44\") " Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.759911 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90cdf678-dd6c-4f3b-a675-4803eddcfc44-kube-api-access-wfvtg" (OuterVolumeSpecName: "kube-api-access-wfvtg") pod "90cdf678-dd6c-4f3b-a675-4803eddcfc44" (UID: "90cdf678-dd6c-4f3b-a675-4803eddcfc44"). InnerVolumeSpecName "kube-api-access-wfvtg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.807815 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90cdf678-dd6c-4f3b-a675-4803eddcfc44-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "90cdf678-dd6c-4f3b-a675-4803eddcfc44" (UID: "90cdf678-dd6c-4f3b-a675-4803eddcfc44"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.831573 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wfvtg\" (UniqueName: \"kubernetes.io/projected/90cdf678-dd6c-4f3b-a675-4803eddcfc44-kube-api-access-wfvtg\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.831622 4962 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90cdf678-dd6c-4f3b-a675-4803eddcfc44-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.863450 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90cdf678-dd6c-4f3b-a675-4803eddcfc44-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "90cdf678-dd6c-4f3b-a675-4803eddcfc44" (UID: "90cdf678-dd6c-4f3b-a675-4803eddcfc44"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.870218 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90cdf678-dd6c-4f3b-a675-4803eddcfc44-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "90cdf678-dd6c-4f3b-a675-4803eddcfc44" (UID: "90cdf678-dd6c-4f3b-a675-4803eddcfc44"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.875806 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90cdf678-dd6c-4f3b-a675-4803eddcfc44-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "90cdf678-dd6c-4f3b-a675-4803eddcfc44" (UID: "90cdf678-dd6c-4f3b-a675-4803eddcfc44"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.907181 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90cdf678-dd6c-4f3b-a675-4803eddcfc44-config" (OuterVolumeSpecName: "config") pod "90cdf678-dd6c-4f3b-a675-4803eddcfc44" (UID: "90cdf678-dd6c-4f3b-a675-4803eddcfc44"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.937208 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/90cdf678-dd6c-4f3b-a675-4803eddcfc44-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.937246 4962 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/90cdf678-dd6c-4f3b-a675-4803eddcfc44-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.937267 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90cdf678-dd6c-4f3b-a675-4803eddcfc44-config\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:35 crc kubenswrapper[4962]: I0220 10:14:35.937280 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/90cdf678-dd6c-4f3b-a675-4803eddcfc44-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:36 crc kubenswrapper[4962]: I0220 10:14:36.010819 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-r4hdf"] Feb 20 10:14:36 crc kubenswrapper[4962]: I0220 10:14:36.059380 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-mk67n" event={"ID":"97e25820-62eb-4ad9-92ad-471c2f0f7ed4","Type":"ContainerStarted","Data":"6f635f1f56319fca1af13c4d65bb4a7c7d012f95348309539e66bb9bc3885680"} Feb 20 10:14:36 crc kubenswrapper[4962]: I0220 10:14:36.068022 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69577ff67f-kvhqf" event={"ID":"90cdf678-dd6c-4f3b-a675-4803eddcfc44","Type":"ContainerDied","Data":"83c0be5d3f4ec5fde60bc996f7952df21d1276689d0ec4493b4ba2dd90aa2879"} Feb 20 10:14:36 crc kubenswrapper[4962]: I0220 10:14:36.068077 4962 scope.go:117] "RemoveContainer" containerID="e8217d8b6999873bb7898fa52e25d272e12c4c8cfaefad4b4afe6a4bca8b3bab" Feb 20 10:14:36 crc kubenswrapper[4962]: I0220 10:14:36.068206 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-69577ff67f-kvhqf" Feb 20 10:14:36 crc kubenswrapper[4962]: I0220 10:14:36.078743 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-r4hdf" event={"ID":"37eccece-549c-4b2f-b066-481b216d7ece","Type":"ContainerStarted","Data":"110ade6ee18c3222118e3294958d83a3df26c7aeb2099802b229b066aa852315"} Feb 20 10:14:36 crc kubenswrapper[4962]: E0220 10:14:36.089825 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:3fa6e687aa002b92fedbfe2c1ccaa2906b399c58d17bf9ecece2c4cd69a0210b\\\"\"" pod="openstack/cinder-db-sync-s4qgr" podUID="14c237ea-eb42-49d4-90db-ee57e3b560e3" Feb 20 10:14:36 crc kubenswrapper[4962]: I0220 10:14:36.176757 4962 scope.go:117] "RemoveContainer" containerID="c3f233006bdf1d16d8946733067213908be75ed885abe76b0ea0e53fac4b17ed" Feb 20 10:14:36 crc kubenswrapper[4962]: I0220 10:14:36.204540 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-mk67n" podStartSLOduration=4.569928049 podStartE2EDuration="26.204495078s" podCreationTimestamp="2026-02-20 10:14:10 +0000 UTC" firstStartedPulling="2026-02-20 10:14:11.668057516 +0000 UTC m=+1143.250529362" lastFinishedPulling="2026-02-20 10:14:33.302624505 +0000 UTC m=+1164.885096391" observedRunningTime="2026-02-20 10:14:36.127101851 +0000 UTC m=+1167.709573697" watchObservedRunningTime="2026-02-20 10:14:36.204495078 +0000 UTC m=+1167.786966924" Feb 20 10:14:36 crc kubenswrapper[4962]: I0220 10:14:36.288684 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-69577ff67f-kvhqf"] Feb 20 10:14:36 crc kubenswrapper[4962]: I0220 10:14:36.321191 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-69577ff67f-kvhqf"] Feb 20 10:14:36 crc kubenswrapper[4962]: I0220 10:14:36.349404 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77d55b9c69-9hhv4"] Feb 20 10:14:36 crc kubenswrapper[4962]: I0220 10:14:36.571929 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 20 10:14:36 crc kubenswrapper[4962]: I0220 10:14:36.709967 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-ffdf447d4-qtmvr"] Feb 20 10:14:37 crc kubenswrapper[4962]: I0220 10:14:37.128238 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-r4hdf" event={"ID":"37eccece-549c-4b2f-b066-481b216d7ece","Type":"ContainerStarted","Data":"f9cb69ce2f5869e2d5aa8f13c96033f3ed4a62ca0344285f07875e14d0de4351"} Feb 20 10:14:37 crc kubenswrapper[4962]: I0220 10:14:37.136018 4962 generic.go:334] "Generic (PLEG): container finished" podID="54637a9a-7f3e-439e-adf0-ba5b33a539d3" containerID="ed98f94bfdead02504eed65ca99332ec23131a2035d01ac3b9510e63c1aa80cd" exitCode=0 Feb 20 10:14:37 crc kubenswrapper[4962]: I0220 10:14:37.136109 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77d55b9c69-9hhv4" event={"ID":"54637a9a-7f3e-439e-adf0-ba5b33a539d3","Type":"ContainerDied","Data":"ed98f94bfdead02504eed65ca99332ec23131a2035d01ac3b9510e63c1aa80cd"} Feb 20 10:14:37 crc kubenswrapper[4962]: I0220 10:14:37.136136 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77d55b9c69-9hhv4" 
event={"ID":"54637a9a-7f3e-439e-adf0-ba5b33a539d3","Type":"ContainerStarted","Data":"48cb63ddc063927b98c0acd0bd8342b9608beaa369b2e5045eda23591cd6bfd4"} Feb 20 10:14:37 crc kubenswrapper[4962]: I0220 10:14:37.170376 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-r4hdf" podStartSLOduration=20.170335151 podStartE2EDuration="20.170335151s" podCreationTimestamp="2026-02-20 10:14:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:14:37.150309428 +0000 UTC m=+1168.732781274" watchObservedRunningTime="2026-02-20 10:14:37.170335151 +0000 UTC m=+1168.752806997" Feb 20 10:14:37 crc kubenswrapper[4962]: I0220 10:14:37.173608 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90cdf678-dd6c-4f3b-a675-4803eddcfc44" path="/var/lib/kubelet/pods/90cdf678-dd6c-4f3b-a675-4803eddcfc44/volumes" Feb 20 10:14:37 crc kubenswrapper[4962]: I0220 10:14:37.185783 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5d49a0b0-18e1-4701-9a94-5ff22700ffdf","Type":"ContainerStarted","Data":"80442f73d0bb0b1518e80cca1be32921f22e35f7b8a9442cfe4b3a67ae521feb"} Feb 20 10:14:37 crc kubenswrapper[4962]: I0220 10:14:37.212889 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-ffdf447d4-qtmvr" event={"ID":"0d2f2cbd-e871-48b8-acf1-b84c9c2abb58","Type":"ContainerStarted","Data":"859f954e19a5beb8c7855b8aa76e3e17aaf6038a1b74a860dcfada7a5fc974ac"} Feb 20 10:14:37 crc kubenswrapper[4962]: I0220 10:14:37.212937 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-ffdf447d4-qtmvr" event={"ID":"0d2f2cbd-e871-48b8-acf1-b84c9c2abb58","Type":"ContainerStarted","Data":"81c44508e7e551a0ec9263f4a7d0314158cbc47cdfa61ceff1466d2aef98334e"} Feb 20 10:14:37 crc kubenswrapper[4962]: I0220 10:14:37.453089 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 10:14:37 crc kubenswrapper[4962]: W0220 10:14:37.865123 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb8bed08_cb47_42cb_a192_2545a14e4c4b.slice/crio-9040b55e8e99f198b86e6b7541c702ac840e9ec2debbf8648bac866cfdc48248 WatchSource:0}: Error finding container 9040b55e8e99f198b86e6b7541c702ac840e9ec2debbf8648bac866cfdc48248: Status 404 returned error can't find the container with id 9040b55e8e99f198b86e6b7541c702ac840e9ec2debbf8648bac866cfdc48248 Feb 20 10:14:38 crc kubenswrapper[4962]: I0220 10:14:38.241562 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"eb8bed08-cb47-42cb-a192-2545a14e4c4b","Type":"ContainerStarted","Data":"9040b55e8e99f198b86e6b7541c702ac840e9ec2debbf8648bac866cfdc48248"} Feb 20 10:14:38 crc kubenswrapper[4962]: I0220 10:14:38.247030 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77d55b9c69-9hhv4" event={"ID":"54637a9a-7f3e-439e-adf0-ba5b33a539d3","Type":"ContainerStarted","Data":"04e63c3619f61031cb4ae8e56eba07f9d2e30a9dee2c65fe9e821777ffa0e563"} Feb 20 10:14:38 crc kubenswrapper[4962]: I0220 10:14:38.248478 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-77d55b9c69-9hhv4" Feb 20 10:14:38 crc kubenswrapper[4962]: I0220 10:14:38.250166 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-internal-api-0" event={"ID":"5d49a0b0-18e1-4701-9a94-5ff22700ffdf","Type":"ContainerStarted","Data":"eee28e4c70ffa00bd2365deec6d15fc3972d99a3f4797b06e78feaf11cf564f9"} Feb 20 10:14:38 crc kubenswrapper[4962]: I0220 10:14:38.258370 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-ffdf447d4-qtmvr" event={"ID":"0d2f2cbd-e871-48b8-acf1-b84c9c2abb58","Type":"ContainerStarted","Data":"58eaf7b7530713a5b8cc056bcb353fb3ce93d8f1f16e50ed82bd930298b3a574"} Feb 20 10:14:38 crc kubenswrapper[4962]: I0220 10:14:38.285241 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-77d55b9c69-9hhv4" podStartSLOduration=4.285214678 podStartE2EDuration="4.285214678s" podCreationTimestamp="2026-02-20 10:14:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:14:38.277549029 +0000 UTC m=+1169.860020875" watchObservedRunningTime="2026-02-20 10:14:38.285214678 +0000 UTC m=+1169.867686524" Feb 20 10:14:38 crc kubenswrapper[4962]: I0220 10:14:38.329172 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-747dfbc745-ndpzt"] Feb 20 10:14:38 crc kubenswrapper[4962]: E0220 10:14:38.329793 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90cdf678-dd6c-4f3b-a675-4803eddcfc44" containerName="init" Feb 20 10:14:38 crc kubenswrapper[4962]: I0220 10:14:38.329810 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="90cdf678-dd6c-4f3b-a675-4803eddcfc44" containerName="init" Feb 20 10:14:38 crc kubenswrapper[4962]: E0220 10:14:38.329823 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90cdf678-dd6c-4f3b-a675-4803eddcfc44" containerName="dnsmasq-dns" Feb 20 10:14:38 crc kubenswrapper[4962]: I0220 10:14:38.329829 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="90cdf678-dd6c-4f3b-a675-4803eddcfc44" containerName="dnsmasq-dns" Feb 20 10:14:38 crc kubenswrapper[4962]: I0220 10:14:38.330043 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="90cdf678-dd6c-4f3b-a675-4803eddcfc44" containerName="dnsmasq-dns" Feb 20 10:14:38 crc kubenswrapper[4962]: I0220 10:14:38.331093 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-747dfbc745-ndpzt" Feb 20 10:14:38 crc kubenswrapper[4962]: I0220 10:14:38.341297 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Feb 20 10:14:38 crc kubenswrapper[4962]: I0220 10:14:38.341484 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Feb 20 10:14:38 crc kubenswrapper[4962]: I0220 10:14:38.362931 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-747dfbc745-ndpzt"] Feb 20 10:14:38 crc kubenswrapper[4962]: I0220 10:14:38.369842 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-ffdf447d4-qtmvr" podStartSLOduration=4.369819358 podStartE2EDuration="4.369819358s" podCreationTimestamp="2026-02-20 10:14:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:14:38.334696216 +0000 UTC m=+1169.917168082" watchObservedRunningTime="2026-02-20 10:14:38.369819358 +0000 UTC m=+1169.952291204" Feb 20 10:14:38 crc kubenswrapper[4962]: I0220 10:14:38.514336 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4839dc9e-3bbd-48e3-b839-40929e67ce7a-config\") pod \"neutron-747dfbc745-ndpzt\" (UID: \"4839dc9e-3bbd-48e3-b839-40929e67ce7a\") " pod="openstack/neutron-747dfbc745-ndpzt" Feb 20 10:14:38 crc kubenswrapper[4962]: I0220 10:14:38.514431 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4839dc9e-3bbd-48e3-b839-40929e67ce7a-ovndb-tls-certs\") pod \"neutron-747dfbc745-ndpzt\" (UID: \"4839dc9e-3bbd-48e3-b839-40929e67ce7a\") " pod="openstack/neutron-747dfbc745-ndpzt" Feb 20 10:14:38 crc kubenswrapper[4962]: I0220 10:14:38.514457 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62jg5\" (UniqueName: \"kubernetes.io/projected/4839dc9e-3bbd-48e3-b839-40929e67ce7a-kube-api-access-62jg5\") pod \"neutron-747dfbc745-ndpzt\" (UID: \"4839dc9e-3bbd-48e3-b839-40929e67ce7a\") " pod="openstack/neutron-747dfbc745-ndpzt" Feb 20 10:14:38 crc kubenswrapper[4962]: I0220 10:14:38.514544 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4839dc9e-3bbd-48e3-b839-40929e67ce7a-combined-ca-bundle\") pod \"neutron-747dfbc745-ndpzt\" (UID: \"4839dc9e-3bbd-48e3-b839-40929e67ce7a\") " pod="openstack/neutron-747dfbc745-ndpzt" Feb 20 10:14:38 crc kubenswrapper[4962]: I0220 10:14:38.514572 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4839dc9e-3bbd-48e3-b839-40929e67ce7a-internal-tls-certs\") pod \"neutron-747dfbc745-ndpzt\" (UID: \"4839dc9e-3bbd-48e3-b839-40929e67ce7a\") " pod="openstack/neutron-747dfbc745-ndpzt" Feb 20 10:14:38 crc kubenswrapper[4962]: I0220 10:14:38.514616 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4839dc9e-3bbd-48e3-b839-40929e67ce7a-httpd-config\") pod \"neutron-747dfbc745-ndpzt\" (UID: \"4839dc9e-3bbd-48e3-b839-40929e67ce7a\") " pod="openstack/neutron-747dfbc745-ndpzt" Feb 20 10:14:38 crc 
kubenswrapper[4962]: I0220 10:14:38.514640 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4839dc9e-3bbd-48e3-b839-40929e67ce7a-public-tls-certs\") pod \"neutron-747dfbc745-ndpzt\" (UID: \"4839dc9e-3bbd-48e3-b839-40929e67ce7a\") " pod="openstack/neutron-747dfbc745-ndpzt" Feb 20 10:14:38 crc kubenswrapper[4962]: I0220 10:14:38.619335 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4839dc9e-3bbd-48e3-b839-40929e67ce7a-config\") pod \"neutron-747dfbc745-ndpzt\" (UID: \"4839dc9e-3bbd-48e3-b839-40929e67ce7a\") " pod="openstack/neutron-747dfbc745-ndpzt" Feb 20 10:14:38 crc kubenswrapper[4962]: I0220 10:14:38.619461 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4839dc9e-3bbd-48e3-b839-40929e67ce7a-ovndb-tls-certs\") pod \"neutron-747dfbc745-ndpzt\" (UID: \"4839dc9e-3bbd-48e3-b839-40929e67ce7a\") " pod="openstack/neutron-747dfbc745-ndpzt" Feb 20 10:14:38 crc kubenswrapper[4962]: I0220 10:14:38.619489 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62jg5\" (UniqueName: \"kubernetes.io/projected/4839dc9e-3bbd-48e3-b839-40929e67ce7a-kube-api-access-62jg5\") pod \"neutron-747dfbc745-ndpzt\" (UID: \"4839dc9e-3bbd-48e3-b839-40929e67ce7a\") " pod="openstack/neutron-747dfbc745-ndpzt" Feb 20 10:14:38 crc kubenswrapper[4962]: I0220 10:14:38.619822 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4839dc9e-3bbd-48e3-b839-40929e67ce7a-combined-ca-bundle\") pod \"neutron-747dfbc745-ndpzt\" (UID: \"4839dc9e-3bbd-48e3-b839-40929e67ce7a\") " pod="openstack/neutron-747dfbc745-ndpzt" Feb 20 10:14:38 crc kubenswrapper[4962]: I0220 10:14:38.619902 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4839dc9e-3bbd-48e3-b839-40929e67ce7a-internal-tls-certs\") pod \"neutron-747dfbc745-ndpzt\" (UID: \"4839dc9e-3bbd-48e3-b839-40929e67ce7a\") " pod="openstack/neutron-747dfbc745-ndpzt" Feb 20 10:14:38 crc kubenswrapper[4962]: I0220 10:14:38.619945 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4839dc9e-3bbd-48e3-b839-40929e67ce7a-httpd-config\") pod \"neutron-747dfbc745-ndpzt\" (UID: \"4839dc9e-3bbd-48e3-b839-40929e67ce7a\") " pod="openstack/neutron-747dfbc745-ndpzt" Feb 20 10:14:38 crc kubenswrapper[4962]: I0220 10:14:38.619979 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4839dc9e-3bbd-48e3-b839-40929e67ce7a-public-tls-certs\") pod \"neutron-747dfbc745-ndpzt\" (UID: \"4839dc9e-3bbd-48e3-b839-40929e67ce7a\") " pod="openstack/neutron-747dfbc745-ndpzt" Feb 20 10:14:38 crc kubenswrapper[4962]: I0220 10:14:38.631539 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4839dc9e-3bbd-48e3-b839-40929e67ce7a-public-tls-certs\") pod \"neutron-747dfbc745-ndpzt\" (UID: \"4839dc9e-3bbd-48e3-b839-40929e67ce7a\") " pod="openstack/neutron-747dfbc745-ndpzt" Feb 20 10:14:38 crc kubenswrapper[4962]: I0220 10:14:38.638694 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4839dc9e-3bbd-48e3-b839-40929e67ce7a-ovndb-tls-certs\") pod \"neutron-747dfbc745-ndpzt\" (UID: \"4839dc9e-3bbd-48e3-b839-40929e67ce7a\") " pod="openstack/neutron-747dfbc745-ndpzt" Feb 20 10:14:38 crc kubenswrapper[4962]: I0220 10:14:38.638771 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/4839dc9e-3bbd-48e3-b839-40929e67ce7a-config\") pod \"neutron-747dfbc745-ndpzt\" (UID: \"4839dc9e-3bbd-48e3-b839-40929e67ce7a\") " pod="openstack/neutron-747dfbc745-ndpzt" Feb 20 10:14:38 crc kubenswrapper[4962]: I0220 10:14:38.639464 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4839dc9e-3bbd-48e3-b839-40929e67ce7a-httpd-config\") pod \"neutron-747dfbc745-ndpzt\" (UID: \"4839dc9e-3bbd-48e3-b839-40929e67ce7a\") " pod="openstack/neutron-747dfbc745-ndpzt" Feb 20 10:14:38 crc kubenswrapper[4962]: I0220 10:14:38.641721 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4839dc9e-3bbd-48e3-b839-40929e67ce7a-internal-tls-certs\") pod \"neutron-747dfbc745-ndpzt\" (UID: \"4839dc9e-3bbd-48e3-b839-40929e67ce7a\") " pod="openstack/neutron-747dfbc745-ndpzt" Feb 20 10:14:38 crc kubenswrapper[4962]: I0220 10:14:38.641738 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4839dc9e-3bbd-48e3-b839-40929e67ce7a-combined-ca-bundle\") pod \"neutron-747dfbc745-ndpzt\" (UID: \"4839dc9e-3bbd-48e3-b839-40929e67ce7a\") " pod="openstack/neutron-747dfbc745-ndpzt" Feb 20 10:14:38 crc kubenswrapper[4962]: I0220 10:14:38.660417 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62jg5\" (UniqueName: \"kubernetes.io/projected/4839dc9e-3bbd-48e3-b839-40929e67ce7a-kube-api-access-62jg5\") pod \"neutron-747dfbc745-ndpzt\" (UID: \"4839dc9e-3bbd-48e3-b839-40929e67ce7a\") " pod="openstack/neutron-747dfbc745-ndpzt" Feb 20 10:14:38 crc kubenswrapper[4962]: I0220 10:14:38.733854 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-747dfbc745-ndpzt" Feb 20 10:14:39 crc kubenswrapper[4962]: I0220 10:14:39.284306 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5d49a0b0-18e1-4701-9a94-5ff22700ffdf","Type":"ContainerStarted","Data":"d20fc2b7dd54adff0d815d504246ad4e77027f8694190be99bf82bc96b1f4c9f"} Feb 20 10:14:39 crc kubenswrapper[4962]: I0220 10:14:39.293708 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f95ae2eb-8d20-4549-896d-e6991bfd1e06","Type":"ContainerStarted","Data":"c820ae4ce289a934f94f300bfd5be2c53a94a21d5b0d1a615bd01eca018a9cca"} Feb 20 10:14:39 crc kubenswrapper[4962]: I0220 10:14:39.330961 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"eb8bed08-cb47-42cb-a192-2545a14e4c4b","Type":"ContainerStarted","Data":"888c0df412b752895b61127294d75f746f54944c18df1a5600dd20d1b268288d"} Feb 20 10:14:39 crc kubenswrapper[4962]: I0220 10:14:39.334616 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=19.334573087 podStartE2EDuration="19.334573087s" podCreationTimestamp="2026-02-20 10:14:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:14:39.317238198 +0000 UTC m=+1170.899710054" watchObservedRunningTime="2026-02-20 10:14:39.334573087 +0000 UTC m=+1170.917044923" Feb 20 10:14:39 crc kubenswrapper[4962]: I0220 10:14:39.338275 4962 generic.go:334] "Generic (PLEG): container finished" podID="97e25820-62eb-4ad9-92ad-471c2f0f7ed4" containerID="6f635f1f56319fca1af13c4d65bb4a7c7d012f95348309539e66bb9bc3885680" exitCode=0 Feb 20 10:14:39 crc kubenswrapper[4962]: I0220 10:14:39.338349 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-mk67n" event={"ID":"97e25820-62eb-4ad9-92ad-471c2f0f7ed4","Type":"ContainerDied","Data":"6f635f1f56319fca1af13c4d65bb4a7c7d012f95348309539e66bb9bc3885680"} Feb 20 10:14:39 crc kubenswrapper[4962]: I0220 10:14:39.338834 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-ffdf447d4-qtmvr" Feb 20 10:14:39 crc kubenswrapper[4962]: I0220 10:14:39.639635 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-747dfbc745-ndpzt"] Feb 20 10:14:40 crc kubenswrapper[4962]: I0220 10:14:40.372299 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-747dfbc745-ndpzt" event={"ID":"4839dc9e-3bbd-48e3-b839-40929e67ce7a","Type":"ContainerStarted","Data":"9861804d9a2a53f03b608fd61261417b654233e3abb2bbc1678d9e37df3e329e"} Feb 20 10:14:40 crc kubenswrapper[4962]: I0220 10:14:40.823833 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-mk67n" Feb 20 10:14:40 crc kubenswrapper[4962]: I0220 10:14:40.918194 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mpbgw\" (UniqueName: \"kubernetes.io/projected/97e25820-62eb-4ad9-92ad-471c2f0f7ed4-kube-api-access-mpbgw\") pod \"97e25820-62eb-4ad9-92ad-471c2f0f7ed4\" (UID: \"97e25820-62eb-4ad9-92ad-471c2f0f7ed4\") " Feb 20 10:14:40 crc kubenswrapper[4962]: I0220 10:14:40.918339 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97e25820-62eb-4ad9-92ad-471c2f0f7ed4-combined-ca-bundle\") pod \"97e25820-62eb-4ad9-92ad-471c2f0f7ed4\" (UID: \"97e25820-62eb-4ad9-92ad-471c2f0f7ed4\") " Feb 20 10:14:40 crc kubenswrapper[4962]: I0220 10:14:40.918384 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97e25820-62eb-4ad9-92ad-471c2f0f7ed4-logs\") pod \"97e25820-62eb-4ad9-92ad-471c2f0f7ed4\" (UID: \"97e25820-62eb-4ad9-92ad-471c2f0f7ed4\") " Feb 20 10:14:40 crc kubenswrapper[4962]: I0220 10:14:40.918623 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97e25820-62eb-4ad9-92ad-471c2f0f7ed4-config-data\") pod \"97e25820-62eb-4ad9-92ad-471c2f0f7ed4\" (UID: \"97e25820-62eb-4ad9-92ad-471c2f0f7ed4\") " Feb 20 10:14:40 crc kubenswrapper[4962]: I0220 10:14:40.918731 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97e25820-62eb-4ad9-92ad-471c2f0f7ed4-scripts\") pod \"97e25820-62eb-4ad9-92ad-471c2f0f7ed4\" (UID: \"97e25820-62eb-4ad9-92ad-471c2f0f7ed4\") " Feb 20 10:14:40 crc kubenswrapper[4962]: I0220 10:14:40.918901 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97e25820-62eb-4ad9-92ad-471c2f0f7ed4-logs" (OuterVolumeSpecName: "logs") pod "97e25820-62eb-4ad9-92ad-471c2f0f7ed4" (UID: "97e25820-62eb-4ad9-92ad-471c2f0f7ed4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:14:40 crc kubenswrapper[4962]: I0220 10:14:40.919456 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97e25820-62eb-4ad9-92ad-471c2f0f7ed4-logs\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:40 crc kubenswrapper[4962]: I0220 10:14:40.925841 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97e25820-62eb-4ad9-92ad-471c2f0f7ed4-kube-api-access-mpbgw" (OuterVolumeSpecName: "kube-api-access-mpbgw") pod "97e25820-62eb-4ad9-92ad-471c2f0f7ed4" (UID: "97e25820-62eb-4ad9-92ad-471c2f0f7ed4"). InnerVolumeSpecName "kube-api-access-mpbgw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:14:40 crc kubenswrapper[4962]: I0220 10:14:40.926222 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97e25820-62eb-4ad9-92ad-471c2f0f7ed4-scripts" (OuterVolumeSpecName: "scripts") pod "97e25820-62eb-4ad9-92ad-471c2f0f7ed4" (UID: "97e25820-62eb-4ad9-92ad-471c2f0f7ed4"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:14:40 crc kubenswrapper[4962]: I0220 10:14:40.948466 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97e25820-62eb-4ad9-92ad-471c2f0f7ed4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "97e25820-62eb-4ad9-92ad-471c2f0f7ed4" (UID: "97e25820-62eb-4ad9-92ad-471c2f0f7ed4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:14:40 crc kubenswrapper[4962]: I0220 10:14:40.950809 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97e25820-62eb-4ad9-92ad-471c2f0f7ed4-config-data" (OuterVolumeSpecName: "config-data") pod "97e25820-62eb-4ad9-92ad-471c2f0f7ed4" (UID: "97e25820-62eb-4ad9-92ad-471c2f0f7ed4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.022404 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97e25820-62eb-4ad9-92ad-471c2f0f7ed4-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.022457 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97e25820-62eb-4ad9-92ad-471c2f0f7ed4-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.022471 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mpbgw\" (UniqueName: \"kubernetes.io/projected/97e25820-62eb-4ad9-92ad-471c2f0f7ed4-kube-api-access-mpbgw\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.022491 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97e25820-62eb-4ad9-92ad-471c2f0f7ed4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.216181 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.216470 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.261124 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.271197 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.392201 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-747dfbc745-ndpzt" event={"ID":"4839dc9e-3bbd-48e3-b839-40929e67ce7a","Type":"ContainerStarted","Data":"ef8879302adbb806976eb02b6043e51e7d9091d75bf4488fcea19123656b2441"} Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.407604 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-mk67n" event={"ID":"97e25820-62eb-4ad9-92ad-471c2f0f7ed4","Type":"ContainerDied","Data":"15539448337a5b961d9b0ef7e9cec1129487956e6df6174e8cd859d99a2fb5ff"} Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.407666 4962 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="15539448337a5b961d9b0ef7e9cec1129487956e6df6174e8cd859d99a2fb5ff" Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.407939 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-mk67n" Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.432465 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"eb8bed08-cb47-42cb-a192-2545a14e4c4b","Type":"ContainerStarted","Data":"ed6b084d34e4657f5b3865a48d0866534c9fe6dd73d3021c88e80799cfa08dc0"} Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.445656 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-r4hdf" event={"ID":"37eccece-549c-4b2f-b066-481b216d7ece","Type":"ContainerDied","Data":"f9cb69ce2f5869e2d5aa8f13c96033f3ed4a62ca0344285f07875e14d0de4351"} Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.445577 4962 generic.go:334] "Generic (PLEG): container finished" podID="37eccece-549c-4b2f-b066-481b216d7ece" containerID="f9cb69ce2f5869e2d5aa8f13c96033f3ed4a62ca0344285f07875e14d0de4351" exitCode=0 Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.447821 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.447862 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.497864 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-755cb8b5f4-zlzbb"] Feb 20 10:14:41 crc kubenswrapper[4962]: E0220 10:14:41.498468 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97e25820-62eb-4ad9-92ad-471c2f0f7ed4" containerName="placement-db-sync" Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.498490 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="97e25820-62eb-4ad9-92ad-471c2f0f7ed4" containerName="placement-db-sync" Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.498965 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="97e25820-62eb-4ad9-92ad-471c2f0f7ed4" containerName="placement-db-sync" Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.500281 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-755cb8b5f4-zlzbb" Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.506721 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.506964 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.506983 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.507271 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.509006 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-gtm5t" Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.544232 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-755cb8b5f4-zlzbb"] Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.647829 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1b02597-c246-43dc-bd85-bebc40c70abf-internal-tls-certs\") pod \"placement-755cb8b5f4-zlzbb\" (UID: \"b1b02597-c246-43dc-bd85-bebc40c70abf\") " pod="openstack/placement-755cb8b5f4-zlzbb" Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.647883 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1b02597-c246-43dc-bd85-bebc40c70abf-scripts\") pod \"placement-755cb8b5f4-zlzbb\" (UID: \"b1b02597-c246-43dc-bd85-bebc40c70abf\") " pod="openstack/placement-755cb8b5f4-zlzbb" Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.647954 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1b02597-c246-43dc-bd85-bebc40c70abf-config-data\") pod \"placement-755cb8b5f4-zlzbb\" (UID: \"b1b02597-c246-43dc-bd85-bebc40c70abf\") " pod="openstack/placement-755cb8b5f4-zlzbb" Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.647979 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b1b02597-c246-43dc-bd85-bebc40c70abf-logs\") pod \"placement-755cb8b5f4-zlzbb\" (UID: \"b1b02597-c246-43dc-bd85-bebc40c70abf\") " pod="openstack/placement-755cb8b5f4-zlzbb" Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.648027 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1b02597-c246-43dc-bd85-bebc40c70abf-public-tls-certs\") pod \"placement-755cb8b5f4-zlzbb\" (UID: \"b1b02597-c246-43dc-bd85-bebc40c70abf\") " pod="openstack/placement-755cb8b5f4-zlzbb" Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.648069 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1b02597-c246-43dc-bd85-bebc40c70abf-combined-ca-bundle\") pod \"placement-755cb8b5f4-zlzbb\" (UID: \"b1b02597-c246-43dc-bd85-bebc40c70abf\") " pod="openstack/placement-755cb8b5f4-zlzbb" Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.648110 4962 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ch9vq\" (UniqueName: \"kubernetes.io/projected/b1b02597-c246-43dc-bd85-bebc40c70abf-kube-api-access-ch9vq\") pod \"placement-755cb8b5f4-zlzbb\" (UID: \"b1b02597-c246-43dc-bd85-bebc40c70abf\") " pod="openstack/placement-755cb8b5f4-zlzbb" Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.750087 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1b02597-c246-43dc-bd85-bebc40c70abf-config-data\") pod \"placement-755cb8b5f4-zlzbb\" (UID: \"b1b02597-c246-43dc-bd85-bebc40c70abf\") " pod="openstack/placement-755cb8b5f4-zlzbb" Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.750151 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b1b02597-c246-43dc-bd85-bebc40c70abf-logs\") pod \"placement-755cb8b5f4-zlzbb\" (UID: \"b1b02597-c246-43dc-bd85-bebc40c70abf\") " pod="openstack/placement-755cb8b5f4-zlzbb" Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.750209 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1b02597-c246-43dc-bd85-bebc40c70abf-public-tls-certs\") pod \"placement-755cb8b5f4-zlzbb\" (UID: \"b1b02597-c246-43dc-bd85-bebc40c70abf\") " pod="openstack/placement-755cb8b5f4-zlzbb" Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.750246 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1b02597-c246-43dc-bd85-bebc40c70abf-combined-ca-bundle\") pod \"placement-755cb8b5f4-zlzbb\" (UID: \"b1b02597-c246-43dc-bd85-bebc40c70abf\") " pod="openstack/placement-755cb8b5f4-zlzbb" Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.750286 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ch9vq\" (UniqueName: \"kubernetes.io/projected/b1b02597-c246-43dc-bd85-bebc40c70abf-kube-api-access-ch9vq\") pod \"placement-755cb8b5f4-zlzbb\" (UID: \"b1b02597-c246-43dc-bd85-bebc40c70abf\") " pod="openstack/placement-755cb8b5f4-zlzbb" Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.750308 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1b02597-c246-43dc-bd85-bebc40c70abf-internal-tls-certs\") pod \"placement-755cb8b5f4-zlzbb\" (UID: \"b1b02597-c246-43dc-bd85-bebc40c70abf\") " pod="openstack/placement-755cb8b5f4-zlzbb" Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.750332 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1b02597-c246-43dc-bd85-bebc40c70abf-scripts\") pod \"placement-755cb8b5f4-zlzbb\" (UID: \"b1b02597-c246-43dc-bd85-bebc40c70abf\") " pod="openstack/placement-755cb8b5f4-zlzbb" Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.752159 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b1b02597-c246-43dc-bd85-bebc40c70abf-logs\") pod \"placement-755cb8b5f4-zlzbb\" (UID: \"b1b02597-c246-43dc-bd85-bebc40c70abf\") " pod="openstack/placement-755cb8b5f4-zlzbb" Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.756847 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/b1b02597-c246-43dc-bd85-bebc40c70abf-internal-tls-certs\") pod \"placement-755cb8b5f4-zlzbb\" (UID: \"b1b02597-c246-43dc-bd85-bebc40c70abf\") " pod="openstack/placement-755cb8b5f4-zlzbb" Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.757866 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1b02597-c246-43dc-bd85-bebc40c70abf-combined-ca-bundle\") pod \"placement-755cb8b5f4-zlzbb\" (UID: \"b1b02597-c246-43dc-bd85-bebc40c70abf\") " pod="openstack/placement-755cb8b5f4-zlzbb" Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.759956 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1b02597-c246-43dc-bd85-bebc40c70abf-public-tls-certs\") pod \"placement-755cb8b5f4-zlzbb\" (UID: \"b1b02597-c246-43dc-bd85-bebc40c70abf\") " pod="openstack/placement-755cb8b5f4-zlzbb" Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.760156 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1b02597-c246-43dc-bd85-bebc40c70abf-config-data\") pod \"placement-755cb8b5f4-zlzbb\" (UID: \"b1b02597-c246-43dc-bd85-bebc40c70abf\") " pod="openstack/placement-755cb8b5f4-zlzbb" Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.760982 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1b02597-c246-43dc-bd85-bebc40c70abf-scripts\") pod \"placement-755cb8b5f4-zlzbb\" (UID: \"b1b02597-c246-43dc-bd85-bebc40c70abf\") " pod="openstack/placement-755cb8b5f4-zlzbb" Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.770438 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ch9vq\" (UniqueName: \"kubernetes.io/projected/b1b02597-c246-43dc-bd85-bebc40c70abf-kube-api-access-ch9vq\") pod \"placement-755cb8b5f4-zlzbb\" (UID: \"b1b02597-c246-43dc-bd85-bebc40c70abf\") " pod="openstack/placement-755cb8b5f4-zlzbb" Feb 20 10:14:41 crc kubenswrapper[4962]: I0220 10:14:41.828240 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-755cb8b5f4-zlzbb" Feb 20 10:14:42 crc kubenswrapper[4962]: I0220 10:14:42.472057 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-747dfbc745-ndpzt" event={"ID":"4839dc9e-3bbd-48e3-b839-40929e67ce7a","Type":"ContainerStarted","Data":"bae43663ef81835d7b00f29e6ed99c794aa21733e839a0ebe0e90aee6573888f"} Feb 20 10:14:42 crc kubenswrapper[4962]: I0220 10:14:42.472893 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-747dfbc745-ndpzt" Feb 20 10:14:42 crc kubenswrapper[4962]: I0220 10:14:42.472950 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-755cb8b5f4-zlzbb"] Feb 20 10:14:42 crc kubenswrapper[4962]: I0220 10:14:42.514924 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=19.514897468 podStartE2EDuration="19.514897468s" podCreationTimestamp="2026-02-20 10:14:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:14:42.510960555 +0000 UTC m=+1174.093432411" watchObservedRunningTime="2026-02-20 10:14:42.514897468 +0000 UTC m=+1174.097369314" Feb 20 10:14:42 crc kubenswrapper[4962]: I0220 10:14:42.554157 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-747dfbc745-ndpzt" podStartSLOduration=4.554135908 podStartE2EDuration="4.554135908s" podCreationTimestamp="2026-02-20 10:14:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:14:42.547272514 +0000 UTC m=+1174.129744360" watchObservedRunningTime="2026-02-20 10:14:42.554135908 +0000 UTC m=+1174.136607754" Feb 20 10:14:43 crc kubenswrapper[4962]: I0220 10:14:43.486319 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-755cb8b5f4-zlzbb" event={"ID":"b1b02597-c246-43dc-bd85-bebc40c70abf","Type":"ContainerStarted","Data":"c4560e14774e3c9741c91f46ea630363e7cc5935a06c720a5d083bca786e716f"} Feb 20 10:14:43 crc kubenswrapper[4962]: I0220 10:14:43.486900 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-755cb8b5f4-zlzbb" event={"ID":"b1b02597-c246-43dc-bd85-bebc40c70abf","Type":"ContainerStarted","Data":"278c9072e567ac676f1ff447db5bfcb24f5eba477a61436baf57c6f5bf95aba9"} Feb 20 10:14:44 crc kubenswrapper[4962]: I0220 10:14:44.271974 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 20 10:14:44 crc kubenswrapper[4962]: I0220 10:14:44.272528 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 20 10:14:44 crc kubenswrapper[4962]: I0220 10:14:44.327121 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 20 10:14:44 crc kubenswrapper[4962]: I0220 10:14:44.340242 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 20 10:14:44 crc kubenswrapper[4962]: I0220 10:14:44.497989 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 20 10:14:44 crc kubenswrapper[4962]: I0220 10:14:44.498069 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/glance-default-external-api-0" Feb 20 10:14:44 crc kubenswrapper[4962]: I0220 10:14:44.671468 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 20 10:14:44 crc kubenswrapper[4962]: I0220 10:14:44.683789 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.099498 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-r4hdf" Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.203755 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-77d55b9c69-9hhv4" Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.227405 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37eccece-549c-4b2f-b066-481b216d7ece-scripts\") pod \"37eccece-549c-4b2f-b066-481b216d7ece\" (UID: \"37eccece-549c-4b2f-b066-481b216d7ece\") " Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.227499 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h688g\" (UniqueName: \"kubernetes.io/projected/37eccece-549c-4b2f-b066-481b216d7ece-kube-api-access-h688g\") pod \"37eccece-549c-4b2f-b066-481b216d7ece\" (UID: \"37eccece-549c-4b2f-b066-481b216d7ece\") " Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.227742 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/37eccece-549c-4b2f-b066-481b216d7ece-credential-keys\") pod \"37eccece-549c-4b2f-b066-481b216d7ece\" (UID: \"37eccece-549c-4b2f-b066-481b216d7ece\") " Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.227764 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37eccece-549c-4b2f-b066-481b216d7ece-combined-ca-bundle\") pod \"37eccece-549c-4b2f-b066-481b216d7ece\" (UID: \"37eccece-549c-4b2f-b066-481b216d7ece\") " Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.227801 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37eccece-549c-4b2f-b066-481b216d7ece-config-data\") pod \"37eccece-549c-4b2f-b066-481b216d7ece\" (UID: \"37eccece-549c-4b2f-b066-481b216d7ece\") " Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.227934 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/37eccece-549c-4b2f-b066-481b216d7ece-fernet-keys\") pod \"37eccece-549c-4b2f-b066-481b216d7ece\" (UID: \"37eccece-549c-4b2f-b066-481b216d7ece\") " Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.284568 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37eccece-549c-4b2f-b066-481b216d7ece-kube-api-access-h688g" (OuterVolumeSpecName: "kube-api-access-h688g") pod "37eccece-549c-4b2f-b066-481b216d7ece" (UID: "37eccece-549c-4b2f-b066-481b216d7ece"). InnerVolumeSpecName "kube-api-access-h688g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.285744 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37eccece-549c-4b2f-b066-481b216d7ece-scripts" (OuterVolumeSpecName: "scripts") pod "37eccece-549c-4b2f-b066-481b216d7ece" (UID: "37eccece-549c-4b2f-b066-481b216d7ece"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.285859 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37eccece-549c-4b2f-b066-481b216d7ece-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "37eccece-549c-4b2f-b066-481b216d7ece" (UID: "37eccece-549c-4b2f-b066-481b216d7ece"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.292744 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37eccece-549c-4b2f-b066-481b216d7ece-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "37eccece-549c-4b2f-b066-481b216d7ece" (UID: "37eccece-549c-4b2f-b066-481b216d7ece"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.308739 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68bc8f6695-d6bm6"] Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.309009 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-68bc8f6695-d6bm6" podUID="857e020f-54d5-4980-90a2-f19d6f8b5008" containerName="dnsmasq-dns" containerID="cri-o://9cd5bd763453d1cd2f676217b16b284051d742fb526b5a0bebd50656b842e234" gracePeriod=10 Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.334442 4962 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/37eccece-549c-4b2f-b066-481b216d7ece-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.334482 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37eccece-549c-4b2f-b066-481b216d7ece-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.334492 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h688g\" (UniqueName: \"kubernetes.io/projected/37eccece-549c-4b2f-b066-481b216d7ece-kube-api-access-h688g\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.334501 4962 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/37eccece-549c-4b2f-b066-481b216d7ece-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.377997 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37eccece-549c-4b2f-b066-481b216d7ece-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "37eccece-549c-4b2f-b066-481b216d7ece" (UID: "37eccece-549c-4b2f-b066-481b216d7ece"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.427275 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37eccece-549c-4b2f-b066-481b216d7ece-config-data" (OuterVolumeSpecName: "config-data") pod "37eccece-549c-4b2f-b066-481b216d7ece" (UID: "37eccece-549c-4b2f-b066-481b216d7ece"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.437312 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37eccece-549c-4b2f-b066-481b216d7ece-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.437364 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37eccece-549c-4b2f-b066-481b216d7ece-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.512062 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-755cb8b5f4-zlzbb" event={"ID":"b1b02597-c246-43dc-bd85-bebc40c70abf","Type":"ContainerStarted","Data":"bfe2a2311075991b6e26f61913d5319a6a3da98a5127862535ec8779ac2e9fce"} Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.512496 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-755cb8b5f4-zlzbb" Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.512540 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-755cb8b5f4-zlzbb" Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.521280 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-r4hdf" event={"ID":"37eccece-549c-4b2f-b066-481b216d7ece","Type":"ContainerDied","Data":"110ade6ee18c3222118e3294958d83a3df26c7aeb2099802b229b066aa852315"} Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.521338 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="110ade6ee18c3222118e3294958d83a3df26c7aeb2099802b229b066aa852315" Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.521388 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-r4hdf" Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.532847 4962 generic.go:334] "Generic (PLEG): container finished" podID="857e020f-54d5-4980-90a2-f19d6f8b5008" containerID="9cd5bd763453d1cd2f676217b16b284051d742fb526b5a0bebd50656b842e234" exitCode=0 Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.533130 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68bc8f6695-d6bm6" event={"ID":"857e020f-54d5-4980-90a2-f19d6f8b5008","Type":"ContainerDied","Data":"9cd5bd763453d1cd2f676217b16b284051d742fb526b5a0bebd50656b842e234"} Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.548206 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f95ae2eb-8d20-4549-896d-e6991bfd1e06","Type":"ContainerStarted","Data":"3c6ea7a933fafcb8f42d0dff736a28242f018ae0cea296d99186f98ada602b85"} Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.550423 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-755cb8b5f4-zlzbb" podStartSLOduration=4.550402626 podStartE2EDuration="4.550402626s" podCreationTimestamp="2026-02-20 10:14:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:14:45.539855918 +0000 UTC m=+1177.122327764" watchObservedRunningTime="2026-02-20 10:14:45.550402626 +0000 UTC m=+1177.132874472" Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.781209 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68bc8f6695-d6bm6" Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.844394 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/857e020f-54d5-4980-90a2-f19d6f8b5008-config\") pod \"857e020f-54d5-4980-90a2-f19d6f8b5008\" (UID: \"857e020f-54d5-4980-90a2-f19d6f8b5008\") " Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.844515 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/857e020f-54d5-4980-90a2-f19d6f8b5008-dns-svc\") pod \"857e020f-54d5-4980-90a2-f19d6f8b5008\" (UID: \"857e020f-54d5-4980-90a2-f19d6f8b5008\") " Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.844547 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/857e020f-54d5-4980-90a2-f19d6f8b5008-ovsdbserver-nb\") pod \"857e020f-54d5-4980-90a2-f19d6f8b5008\" (UID: \"857e020f-54d5-4980-90a2-f19d6f8b5008\") " Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.845379 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/857e020f-54d5-4980-90a2-f19d6f8b5008-ovsdbserver-sb\") pod \"857e020f-54d5-4980-90a2-f19d6f8b5008\" (UID: \"857e020f-54d5-4980-90a2-f19d6f8b5008\") " Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.845425 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/857e020f-54d5-4980-90a2-f19d6f8b5008-dns-swift-storage-0\") pod \"857e020f-54d5-4980-90a2-f19d6f8b5008\" (UID: \"857e020f-54d5-4980-90a2-f19d6f8b5008\") " Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.845530 4962 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-f5knl\" (UniqueName: \"kubernetes.io/projected/857e020f-54d5-4980-90a2-f19d6f8b5008-kube-api-access-f5knl\") pod \"857e020f-54d5-4980-90a2-f19d6f8b5008\" (UID: \"857e020f-54d5-4980-90a2-f19d6f8b5008\") " Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.849876 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/857e020f-54d5-4980-90a2-f19d6f8b5008-kube-api-access-f5knl" (OuterVolumeSpecName: "kube-api-access-f5knl") pod "857e020f-54d5-4980-90a2-f19d6f8b5008" (UID: "857e020f-54d5-4980-90a2-f19d6f8b5008"). InnerVolumeSpecName "kube-api-access-f5knl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.902088 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/857e020f-54d5-4980-90a2-f19d6f8b5008-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "857e020f-54d5-4980-90a2-f19d6f8b5008" (UID: "857e020f-54d5-4980-90a2-f19d6f8b5008"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.908884 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/857e020f-54d5-4980-90a2-f19d6f8b5008-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "857e020f-54d5-4980-90a2-f19d6f8b5008" (UID: "857e020f-54d5-4980-90a2-f19d6f8b5008"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.911407 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/857e020f-54d5-4980-90a2-f19d6f8b5008-config" (OuterVolumeSpecName: "config") pod "857e020f-54d5-4980-90a2-f19d6f8b5008" (UID: "857e020f-54d5-4980-90a2-f19d6f8b5008"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.913170 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/857e020f-54d5-4980-90a2-f19d6f8b5008-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "857e020f-54d5-4980-90a2-f19d6f8b5008" (UID: "857e020f-54d5-4980-90a2-f19d6f8b5008"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.927118 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/857e020f-54d5-4980-90a2-f19d6f8b5008-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "857e020f-54d5-4980-90a2-f19d6f8b5008" (UID: "857e020f-54d5-4980-90a2-f19d6f8b5008"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.947323 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/857e020f-54d5-4980-90a2-f19d6f8b5008-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.947444 4962 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/857e020f-54d5-4980-90a2-f19d6f8b5008-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.947516 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f5knl\" (UniqueName: \"kubernetes.io/projected/857e020f-54d5-4980-90a2-f19d6f8b5008-kube-api-access-f5knl\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.947575 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/857e020f-54d5-4980-90a2-f19d6f8b5008-config\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.947688 4962 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/857e020f-54d5-4980-90a2-f19d6f8b5008-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:45 crc kubenswrapper[4962]: I0220 10:14:45.947755 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/857e020f-54d5-4980-90a2-f19d6f8b5008-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:46 crc kubenswrapper[4962]: I0220 10:14:46.275106 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-6b4c54c5d9-pqd8r"] Feb 20 10:14:46 crc kubenswrapper[4962]: E0220 10:14:46.275792 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="857e020f-54d5-4980-90a2-f19d6f8b5008" containerName="dnsmasq-dns" Feb 20 10:14:46 crc kubenswrapper[4962]: I0220 10:14:46.275868 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="857e020f-54d5-4980-90a2-f19d6f8b5008" containerName="dnsmasq-dns" Feb 20 10:14:46 crc kubenswrapper[4962]: E0220 10:14:46.275931 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="857e020f-54d5-4980-90a2-f19d6f8b5008" containerName="init" Feb 20 10:14:46 crc kubenswrapper[4962]: I0220 10:14:46.275993 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="857e020f-54d5-4980-90a2-f19d6f8b5008" containerName="init" Feb 20 10:14:46 crc kubenswrapper[4962]: E0220 10:14:46.276092 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37eccece-549c-4b2f-b066-481b216d7ece" containerName="keystone-bootstrap" Feb 20 10:14:46 crc kubenswrapper[4962]: I0220 10:14:46.276156 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="37eccece-549c-4b2f-b066-481b216d7ece" containerName="keystone-bootstrap" Feb 20 10:14:46 crc kubenswrapper[4962]: I0220 10:14:46.276414 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="857e020f-54d5-4980-90a2-f19d6f8b5008" containerName="dnsmasq-dns" Feb 20 10:14:46 crc kubenswrapper[4962]: I0220 10:14:46.276480 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="37eccece-549c-4b2f-b066-481b216d7ece" containerName="keystone-bootstrap" Feb 20 10:14:46 crc kubenswrapper[4962]: I0220 10:14:46.277140 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-6b4c54c5d9-pqd8r" Feb 20 10:14:46 crc kubenswrapper[4962]: I0220 10:14:46.287777 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 20 10:14:46 crc kubenswrapper[4962]: I0220 10:14:46.288161 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 20 10:14:46 crc kubenswrapper[4962]: I0220 10:14:46.288489 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 20 10:14:46 crc kubenswrapper[4962]: I0220 10:14:46.288567 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-76ldt" Feb 20 10:14:46 crc kubenswrapper[4962]: I0220 10:14:46.288818 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Feb 20 10:14:46 crc kubenswrapper[4962]: I0220 10:14:46.289333 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Feb 20 10:14:46 crc kubenswrapper[4962]: I0220 10:14:46.313605 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6b4c54c5d9-pqd8r"] Feb 20 10:14:46 crc kubenswrapper[4962]: I0220 10:14:46.353620 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d203fc44-5252-4dd2-98ae-66f9c139b5f5-public-tls-certs\") pod \"keystone-6b4c54c5d9-pqd8r\" (UID: \"d203fc44-5252-4dd2-98ae-66f9c139b5f5\") " pod="openstack/keystone-6b4c54c5d9-pqd8r" Feb 20 10:14:46 crc kubenswrapper[4962]: I0220 10:14:46.353690 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzvw9\" (UniqueName: \"kubernetes.io/projected/d203fc44-5252-4dd2-98ae-66f9c139b5f5-kube-api-access-gzvw9\") pod \"keystone-6b4c54c5d9-pqd8r\" (UID: \"d203fc44-5252-4dd2-98ae-66f9c139b5f5\") " pod="openstack/keystone-6b4c54c5d9-pqd8r" Feb 20 10:14:46 crc kubenswrapper[4962]: I0220 10:14:46.353749 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d203fc44-5252-4dd2-98ae-66f9c139b5f5-combined-ca-bundle\") pod \"keystone-6b4c54c5d9-pqd8r\" (UID: \"d203fc44-5252-4dd2-98ae-66f9c139b5f5\") " pod="openstack/keystone-6b4c54c5d9-pqd8r" Feb 20 10:14:46 crc kubenswrapper[4962]: I0220 10:14:46.353793 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d203fc44-5252-4dd2-98ae-66f9c139b5f5-scripts\") pod \"keystone-6b4c54c5d9-pqd8r\" (UID: \"d203fc44-5252-4dd2-98ae-66f9c139b5f5\") " pod="openstack/keystone-6b4c54c5d9-pqd8r" Feb 20 10:14:46 crc kubenswrapper[4962]: I0220 10:14:46.353826 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d203fc44-5252-4dd2-98ae-66f9c139b5f5-internal-tls-certs\") pod \"keystone-6b4c54c5d9-pqd8r\" (UID: \"d203fc44-5252-4dd2-98ae-66f9c139b5f5\") " pod="openstack/keystone-6b4c54c5d9-pqd8r" Feb 20 10:14:46 crc kubenswrapper[4962]: I0220 10:14:46.353872 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d203fc44-5252-4dd2-98ae-66f9c139b5f5-config-data\") pod \"keystone-6b4c54c5d9-pqd8r\" (UID: 
\"d203fc44-5252-4dd2-98ae-66f9c139b5f5\") " pod="openstack/keystone-6b4c54c5d9-pqd8r" Feb 20 10:14:46 crc kubenswrapper[4962]: I0220 10:14:46.353912 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d203fc44-5252-4dd2-98ae-66f9c139b5f5-fernet-keys\") pod \"keystone-6b4c54c5d9-pqd8r\" (UID: \"d203fc44-5252-4dd2-98ae-66f9c139b5f5\") " pod="openstack/keystone-6b4c54c5d9-pqd8r" Feb 20 10:14:46 crc kubenswrapper[4962]: I0220 10:14:46.353938 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d203fc44-5252-4dd2-98ae-66f9c139b5f5-credential-keys\") pod \"keystone-6b4c54c5d9-pqd8r\" (UID: \"d203fc44-5252-4dd2-98ae-66f9c139b5f5\") " pod="openstack/keystone-6b4c54c5d9-pqd8r" Feb 20 10:14:46 crc kubenswrapper[4962]: I0220 10:14:46.455063 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d203fc44-5252-4dd2-98ae-66f9c139b5f5-public-tls-certs\") pod \"keystone-6b4c54c5d9-pqd8r\" (UID: \"d203fc44-5252-4dd2-98ae-66f9c139b5f5\") " pod="openstack/keystone-6b4c54c5d9-pqd8r" Feb 20 10:14:46 crc kubenswrapper[4962]: I0220 10:14:46.455468 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzvw9\" (UniqueName: \"kubernetes.io/projected/d203fc44-5252-4dd2-98ae-66f9c139b5f5-kube-api-access-gzvw9\") pod \"keystone-6b4c54c5d9-pqd8r\" (UID: \"d203fc44-5252-4dd2-98ae-66f9c139b5f5\") " pod="openstack/keystone-6b4c54c5d9-pqd8r" Feb 20 10:14:46 crc kubenswrapper[4962]: I0220 10:14:46.455635 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d203fc44-5252-4dd2-98ae-66f9c139b5f5-combined-ca-bundle\") pod \"keystone-6b4c54c5d9-pqd8r\" (UID: \"d203fc44-5252-4dd2-98ae-66f9c139b5f5\") " pod="openstack/keystone-6b4c54c5d9-pqd8r" Feb 20 10:14:46 crc kubenswrapper[4962]: I0220 10:14:46.455790 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d203fc44-5252-4dd2-98ae-66f9c139b5f5-scripts\") pod \"keystone-6b4c54c5d9-pqd8r\" (UID: \"d203fc44-5252-4dd2-98ae-66f9c139b5f5\") " pod="openstack/keystone-6b4c54c5d9-pqd8r" Feb 20 10:14:46 crc kubenswrapper[4962]: I0220 10:14:46.455913 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d203fc44-5252-4dd2-98ae-66f9c139b5f5-internal-tls-certs\") pod \"keystone-6b4c54c5d9-pqd8r\" (UID: \"d203fc44-5252-4dd2-98ae-66f9c139b5f5\") " pod="openstack/keystone-6b4c54c5d9-pqd8r" Feb 20 10:14:46 crc kubenswrapper[4962]: I0220 10:14:46.456047 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d203fc44-5252-4dd2-98ae-66f9c139b5f5-config-data\") pod \"keystone-6b4c54c5d9-pqd8r\" (UID: \"d203fc44-5252-4dd2-98ae-66f9c139b5f5\") " pod="openstack/keystone-6b4c54c5d9-pqd8r" Feb 20 10:14:46 crc kubenswrapper[4962]: I0220 10:14:46.456171 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d203fc44-5252-4dd2-98ae-66f9c139b5f5-fernet-keys\") pod \"keystone-6b4c54c5d9-pqd8r\" (UID: \"d203fc44-5252-4dd2-98ae-66f9c139b5f5\") " pod="openstack/keystone-6b4c54c5d9-pqd8r" Feb 
20 10:14:46 crc kubenswrapper[4962]: I0220 10:14:46.456326 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d203fc44-5252-4dd2-98ae-66f9c139b5f5-credential-keys\") pod \"keystone-6b4c54c5d9-pqd8r\" (UID: \"d203fc44-5252-4dd2-98ae-66f9c139b5f5\") " pod="openstack/keystone-6b4c54c5d9-pqd8r" Feb 20 10:14:46 crc kubenswrapper[4962]: I0220 10:14:46.462720 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d203fc44-5252-4dd2-98ae-66f9c139b5f5-fernet-keys\") pod \"keystone-6b4c54c5d9-pqd8r\" (UID: \"d203fc44-5252-4dd2-98ae-66f9c139b5f5\") " pod="openstack/keystone-6b4c54c5d9-pqd8r" Feb 20 10:14:46 crc kubenswrapper[4962]: I0220 10:14:46.462738 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d203fc44-5252-4dd2-98ae-66f9c139b5f5-config-data\") pod \"keystone-6b4c54c5d9-pqd8r\" (UID: \"d203fc44-5252-4dd2-98ae-66f9c139b5f5\") " pod="openstack/keystone-6b4c54c5d9-pqd8r" Feb 20 10:14:46 crc kubenswrapper[4962]: I0220 10:14:46.463484 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d203fc44-5252-4dd2-98ae-66f9c139b5f5-credential-keys\") pod \"keystone-6b4c54c5d9-pqd8r\" (UID: \"d203fc44-5252-4dd2-98ae-66f9c139b5f5\") " pod="openstack/keystone-6b4c54c5d9-pqd8r" Feb 20 10:14:46 crc kubenswrapper[4962]: I0220 10:14:46.463570 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d203fc44-5252-4dd2-98ae-66f9c139b5f5-internal-tls-certs\") pod \"keystone-6b4c54c5d9-pqd8r\" (UID: \"d203fc44-5252-4dd2-98ae-66f9c139b5f5\") " pod="openstack/keystone-6b4c54c5d9-pqd8r" Feb 20 10:14:46 crc kubenswrapper[4962]: I0220 10:14:46.464020 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d203fc44-5252-4dd2-98ae-66f9c139b5f5-public-tls-certs\") pod \"keystone-6b4c54c5d9-pqd8r\" (UID: \"d203fc44-5252-4dd2-98ae-66f9c139b5f5\") " pod="openstack/keystone-6b4c54c5d9-pqd8r" Feb 20 10:14:46 crc kubenswrapper[4962]: I0220 10:14:46.469199 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d203fc44-5252-4dd2-98ae-66f9c139b5f5-scripts\") pod \"keystone-6b4c54c5d9-pqd8r\" (UID: \"d203fc44-5252-4dd2-98ae-66f9c139b5f5\") " pod="openstack/keystone-6b4c54c5d9-pqd8r" Feb 20 10:14:46 crc kubenswrapper[4962]: I0220 10:14:46.473482 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d203fc44-5252-4dd2-98ae-66f9c139b5f5-combined-ca-bundle\") pod \"keystone-6b4c54c5d9-pqd8r\" (UID: \"d203fc44-5252-4dd2-98ae-66f9c139b5f5\") " pod="openstack/keystone-6b4c54c5d9-pqd8r" Feb 20 10:14:46 crc kubenswrapper[4962]: I0220 10:14:46.478046 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzvw9\" (UniqueName: \"kubernetes.io/projected/d203fc44-5252-4dd2-98ae-66f9c139b5f5-kube-api-access-gzvw9\") pod \"keystone-6b4c54c5d9-pqd8r\" (UID: \"d203fc44-5252-4dd2-98ae-66f9c139b5f5\") " pod="openstack/keystone-6b4c54c5d9-pqd8r" Feb 20 10:14:46 crc kubenswrapper[4962]: I0220 10:14:46.560364 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68bc8f6695-d6bm6" 
event={"ID":"857e020f-54d5-4980-90a2-f19d6f8b5008","Type":"ContainerDied","Data":"05f3668ddf59db31b6f76b94605c89fd97fdc7f2e57b881b4ff06bffb9a82723"} Feb 20 10:14:46 crc kubenswrapper[4962]: I0220 10:14:46.560460 4962 scope.go:117] "RemoveContainer" containerID="9cd5bd763453d1cd2f676217b16b284051d742fb526b5a0bebd50656b842e234" Feb 20 10:14:46 crc kubenswrapper[4962]: I0220 10:14:46.560387 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68bc8f6695-d6bm6" Feb 20 10:14:46 crc kubenswrapper[4962]: I0220 10:14:46.597196 4962 scope.go:117] "RemoveContainer" containerID="2ed06c9914b443038fe5a8020b56e2a2f1aa8bba18873866ee6a64f32e0d9f5e" Feb 20 10:14:46 crc kubenswrapper[4962]: I0220 10:14:46.621137 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6b4c54c5d9-pqd8r" Feb 20 10:14:46 crc kubenswrapper[4962]: I0220 10:14:46.627552 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68bc8f6695-d6bm6"] Feb 20 10:14:46 crc kubenswrapper[4962]: I0220 10:14:46.636889 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-68bc8f6695-d6bm6"] Feb 20 10:14:46 crc kubenswrapper[4962]: I0220 10:14:46.892877 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 20 10:14:47 crc kubenswrapper[4962]: I0220 10:14:47.155731 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="857e020f-54d5-4980-90a2-f19d6f8b5008" path="/var/lib/kubelet/pods/857e020f-54d5-4980-90a2-f19d6f8b5008/volumes" Feb 20 10:14:47 crc kubenswrapper[4962]: I0220 10:14:47.157234 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6b4c54c5d9-pqd8r"] Feb 20 10:14:47 crc kubenswrapper[4962]: I0220 10:14:47.587383 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6b4c54c5d9-pqd8r" event={"ID":"d203fc44-5252-4dd2-98ae-66f9c139b5f5","Type":"ContainerStarted","Data":"57e3b54a0aaa3e8886ac13c31c98adf640a3207944f14271a7e3dbd0e513db14"} Feb 20 10:14:47 crc kubenswrapper[4962]: I0220 10:14:47.587818 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6b4c54c5d9-pqd8r" event={"ID":"d203fc44-5252-4dd2-98ae-66f9c139b5f5","Type":"ContainerStarted","Data":"cf5b12fd788026ff0304070e27b4ebd505b31b0d0e831a4ccd6e51bd8bb0b383"} Feb 20 10:14:47 crc kubenswrapper[4962]: I0220 10:14:47.588205 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-6b4c54c5d9-pqd8r" Feb 20 10:14:47 crc kubenswrapper[4962]: I0220 10:14:47.609516 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-6b4c54c5d9-pqd8r" podStartSLOduration=1.609493982 podStartE2EDuration="1.609493982s" podCreationTimestamp="2026-02-20 10:14:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:14:47.607649335 +0000 UTC m=+1179.190121181" watchObservedRunningTime="2026-02-20 10:14:47.609493982 +0000 UTC m=+1179.191965828" Feb 20 10:14:48 crc kubenswrapper[4962]: I0220 10:14:48.619650 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-smcqr" event={"ID":"d970dac6-1948-42dd-b5d9-c5df1b04e30d","Type":"ContainerStarted","Data":"a45c4081d1cfd44304d7f3d8b40910079cb39e233843a73a0bea91a01d00d686"} Feb 20 10:14:48 crc kubenswrapper[4962]: I0220 10:14:48.643125 4962 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-smcqr" podStartSLOduration=2.703463313 podStartE2EDuration="38.643097592s" podCreationTimestamp="2026-02-20 10:14:10 +0000 UTC" firstStartedPulling="2026-02-20 10:14:11.683073854 +0000 UTC m=+1143.265545700" lastFinishedPulling="2026-02-20 10:14:47.622708133 +0000 UTC m=+1179.205179979" observedRunningTime="2026-02-20 10:14:48.637290901 +0000 UTC m=+1180.219762747" watchObservedRunningTime="2026-02-20 10:14:48.643097592 +0000 UTC m=+1180.225569438" Feb 20 10:14:49 crc kubenswrapper[4962]: I0220 10:14:49.190763 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 20 10:14:49 crc kubenswrapper[4962]: I0220 10:14:49.677506 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-s4qgr" event={"ID":"14c237ea-eb42-49d4-90db-ee57e3b560e3","Type":"ContainerStarted","Data":"ce2c059ffddb8a4bd817e0bdef157eb8b02fa711cf3898a972dc3c9f08da8952"} Feb 20 10:14:49 crc kubenswrapper[4962]: I0220 10:14:49.696327 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-s4qgr" podStartSLOduration=2.415795337 podStartE2EDuration="39.696295821s" podCreationTimestamp="2026-02-20 10:14:10 +0000 UTC" firstStartedPulling="2026-02-20 10:14:11.250803902 +0000 UTC m=+1142.833275748" lastFinishedPulling="2026-02-20 10:14:48.531304386 +0000 UTC m=+1180.113776232" observedRunningTime="2026-02-20 10:14:49.693922826 +0000 UTC m=+1181.276394672" watchObservedRunningTime="2026-02-20 10:14:49.696295821 +0000 UTC m=+1181.278767667" Feb 20 10:14:53 crc kubenswrapper[4962]: I0220 10:14:53.246811 4962 generic.go:334] "Generic (PLEG): container finished" podID="d970dac6-1948-42dd-b5d9-c5df1b04e30d" containerID="a45c4081d1cfd44304d7f3d8b40910079cb39e233843a73a0bea91a01d00d686" exitCode=0 Feb 20 10:14:53 crc kubenswrapper[4962]: I0220 10:14:53.248823 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-smcqr" event={"ID":"d970dac6-1948-42dd-b5d9-c5df1b04e30d","Type":"ContainerDied","Data":"a45c4081d1cfd44304d7f3d8b40910079cb39e233843a73a0bea91a01d00d686"} Feb 20 10:14:54 crc kubenswrapper[4962]: I0220 10:14:54.262562 4962 generic.go:334] "Generic (PLEG): container finished" podID="14c237ea-eb42-49d4-90db-ee57e3b560e3" containerID="ce2c059ffddb8a4bd817e0bdef157eb8b02fa711cf3898a972dc3c9f08da8952" exitCode=0 Feb 20 10:14:54 crc kubenswrapper[4962]: I0220 10:14:54.262725 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-s4qgr" event={"ID":"14c237ea-eb42-49d4-90db-ee57e3b560e3","Type":"ContainerDied","Data":"ce2c059ffddb8a4bd817e0bdef157eb8b02fa711cf3898a972dc3c9f08da8952"} Feb 20 10:14:55 crc kubenswrapper[4962]: I0220 10:14:55.705841 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-smcqr" Feb 20 10:14:55 crc kubenswrapper[4962]: I0220 10:14:55.713049 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-s4qgr" Feb 20 10:14:55 crc kubenswrapper[4962]: I0220 10:14:55.874086 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14c237ea-eb42-49d4-90db-ee57e3b560e3-config-data\") pod \"14c237ea-eb42-49d4-90db-ee57e3b560e3\" (UID: \"14c237ea-eb42-49d4-90db-ee57e3b560e3\") " Feb 20 10:14:55 crc kubenswrapper[4962]: I0220 10:14:55.874161 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/14c237ea-eb42-49d4-90db-ee57e3b560e3-db-sync-config-data\") pod \"14c237ea-eb42-49d4-90db-ee57e3b560e3\" (UID: \"14c237ea-eb42-49d4-90db-ee57e3b560e3\") " Feb 20 10:14:55 crc kubenswrapper[4962]: I0220 10:14:55.874207 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d970dac6-1948-42dd-b5d9-c5df1b04e30d-db-sync-config-data\") pod \"d970dac6-1948-42dd-b5d9-c5df1b04e30d\" (UID: \"d970dac6-1948-42dd-b5d9-c5df1b04e30d\") " Feb 20 10:14:55 crc kubenswrapper[4962]: I0220 10:14:55.874255 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14c237ea-eb42-49d4-90db-ee57e3b560e3-scripts\") pod \"14c237ea-eb42-49d4-90db-ee57e3b560e3\" (UID: \"14c237ea-eb42-49d4-90db-ee57e3b560e3\") " Feb 20 10:14:55 crc kubenswrapper[4962]: I0220 10:14:55.874316 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14c237ea-eb42-49d4-90db-ee57e3b560e3-combined-ca-bundle\") pod \"14c237ea-eb42-49d4-90db-ee57e3b560e3\" (UID: \"14c237ea-eb42-49d4-90db-ee57e3b560e3\") " Feb 20 10:14:55 crc kubenswrapper[4962]: I0220 10:14:55.874411 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wpd4w\" (UniqueName: \"kubernetes.io/projected/d970dac6-1948-42dd-b5d9-c5df1b04e30d-kube-api-access-wpd4w\") pod \"d970dac6-1948-42dd-b5d9-c5df1b04e30d\" (UID: \"d970dac6-1948-42dd-b5d9-c5df1b04e30d\") " Feb 20 10:14:55 crc kubenswrapper[4962]: I0220 10:14:55.874439 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frwwh\" (UniqueName: \"kubernetes.io/projected/14c237ea-eb42-49d4-90db-ee57e3b560e3-kube-api-access-frwwh\") pod \"14c237ea-eb42-49d4-90db-ee57e3b560e3\" (UID: \"14c237ea-eb42-49d4-90db-ee57e3b560e3\") " Feb 20 10:14:55 crc kubenswrapper[4962]: I0220 10:14:55.874459 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/14c237ea-eb42-49d4-90db-ee57e3b560e3-etc-machine-id\") pod \"14c237ea-eb42-49d4-90db-ee57e3b560e3\" (UID: \"14c237ea-eb42-49d4-90db-ee57e3b560e3\") " Feb 20 10:14:55 crc kubenswrapper[4962]: I0220 10:14:55.874500 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d970dac6-1948-42dd-b5d9-c5df1b04e30d-combined-ca-bundle\") pod \"d970dac6-1948-42dd-b5d9-c5df1b04e30d\" (UID: \"d970dac6-1948-42dd-b5d9-c5df1b04e30d\") " Feb 20 10:14:55 crc kubenswrapper[4962]: I0220 10:14:55.875039 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/14c237ea-eb42-49d4-90db-ee57e3b560e3-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod 
"14c237ea-eb42-49d4-90db-ee57e3b560e3" (UID: "14c237ea-eb42-49d4-90db-ee57e3b560e3"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 10:14:55 crc kubenswrapper[4962]: I0220 10:14:55.879070 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14c237ea-eb42-49d4-90db-ee57e3b560e3-scripts" (OuterVolumeSpecName: "scripts") pod "14c237ea-eb42-49d4-90db-ee57e3b560e3" (UID: "14c237ea-eb42-49d4-90db-ee57e3b560e3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:14:55 crc kubenswrapper[4962]: I0220 10:14:55.880063 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14c237ea-eb42-49d4-90db-ee57e3b560e3-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "14c237ea-eb42-49d4-90db-ee57e3b560e3" (UID: "14c237ea-eb42-49d4-90db-ee57e3b560e3"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:14:55 crc kubenswrapper[4962]: I0220 10:14:55.881834 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14c237ea-eb42-49d4-90db-ee57e3b560e3-kube-api-access-frwwh" (OuterVolumeSpecName: "kube-api-access-frwwh") pod "14c237ea-eb42-49d4-90db-ee57e3b560e3" (UID: "14c237ea-eb42-49d4-90db-ee57e3b560e3"). InnerVolumeSpecName "kube-api-access-frwwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:14:55 crc kubenswrapper[4962]: I0220 10:14:55.882439 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d970dac6-1948-42dd-b5d9-c5df1b04e30d-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "d970dac6-1948-42dd-b5d9-c5df1b04e30d" (UID: "d970dac6-1948-42dd-b5d9-c5df1b04e30d"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:14:55 crc kubenswrapper[4962]: I0220 10:14:55.882971 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d970dac6-1948-42dd-b5d9-c5df1b04e30d-kube-api-access-wpd4w" (OuterVolumeSpecName: "kube-api-access-wpd4w") pod "d970dac6-1948-42dd-b5d9-c5df1b04e30d" (UID: "d970dac6-1948-42dd-b5d9-c5df1b04e30d"). InnerVolumeSpecName "kube-api-access-wpd4w". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:14:55 crc kubenswrapper[4962]: I0220 10:14:55.903904 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14c237ea-eb42-49d4-90db-ee57e3b560e3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "14c237ea-eb42-49d4-90db-ee57e3b560e3" (UID: "14c237ea-eb42-49d4-90db-ee57e3b560e3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:14:55 crc kubenswrapper[4962]: I0220 10:14:55.912916 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d970dac6-1948-42dd-b5d9-c5df1b04e30d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d970dac6-1948-42dd-b5d9-c5df1b04e30d" (UID: "d970dac6-1948-42dd-b5d9-c5df1b04e30d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:14:55 crc kubenswrapper[4962]: E0220 10:14:55.936122 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="f95ae2eb-8d20-4549-896d-e6991bfd1e06" Feb 20 10:14:55 crc kubenswrapper[4962]: I0220 10:14:55.958848 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14c237ea-eb42-49d4-90db-ee57e3b560e3-config-data" (OuterVolumeSpecName: "config-data") pod "14c237ea-eb42-49d4-90db-ee57e3b560e3" (UID: "14c237ea-eb42-49d4-90db-ee57e3b560e3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:14:55 crc kubenswrapper[4962]: I0220 10:14:55.978705 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wpd4w\" (UniqueName: \"kubernetes.io/projected/d970dac6-1948-42dd-b5d9-c5df1b04e30d-kube-api-access-wpd4w\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:55 crc kubenswrapper[4962]: I0220 10:14:55.978749 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frwwh\" (UniqueName: \"kubernetes.io/projected/14c237ea-eb42-49d4-90db-ee57e3b560e3-kube-api-access-frwwh\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:55 crc kubenswrapper[4962]: I0220 10:14:55.978767 4962 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/14c237ea-eb42-49d4-90db-ee57e3b560e3-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:55 crc kubenswrapper[4962]: I0220 10:14:55.978783 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d970dac6-1948-42dd-b5d9-c5df1b04e30d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:55 crc kubenswrapper[4962]: I0220 10:14:55.978799 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14c237ea-eb42-49d4-90db-ee57e3b560e3-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:55 crc kubenswrapper[4962]: I0220 10:14:55.978813 4962 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/14c237ea-eb42-49d4-90db-ee57e3b560e3-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:55 crc kubenswrapper[4962]: I0220 10:14:55.978825 4962 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d970dac6-1948-42dd-b5d9-c5df1b04e30d-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:55 crc kubenswrapper[4962]: I0220 10:14:55.978835 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14c237ea-eb42-49d4-90db-ee57e3b560e3-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:55 crc kubenswrapper[4962]: I0220 10:14:55.978845 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14c237ea-eb42-49d4-90db-ee57e3b560e3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.289185 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-smcqr" 
event={"ID":"d970dac6-1948-42dd-b5d9-c5df1b04e30d","Type":"ContainerDied","Data":"e9a2fb5aa6c019bc1ceb09e0e16ebaf81860fd8433b5fa16ea5575cfba68806b"} Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.289245 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-smcqr" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.289272 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e9a2fb5aa6c019bc1ceb09e0e16ebaf81860fd8433b5fa16ea5575cfba68806b" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.293662 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-s4qgr" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.293670 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-s4qgr" event={"ID":"14c237ea-eb42-49d4-90db-ee57e3b560e3","Type":"ContainerDied","Data":"56b25d7c906db1005eebceb9a0a6f02f1965ac71cc6b3cb440b4767a03118405"} Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.293723 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="56b25d7c906db1005eebceb9a0a6f02f1965ac71cc6b3cb440b4767a03118405" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.296920 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f95ae2eb-8d20-4549-896d-e6991bfd1e06","Type":"ContainerStarted","Data":"64b0d198767190eec74d90a8a078975196c61e7b45117096eb7ad0a73bc18e8b"} Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.297193 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f95ae2eb-8d20-4549-896d-e6991bfd1e06" containerName="ceilometer-notification-agent" containerID="cri-o://c820ae4ce289a934f94f300bfd5be2c53a94a21d5b0d1a615bd01eca018a9cca" gracePeriod=30 Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.297862 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.297994 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f95ae2eb-8d20-4549-896d-e6991bfd1e06" containerName="sg-core" containerID="cri-o://3c6ea7a933fafcb8f42d0dff736a28242f018ae0cea296d99186f98ada602b85" gracePeriod=30 Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.298063 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f95ae2eb-8d20-4549-896d-e6991bfd1e06" containerName="proxy-httpd" containerID="cri-o://64b0d198767190eec74d90a8a078975196c61e7b45117096eb7ad0a73bc18e8b" gracePeriod=30 Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.695786 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 20 10:14:56 crc kubenswrapper[4962]: E0220 10:14:56.696497 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14c237ea-eb42-49d4-90db-ee57e3b560e3" containerName="cinder-db-sync" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.696512 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="14c237ea-eb42-49d4-90db-ee57e3b560e3" containerName="cinder-db-sync" Feb 20 10:14:56 crc kubenswrapper[4962]: E0220 10:14:56.696532 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d970dac6-1948-42dd-b5d9-c5df1b04e30d" containerName="barbican-db-sync" Feb 20 10:14:56 crc kubenswrapper[4962]: 
I0220 10:14:56.696539 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="d970dac6-1948-42dd-b5d9-c5df1b04e30d" containerName="barbican-db-sync" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.696738 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="14c237ea-eb42-49d4-90db-ee57e3b560e3" containerName="cinder-db-sync" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.696789 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="d970dac6-1948-42dd-b5d9-c5df1b04e30d" containerName="barbican-db-sync" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.698212 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.702427 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-87fmw" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.715498 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.715736 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.715858 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.723100 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6c549fb5d5-4c9w4"] Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.724951 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c549fb5d5-4c9w4" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.752392 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.770097 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c549fb5d5-4c9w4"] Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.807696 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51d56dfc-4e59-4c3d-b26d-a06301f274c8-scripts\") pod \"cinder-scheduler-0\" (UID: \"51d56dfc-4e59-4c3d-b26d-a06301f274c8\") " pod="openstack/cinder-scheduler-0" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.807760 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8794235c-580c-4874-94c2-3b28620e3fdb-ovsdbserver-nb\") pod \"dnsmasq-dns-6c549fb5d5-4c9w4\" (UID: \"8794235c-580c-4874-94c2-3b28620e3fdb\") " pod="openstack/dnsmasq-dns-6c549fb5d5-4c9w4" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.807897 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51d56dfc-4e59-4c3d-b26d-a06301f274c8-config-data\") pod \"cinder-scheduler-0\" (UID: \"51d56dfc-4e59-4c3d-b26d-a06301f274c8\") " pod="openstack/cinder-scheduler-0" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.807949 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51d56dfc-4e59-4c3d-b26d-a06301f274c8-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: 
\"51d56dfc-4e59-4c3d-b26d-a06301f274c8\") " pod="openstack/cinder-scheduler-0" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.807971 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8794235c-580c-4874-94c2-3b28620e3fdb-config\") pod \"dnsmasq-dns-6c549fb5d5-4c9w4\" (UID: \"8794235c-580c-4874-94c2-3b28620e3fdb\") " pod="openstack/dnsmasq-dns-6c549fb5d5-4c9w4" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.808019 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwhpz\" (UniqueName: \"kubernetes.io/projected/8794235c-580c-4874-94c2-3b28620e3fdb-kube-api-access-nwhpz\") pod \"dnsmasq-dns-6c549fb5d5-4c9w4\" (UID: \"8794235c-580c-4874-94c2-3b28620e3fdb\") " pod="openstack/dnsmasq-dns-6c549fb5d5-4c9w4" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.808053 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/51d56dfc-4e59-4c3d-b26d-a06301f274c8-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"51d56dfc-4e59-4c3d-b26d-a06301f274c8\") " pod="openstack/cinder-scheduler-0" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.808073 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8794235c-580c-4874-94c2-3b28620e3fdb-ovsdbserver-sb\") pod \"dnsmasq-dns-6c549fb5d5-4c9w4\" (UID: \"8794235c-580c-4874-94c2-3b28620e3fdb\") " pod="openstack/dnsmasq-dns-6c549fb5d5-4c9w4" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.808111 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/51d56dfc-4e59-4c3d-b26d-a06301f274c8-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"51d56dfc-4e59-4c3d-b26d-a06301f274c8\") " pod="openstack/cinder-scheduler-0" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.808138 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrsbl\" (UniqueName: \"kubernetes.io/projected/51d56dfc-4e59-4c3d-b26d-a06301f274c8-kube-api-access-mrsbl\") pod \"cinder-scheduler-0\" (UID: \"51d56dfc-4e59-4c3d-b26d-a06301f274c8\") " pod="openstack/cinder-scheduler-0" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.808156 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8794235c-580c-4874-94c2-3b28620e3fdb-dns-svc\") pod \"dnsmasq-dns-6c549fb5d5-4c9w4\" (UID: \"8794235c-580c-4874-94c2-3b28620e3fdb\") " pod="openstack/dnsmasq-dns-6c549fb5d5-4c9w4" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.808189 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8794235c-580c-4874-94c2-3b28620e3fdb-dns-swift-storage-0\") pod \"dnsmasq-dns-6c549fb5d5-4c9w4\" (UID: \"8794235c-580c-4874-94c2-3b28620e3fdb\") " pod="openstack/dnsmasq-dns-6c549fb5d5-4c9w4" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.909331 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwhpz\" (UniqueName: 
\"kubernetes.io/projected/8794235c-580c-4874-94c2-3b28620e3fdb-kube-api-access-nwhpz\") pod \"dnsmasq-dns-6c549fb5d5-4c9w4\" (UID: \"8794235c-580c-4874-94c2-3b28620e3fdb\") " pod="openstack/dnsmasq-dns-6c549fb5d5-4c9w4" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.909405 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/51d56dfc-4e59-4c3d-b26d-a06301f274c8-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"51d56dfc-4e59-4c3d-b26d-a06301f274c8\") " pod="openstack/cinder-scheduler-0" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.909435 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8794235c-580c-4874-94c2-3b28620e3fdb-ovsdbserver-sb\") pod \"dnsmasq-dns-6c549fb5d5-4c9w4\" (UID: \"8794235c-580c-4874-94c2-3b28620e3fdb\") " pod="openstack/dnsmasq-dns-6c549fb5d5-4c9w4" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.909469 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/51d56dfc-4e59-4c3d-b26d-a06301f274c8-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"51d56dfc-4e59-4c3d-b26d-a06301f274c8\") " pod="openstack/cinder-scheduler-0" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.909496 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrsbl\" (UniqueName: \"kubernetes.io/projected/51d56dfc-4e59-4c3d-b26d-a06301f274c8-kube-api-access-mrsbl\") pod \"cinder-scheduler-0\" (UID: \"51d56dfc-4e59-4c3d-b26d-a06301f274c8\") " pod="openstack/cinder-scheduler-0" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.909513 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8794235c-580c-4874-94c2-3b28620e3fdb-dns-svc\") pod \"dnsmasq-dns-6c549fb5d5-4c9w4\" (UID: \"8794235c-580c-4874-94c2-3b28620e3fdb\") " pod="openstack/dnsmasq-dns-6c549fb5d5-4c9w4" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.909545 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8794235c-580c-4874-94c2-3b28620e3fdb-dns-swift-storage-0\") pod \"dnsmasq-dns-6c549fb5d5-4c9w4\" (UID: \"8794235c-580c-4874-94c2-3b28620e3fdb\") " pod="openstack/dnsmasq-dns-6c549fb5d5-4c9w4" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.909569 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51d56dfc-4e59-4c3d-b26d-a06301f274c8-scripts\") pod \"cinder-scheduler-0\" (UID: \"51d56dfc-4e59-4c3d-b26d-a06301f274c8\") " pod="openstack/cinder-scheduler-0" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.909705 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8794235c-580c-4874-94c2-3b28620e3fdb-ovsdbserver-nb\") pod \"dnsmasq-dns-6c549fb5d5-4c9w4\" (UID: \"8794235c-580c-4874-94c2-3b28620e3fdb\") " pod="openstack/dnsmasq-dns-6c549fb5d5-4c9w4" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.909758 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51d56dfc-4e59-4c3d-b26d-a06301f274c8-config-data\") pod \"cinder-scheduler-0\" (UID: 
\"51d56dfc-4e59-4c3d-b26d-a06301f274c8\") " pod="openstack/cinder-scheduler-0" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.909790 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51d56dfc-4e59-4c3d-b26d-a06301f274c8-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"51d56dfc-4e59-4c3d-b26d-a06301f274c8\") " pod="openstack/cinder-scheduler-0" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.909805 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8794235c-580c-4874-94c2-3b28620e3fdb-config\") pod \"dnsmasq-dns-6c549fb5d5-4c9w4\" (UID: \"8794235c-580c-4874-94c2-3b28620e3fdb\") " pod="openstack/dnsmasq-dns-6c549fb5d5-4c9w4" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.910471 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8794235c-580c-4874-94c2-3b28620e3fdb-ovsdbserver-sb\") pod \"dnsmasq-dns-6c549fb5d5-4c9w4\" (UID: \"8794235c-580c-4874-94c2-3b28620e3fdb\") " pod="openstack/dnsmasq-dns-6c549fb5d5-4c9w4" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.910571 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8794235c-580c-4874-94c2-3b28620e3fdb-dns-svc\") pod \"dnsmasq-dns-6c549fb5d5-4c9w4\" (UID: \"8794235c-580c-4874-94c2-3b28620e3fdb\") " pod="openstack/dnsmasq-dns-6c549fb5d5-4c9w4" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.910780 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/51d56dfc-4e59-4c3d-b26d-a06301f274c8-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"51d56dfc-4e59-4c3d-b26d-a06301f274c8\") " pod="openstack/cinder-scheduler-0" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.918582 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51d56dfc-4e59-4c3d-b26d-a06301f274c8-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"51d56dfc-4e59-4c3d-b26d-a06301f274c8\") " pod="openstack/cinder-scheduler-0" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.919885 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8794235c-580c-4874-94c2-3b28620e3fdb-dns-swift-storage-0\") pod \"dnsmasq-dns-6c549fb5d5-4c9w4\" (UID: \"8794235c-580c-4874-94c2-3b28620e3fdb\") " pod="openstack/dnsmasq-dns-6c549fb5d5-4c9w4" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.920352 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51d56dfc-4e59-4c3d-b26d-a06301f274c8-config-data\") pod \"cinder-scheduler-0\" (UID: \"51d56dfc-4e59-4c3d-b26d-a06301f274c8\") " pod="openstack/cinder-scheduler-0" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.929920 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/51d56dfc-4e59-4c3d-b26d-a06301f274c8-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"51d56dfc-4e59-4c3d-b26d-a06301f274c8\") " pod="openstack/cinder-scheduler-0" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.930279 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/51d56dfc-4e59-4c3d-b26d-a06301f274c8-scripts\") pod \"cinder-scheduler-0\" (UID: \"51d56dfc-4e59-4c3d-b26d-a06301f274c8\") " pod="openstack/cinder-scheduler-0" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.930686 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrsbl\" (UniqueName: \"kubernetes.io/projected/51d56dfc-4e59-4c3d-b26d-a06301f274c8-kube-api-access-mrsbl\") pod \"cinder-scheduler-0\" (UID: \"51d56dfc-4e59-4c3d-b26d-a06301f274c8\") " pod="openstack/cinder-scheduler-0" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.937576 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8794235c-580c-4874-94c2-3b28620e3fdb-config\") pod \"dnsmasq-dns-6c549fb5d5-4c9w4\" (UID: \"8794235c-580c-4874-94c2-3b28620e3fdb\") " pod="openstack/dnsmasq-dns-6c549fb5d5-4c9w4" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.938344 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8794235c-580c-4874-94c2-3b28620e3fdb-ovsdbserver-nb\") pod \"dnsmasq-dns-6c549fb5d5-4c9w4\" (UID: \"8794235c-580c-4874-94c2-3b28620e3fdb\") " pod="openstack/dnsmasq-dns-6c549fb5d5-4c9w4" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.948388 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwhpz\" (UniqueName: \"kubernetes.io/projected/8794235c-580c-4874-94c2-3b28620e3fdb-kube-api-access-nwhpz\") pod \"dnsmasq-dns-6c549fb5d5-4c9w4\" (UID: \"8794235c-580c-4874-94c2-3b28620e3fdb\") " pod="openstack/dnsmasq-dns-6c549fb5d5-4c9w4" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.981835 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-6b8479d945-8wsh9"] Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.983501 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-6b8479d945-8wsh9" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.991892 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.992040 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-bt79l" Feb 20 10:14:56 crc kubenswrapper[4962]: I0220 10:14:56.992299 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.011326 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f35bada-015d-4051-9976-d5dfe3a93216-logs\") pod \"barbican-worker-6b8479d945-8wsh9\" (UID: \"7f35bada-015d-4051-9976-d5dfe3a93216\") " pod="openstack/barbican-worker-6b8479d945-8wsh9" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.011408 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f35bada-015d-4051-9976-d5dfe3a93216-combined-ca-bundle\") pod \"barbican-worker-6b8479d945-8wsh9\" (UID: \"7f35bada-015d-4051-9976-d5dfe3a93216\") " pod="openstack/barbican-worker-6b8479d945-8wsh9" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.011444 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7f35bada-015d-4051-9976-d5dfe3a93216-config-data-custom\") pod \"barbican-worker-6b8479d945-8wsh9\" (UID: \"7f35bada-015d-4051-9976-d5dfe3a93216\") " pod="openstack/barbican-worker-6b8479d945-8wsh9" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.011554 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mbmz\" (UniqueName: \"kubernetes.io/projected/7f35bada-015d-4051-9976-d5dfe3a93216-kube-api-access-6mbmz\") pod \"barbican-worker-6b8479d945-8wsh9\" (UID: \"7f35bada-015d-4051-9976-d5dfe3a93216\") " pod="openstack/barbican-worker-6b8479d945-8wsh9" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.011622 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f35bada-015d-4051-9976-d5dfe3a93216-config-data\") pod \"barbican-worker-6b8479d945-8wsh9\" (UID: \"7f35bada-015d-4051-9976-d5dfe3a93216\") " pod="openstack/barbican-worker-6b8479d945-8wsh9" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.016694 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6b8479d945-8wsh9"] Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.022404 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.037657 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-569d5979d6-xzr2q"] Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.052299 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c549fb5d5-4c9w4" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.054464 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-569d5979d6-xzr2q" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.062752 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.073970 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-569d5979d6-xzr2q"] Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.116197 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7f35bada-015d-4051-9976-d5dfe3a93216-config-data-custom\") pod \"barbican-worker-6b8479d945-8wsh9\" (UID: \"7f35bada-015d-4051-9976-d5dfe3a93216\") " pod="openstack/barbican-worker-6b8479d945-8wsh9" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.116287 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28437fcd-377a-4b9e-9a28-e01c21e2ad1f-logs\") pod \"barbican-keystone-listener-569d5979d6-xzr2q\" (UID: \"28437fcd-377a-4b9e-9a28-e01c21e2ad1f\") " pod="openstack/barbican-keystone-listener-569d5979d6-xzr2q" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.116326 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mbmz\" (UniqueName: \"kubernetes.io/projected/7f35bada-015d-4051-9976-d5dfe3a93216-kube-api-access-6mbmz\") pod \"barbican-worker-6b8479d945-8wsh9\" (UID: \"7f35bada-015d-4051-9976-d5dfe3a93216\") " pod="openstack/barbican-worker-6b8479d945-8wsh9" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.116348 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/28437fcd-377a-4b9e-9a28-e01c21e2ad1f-config-data-custom\") pod \"barbican-keystone-listener-569d5979d6-xzr2q\" (UID: \"28437fcd-377a-4b9e-9a28-e01c21e2ad1f\") " pod="openstack/barbican-keystone-listener-569d5979d6-xzr2q" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.116379 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28437fcd-377a-4b9e-9a28-e01c21e2ad1f-combined-ca-bundle\") pod \"barbican-keystone-listener-569d5979d6-xzr2q\" (UID: \"28437fcd-377a-4b9e-9a28-e01c21e2ad1f\") " pod="openstack/barbican-keystone-listener-569d5979d6-xzr2q" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.116401 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f35bada-015d-4051-9976-d5dfe3a93216-config-data\") pod \"barbican-worker-6b8479d945-8wsh9\" (UID: \"7f35bada-015d-4051-9976-d5dfe3a93216\") " pod="openstack/barbican-worker-6b8479d945-8wsh9" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.116430 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v96c7\" (UniqueName: \"kubernetes.io/projected/28437fcd-377a-4b9e-9a28-e01c21e2ad1f-kube-api-access-v96c7\") pod \"barbican-keystone-listener-569d5979d6-xzr2q\" (UID: \"28437fcd-377a-4b9e-9a28-e01c21e2ad1f\") " pod="openstack/barbican-keystone-listener-569d5979d6-xzr2q" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.116473 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f35bada-015d-4051-9976-d5dfe3a93216-logs\") pod \"barbican-worker-6b8479d945-8wsh9\" (UID: \"7f35bada-015d-4051-9976-d5dfe3a93216\") " pod="openstack/barbican-worker-6b8479d945-8wsh9" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.116495 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28437fcd-377a-4b9e-9a28-e01c21e2ad1f-config-data\") pod \"barbican-keystone-listener-569d5979d6-xzr2q\" (UID: \"28437fcd-377a-4b9e-9a28-e01c21e2ad1f\") " pod="openstack/barbican-keystone-listener-569d5979d6-xzr2q" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.116514 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f35bada-015d-4051-9976-d5dfe3a93216-combined-ca-bundle\") pod \"barbican-worker-6b8479d945-8wsh9\" (UID: \"7f35bada-015d-4051-9976-d5dfe3a93216\") " pod="openstack/barbican-worker-6b8479d945-8wsh9" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.151220 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f35bada-015d-4051-9976-d5dfe3a93216-logs\") pod \"barbican-worker-6b8479d945-8wsh9\" (UID: \"7f35bada-015d-4051-9976-d5dfe3a93216\") " pod="openstack/barbican-worker-6b8479d945-8wsh9" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.151862 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f35bada-015d-4051-9976-d5dfe3a93216-combined-ca-bundle\") pod \"barbican-worker-6b8479d945-8wsh9\" (UID: \"7f35bada-015d-4051-9976-d5dfe3a93216\") " pod="openstack/barbican-worker-6b8479d945-8wsh9" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.152332 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7f35bada-015d-4051-9976-d5dfe3a93216-config-data-custom\") pod \"barbican-worker-6b8479d945-8wsh9\" (UID: \"7f35bada-015d-4051-9976-d5dfe3a93216\") " pod="openstack/barbican-worker-6b8479d945-8wsh9" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.168545 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f35bada-015d-4051-9976-d5dfe3a93216-config-data\") pod \"barbican-worker-6b8479d945-8wsh9\" (UID: \"7f35bada-015d-4051-9976-d5dfe3a93216\") " pod="openstack/barbican-worker-6b8479d945-8wsh9" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.184406 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mbmz\" (UniqueName: \"kubernetes.io/projected/7f35bada-015d-4051-9976-d5dfe3a93216-kube-api-access-6mbmz\") pod \"barbican-worker-6b8479d945-8wsh9\" (UID: \"7f35bada-015d-4051-9976-d5dfe3a93216\") " pod="openstack/barbican-worker-6b8479d945-8wsh9" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.253400 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28437fcd-377a-4b9e-9a28-e01c21e2ad1f-logs\") pod \"barbican-keystone-listener-569d5979d6-xzr2q\" (UID: \"28437fcd-377a-4b9e-9a28-e01c21e2ad1f\") " pod="openstack/barbican-keystone-listener-569d5979d6-xzr2q" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.255202 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/28437fcd-377a-4b9e-9a28-e01c21e2ad1f-config-data-custom\") pod \"barbican-keystone-listener-569d5979d6-xzr2q\" (UID: \"28437fcd-377a-4b9e-9a28-e01c21e2ad1f\") " pod="openstack/barbican-keystone-listener-569d5979d6-xzr2q" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.255322 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28437fcd-377a-4b9e-9a28-e01c21e2ad1f-combined-ca-bundle\") pod \"barbican-keystone-listener-569d5979d6-xzr2q\" (UID: \"28437fcd-377a-4b9e-9a28-e01c21e2ad1f\") " pod="openstack/barbican-keystone-listener-569d5979d6-xzr2q" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.255399 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v96c7\" (UniqueName: \"kubernetes.io/projected/28437fcd-377a-4b9e-9a28-e01c21e2ad1f-kube-api-access-v96c7\") pod \"barbican-keystone-listener-569d5979d6-xzr2q\" (UID: \"28437fcd-377a-4b9e-9a28-e01c21e2ad1f\") " pod="openstack/barbican-keystone-listener-569d5979d6-xzr2q" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.255628 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28437fcd-377a-4b9e-9a28-e01c21e2ad1f-config-data\") pod \"barbican-keystone-listener-569d5979d6-xzr2q\" (UID: \"28437fcd-377a-4b9e-9a28-e01c21e2ad1f\") " pod="openstack/barbican-keystone-listener-569d5979d6-xzr2q" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.254940 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28437fcd-377a-4b9e-9a28-e01c21e2ad1f-logs\") pod \"barbican-keystone-listener-569d5979d6-xzr2q\" (UID: \"28437fcd-377a-4b9e-9a28-e01c21e2ad1f\") " pod="openstack/barbican-keystone-listener-569d5979d6-xzr2q" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.274207 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/28437fcd-377a-4b9e-9a28-e01c21e2ad1f-config-data-custom\") pod \"barbican-keystone-listener-569d5979d6-xzr2q\" (UID: \"28437fcd-377a-4b9e-9a28-e01c21e2ad1f\") " pod="openstack/barbican-keystone-listener-569d5979d6-xzr2q" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.284660 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28437fcd-377a-4b9e-9a28-e01c21e2ad1f-combined-ca-bundle\") pod \"barbican-keystone-listener-569d5979d6-xzr2q\" (UID: \"28437fcd-377a-4b9e-9a28-e01c21e2ad1f\") " pod="openstack/barbican-keystone-listener-569d5979d6-xzr2q" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.304940 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28437fcd-377a-4b9e-9a28-e01c21e2ad1f-config-data\") pod \"barbican-keystone-listener-569d5979d6-xzr2q\" (UID: \"28437fcd-377a-4b9e-9a28-e01c21e2ad1f\") " pod="openstack/barbican-keystone-listener-569d5979d6-xzr2q" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.307863 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.310554 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c549fb5d5-4c9w4"] Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.310711 4962 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.313894 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.315525 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v96c7\" (UniqueName: \"kubernetes.io/projected/28437fcd-377a-4b9e-9a28-e01c21e2ad1f-kube-api-access-v96c7\") pod \"barbican-keystone-listener-569d5979d6-xzr2q\" (UID: \"28437fcd-377a-4b9e-9a28-e01c21e2ad1f\") " pod="openstack/barbican-keystone-listener-569d5979d6-xzr2q" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.353388 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6b8479d945-8wsh9" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.358895 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a\") " pod="openstack/cinder-api-0" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.358951 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a-config-data\") pod \"cinder-api-0\" (UID: \"ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a\") " pod="openstack/cinder-api-0" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.358996 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a-logs\") pod \"cinder-api-0\" (UID: \"ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a\") " pod="openstack/cinder-api-0" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.359043 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a\") " pod="openstack/cinder-api-0" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.359066 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpkv2\" (UniqueName: \"kubernetes.io/projected/ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a-kube-api-access-fpkv2\") pod \"cinder-api-0\" (UID: \"ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a\") " pod="openstack/cinder-api-0" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.359252 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a-scripts\") pod \"cinder-api-0\" (UID: \"ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a\") " pod="openstack/cinder-api-0" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.359271 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a-config-data-custom\") pod \"cinder-api-0\" (UID: \"ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a\") " pod="openstack/cinder-api-0" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.380417 
4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.384399 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6c69c79c7f-fkwk8"] Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.386480 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c69c79c7f-fkwk8" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.390257 4962 generic.go:334] "Generic (PLEG): container finished" podID="f95ae2eb-8d20-4549-896d-e6991bfd1e06" containerID="64b0d198767190eec74d90a8a078975196c61e7b45117096eb7ad0a73bc18e8b" exitCode=0 Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.390289 4962 generic.go:334] "Generic (PLEG): container finished" podID="f95ae2eb-8d20-4549-896d-e6991bfd1e06" containerID="3c6ea7a933fafcb8f42d0dff736a28242f018ae0cea296d99186f98ada602b85" exitCode=2 Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.390309 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f95ae2eb-8d20-4549-896d-e6991bfd1e06","Type":"ContainerDied","Data":"64b0d198767190eec74d90a8a078975196c61e7b45117096eb7ad0a73bc18e8b"} Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.390331 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f95ae2eb-8d20-4549-896d-e6991bfd1e06","Type":"ContainerDied","Data":"3c6ea7a933fafcb8f42d0dff736a28242f018ae0cea296d99186f98ada602b85"} Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.394571 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c69c79c7f-fkwk8"] Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.412450 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-547b9d9588-5gkt7"] Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.419740 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-547b9d9588-5gkt7" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.422876 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.429275 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-547b9d9588-5gkt7"] Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.461176 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a-config-data-custom\") pod \"cinder-api-0\" (UID: \"ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a\") " pod="openstack/cinder-api-0" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.461222 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2320213f-c3b3-4074-95f9-ad86446193b3-combined-ca-bundle\") pod \"barbican-api-547b9d9588-5gkt7\" (UID: \"2320213f-c3b3-4074-95f9-ad86446193b3\") " pod="openstack/barbican-api-547b9d9588-5gkt7" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.461244 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b97f91e3-f497-47ad-8d3d-f9945b3bdc34-dns-svc\") pod \"dnsmasq-dns-6c69c79c7f-fkwk8\" (UID: \"b97f91e3-f497-47ad-8d3d-f9945b3bdc34\") " pod="openstack/dnsmasq-dns-6c69c79c7f-fkwk8" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.461451 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a-scripts\") pod \"cinder-api-0\" (UID: \"ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a\") " pod="openstack/cinder-api-0" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.461477 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54bp5\" (UniqueName: \"kubernetes.io/projected/b97f91e3-f497-47ad-8d3d-f9945b3bdc34-kube-api-access-54bp5\") pod \"dnsmasq-dns-6c69c79c7f-fkwk8\" (UID: \"b97f91e3-f497-47ad-8d3d-f9945b3bdc34\") " pod="openstack/dnsmasq-dns-6c69c79c7f-fkwk8" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.462541 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b97f91e3-f497-47ad-8d3d-f9945b3bdc34-config\") pod \"dnsmasq-dns-6c69c79c7f-fkwk8\" (UID: \"b97f91e3-f497-47ad-8d3d-f9945b3bdc34\") " pod="openstack/dnsmasq-dns-6c69c79c7f-fkwk8" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.462623 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a\") " pod="openstack/cinder-api-0" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.462658 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a-config-data\") pod \"cinder-api-0\" (UID: \"ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a\") " pod="openstack/cinder-api-0" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.462704 4962 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2320213f-c3b3-4074-95f9-ad86446193b3-config-data\") pod \"barbican-api-547b9d9588-5gkt7\" (UID: \"2320213f-c3b3-4074-95f9-ad86446193b3\") " pod="openstack/barbican-api-547b9d9588-5gkt7" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.462738 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a\") " pod="openstack/cinder-api-0" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.463166 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a-logs\") pod \"cinder-api-0\" (UID: \"ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a\") " pod="openstack/cinder-api-0" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.463314 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2320213f-c3b3-4074-95f9-ad86446193b3-config-data-custom\") pod \"barbican-api-547b9d9588-5gkt7\" (UID: \"2320213f-c3b3-4074-95f9-ad86446193b3\") " pod="openstack/barbican-api-547b9d9588-5gkt7" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.463372 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a\") " pod="openstack/cinder-api-0" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.463451 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvtmd\" (UniqueName: \"kubernetes.io/projected/2320213f-c3b3-4074-95f9-ad86446193b3-kube-api-access-pvtmd\") pod \"barbican-api-547b9d9588-5gkt7\" (UID: \"2320213f-c3b3-4074-95f9-ad86446193b3\") " pod="openstack/barbican-api-547b9d9588-5gkt7" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.463492 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpkv2\" (UniqueName: \"kubernetes.io/projected/ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a-kube-api-access-fpkv2\") pod \"cinder-api-0\" (UID: \"ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a\") " pod="openstack/cinder-api-0" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.463521 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b97f91e3-f497-47ad-8d3d-f9945b3bdc34-ovsdbserver-sb\") pod \"dnsmasq-dns-6c69c79c7f-fkwk8\" (UID: \"b97f91e3-f497-47ad-8d3d-f9945b3bdc34\") " pod="openstack/dnsmasq-dns-6c69c79c7f-fkwk8" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.463602 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2320213f-c3b3-4074-95f9-ad86446193b3-logs\") pod \"barbican-api-547b9d9588-5gkt7\" (UID: \"2320213f-c3b3-4074-95f9-ad86446193b3\") " pod="openstack/barbican-api-547b9d9588-5gkt7" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.463687 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/b97f91e3-f497-47ad-8d3d-f9945b3bdc34-ovsdbserver-nb\") pod \"dnsmasq-dns-6c69c79c7f-fkwk8\" (UID: \"b97f91e3-f497-47ad-8d3d-f9945b3bdc34\") " pod="openstack/dnsmasq-dns-6c69c79c7f-fkwk8" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.463759 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b97f91e3-f497-47ad-8d3d-f9945b3bdc34-dns-swift-storage-0\") pod \"dnsmasq-dns-6c69c79c7f-fkwk8\" (UID: \"b97f91e3-f497-47ad-8d3d-f9945b3bdc34\") " pod="openstack/dnsmasq-dns-6c69c79c7f-fkwk8" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.464337 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a-logs\") pod \"cinder-api-0\" (UID: \"ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a\") " pod="openstack/cinder-api-0" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.467392 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a-config-data\") pod \"cinder-api-0\" (UID: \"ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a\") " pod="openstack/cinder-api-0" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.468361 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a-config-data-custom\") pod \"cinder-api-0\" (UID: \"ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a\") " pod="openstack/cinder-api-0" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.470888 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a\") " pod="openstack/cinder-api-0" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.474620 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a-scripts\") pod \"cinder-api-0\" (UID: \"ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a\") " pod="openstack/cinder-api-0" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.488145 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-569d5979d6-xzr2q" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.496585 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpkv2\" (UniqueName: \"kubernetes.io/projected/ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a-kube-api-access-fpkv2\") pod \"cinder-api-0\" (UID: \"ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a\") " pod="openstack/cinder-api-0" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.565917 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2320213f-c3b3-4074-95f9-ad86446193b3-config-data\") pod \"barbican-api-547b9d9588-5gkt7\" (UID: \"2320213f-c3b3-4074-95f9-ad86446193b3\") " pod="openstack/barbican-api-547b9d9588-5gkt7" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.565997 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2320213f-c3b3-4074-95f9-ad86446193b3-config-data-custom\") pod \"barbican-api-547b9d9588-5gkt7\" (UID: \"2320213f-c3b3-4074-95f9-ad86446193b3\") " pod="openstack/barbican-api-547b9d9588-5gkt7" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.566026 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvtmd\" (UniqueName: \"kubernetes.io/projected/2320213f-c3b3-4074-95f9-ad86446193b3-kube-api-access-pvtmd\") pod \"barbican-api-547b9d9588-5gkt7\" (UID: \"2320213f-c3b3-4074-95f9-ad86446193b3\") " pod="openstack/barbican-api-547b9d9588-5gkt7" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.566051 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b97f91e3-f497-47ad-8d3d-f9945b3bdc34-ovsdbserver-sb\") pod \"dnsmasq-dns-6c69c79c7f-fkwk8\" (UID: \"b97f91e3-f497-47ad-8d3d-f9945b3bdc34\") " pod="openstack/dnsmasq-dns-6c69c79c7f-fkwk8" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.566079 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2320213f-c3b3-4074-95f9-ad86446193b3-logs\") pod \"barbican-api-547b9d9588-5gkt7\" (UID: \"2320213f-c3b3-4074-95f9-ad86446193b3\") " pod="openstack/barbican-api-547b9d9588-5gkt7" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.566109 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b97f91e3-f497-47ad-8d3d-f9945b3bdc34-ovsdbserver-nb\") pod \"dnsmasq-dns-6c69c79c7f-fkwk8\" (UID: \"b97f91e3-f497-47ad-8d3d-f9945b3bdc34\") " pod="openstack/dnsmasq-dns-6c69c79c7f-fkwk8" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.566137 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b97f91e3-f497-47ad-8d3d-f9945b3bdc34-dns-swift-storage-0\") pod \"dnsmasq-dns-6c69c79c7f-fkwk8\" (UID: \"b97f91e3-f497-47ad-8d3d-f9945b3bdc34\") " pod="openstack/dnsmasq-dns-6c69c79c7f-fkwk8" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.566191 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2320213f-c3b3-4074-95f9-ad86446193b3-combined-ca-bundle\") pod \"barbican-api-547b9d9588-5gkt7\" (UID: \"2320213f-c3b3-4074-95f9-ad86446193b3\") " 
pod="openstack/barbican-api-547b9d9588-5gkt7" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.566209 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b97f91e3-f497-47ad-8d3d-f9945b3bdc34-dns-svc\") pod \"dnsmasq-dns-6c69c79c7f-fkwk8\" (UID: \"b97f91e3-f497-47ad-8d3d-f9945b3bdc34\") " pod="openstack/dnsmasq-dns-6c69c79c7f-fkwk8" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.566226 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54bp5\" (UniqueName: \"kubernetes.io/projected/b97f91e3-f497-47ad-8d3d-f9945b3bdc34-kube-api-access-54bp5\") pod \"dnsmasq-dns-6c69c79c7f-fkwk8\" (UID: \"b97f91e3-f497-47ad-8d3d-f9945b3bdc34\") " pod="openstack/dnsmasq-dns-6c69c79c7f-fkwk8" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.566243 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b97f91e3-f497-47ad-8d3d-f9945b3bdc34-config\") pod \"dnsmasq-dns-6c69c79c7f-fkwk8\" (UID: \"b97f91e3-f497-47ad-8d3d-f9945b3bdc34\") " pod="openstack/dnsmasq-dns-6c69c79c7f-fkwk8" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.567198 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b97f91e3-f497-47ad-8d3d-f9945b3bdc34-config\") pod \"dnsmasq-dns-6c69c79c7f-fkwk8\" (UID: \"b97f91e3-f497-47ad-8d3d-f9945b3bdc34\") " pod="openstack/dnsmasq-dns-6c69c79c7f-fkwk8" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.571645 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b97f91e3-f497-47ad-8d3d-f9945b3bdc34-ovsdbserver-nb\") pod \"dnsmasq-dns-6c69c79c7f-fkwk8\" (UID: \"b97f91e3-f497-47ad-8d3d-f9945b3bdc34\") " pod="openstack/dnsmasq-dns-6c69c79c7f-fkwk8" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.574243 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2320213f-c3b3-4074-95f9-ad86446193b3-config-data\") pod \"barbican-api-547b9d9588-5gkt7\" (UID: \"2320213f-c3b3-4074-95f9-ad86446193b3\") " pod="openstack/barbican-api-547b9d9588-5gkt7" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.574387 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b97f91e3-f497-47ad-8d3d-f9945b3bdc34-dns-swift-storage-0\") pod \"dnsmasq-dns-6c69c79c7f-fkwk8\" (UID: \"b97f91e3-f497-47ad-8d3d-f9945b3bdc34\") " pod="openstack/dnsmasq-dns-6c69c79c7f-fkwk8" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.575797 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b97f91e3-f497-47ad-8d3d-f9945b3bdc34-dns-svc\") pod \"dnsmasq-dns-6c69c79c7f-fkwk8\" (UID: \"b97f91e3-f497-47ad-8d3d-f9945b3bdc34\") " pod="openstack/dnsmasq-dns-6c69c79c7f-fkwk8" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.576014 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2320213f-c3b3-4074-95f9-ad86446193b3-logs\") pod \"barbican-api-547b9d9588-5gkt7\" (UID: \"2320213f-c3b3-4074-95f9-ad86446193b3\") " pod="openstack/barbican-api-547b9d9588-5gkt7" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.576420 4962 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2320213f-c3b3-4074-95f9-ad86446193b3-combined-ca-bundle\") pod \"barbican-api-547b9d9588-5gkt7\" (UID: \"2320213f-c3b3-4074-95f9-ad86446193b3\") " pod="openstack/barbican-api-547b9d9588-5gkt7" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.576761 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2320213f-c3b3-4074-95f9-ad86446193b3-config-data-custom\") pod \"barbican-api-547b9d9588-5gkt7\" (UID: \"2320213f-c3b3-4074-95f9-ad86446193b3\") " pod="openstack/barbican-api-547b9d9588-5gkt7" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.579023 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b97f91e3-f497-47ad-8d3d-f9945b3bdc34-ovsdbserver-sb\") pod \"dnsmasq-dns-6c69c79c7f-fkwk8\" (UID: \"b97f91e3-f497-47ad-8d3d-f9945b3bdc34\") " pod="openstack/dnsmasq-dns-6c69c79c7f-fkwk8" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.595433 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvtmd\" (UniqueName: \"kubernetes.io/projected/2320213f-c3b3-4074-95f9-ad86446193b3-kube-api-access-pvtmd\") pod \"barbican-api-547b9d9588-5gkt7\" (UID: \"2320213f-c3b3-4074-95f9-ad86446193b3\") " pod="openstack/barbican-api-547b9d9588-5gkt7" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.598888 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54bp5\" (UniqueName: \"kubernetes.io/projected/b97f91e3-f497-47ad-8d3d-f9945b3bdc34-kube-api-access-54bp5\") pod \"dnsmasq-dns-6c69c79c7f-fkwk8\" (UID: \"b97f91e3-f497-47ad-8d3d-f9945b3bdc34\") " pod="openstack/dnsmasq-dns-6c69c79c7f-fkwk8" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.692949 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.727317 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c69c79c7f-fkwk8" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.747622 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-547b9d9588-5gkt7" Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.812326 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.822974 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c549fb5d5-4c9w4"] Feb 20 10:14:57 crc kubenswrapper[4962]: W0220 10:14:57.876656 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8794235c_580c_4874_94c2_3b28620e3fdb.slice/crio-9a7108c116e44ebdb8f323de21aa0e098d1a9a742001d90947de3d921b977f5d WatchSource:0}: Error finding container 9a7108c116e44ebdb8f323de21aa0e098d1a9a742001d90947de3d921b977f5d: Status 404 returned error can't find the container with id 9a7108c116e44ebdb8f323de21aa0e098d1a9a742001d90947de3d921b977f5d Feb 20 10:14:57 crc kubenswrapper[4962]: I0220 10:14:57.987616 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6b8479d945-8wsh9"] Feb 20 10:14:58 crc kubenswrapper[4962]: I0220 10:14:58.085100 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-569d5979d6-xzr2q"] Feb 20 10:14:58 crc kubenswrapper[4962]: I0220 10:14:58.234819 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 20 10:14:58 crc kubenswrapper[4962]: I0220 10:14:58.360798 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c69c79c7f-fkwk8"] Feb 20 10:14:58 crc kubenswrapper[4962]: W0220 10:14:58.365164 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb97f91e3_f497_47ad_8d3d_f9945b3bdc34.slice/crio-a843159467f3af1797e47fcec1255f25b565f911c5d9a7e1acd289df047ed115 WatchSource:0}: Error finding container a843159467f3af1797e47fcec1255f25b565f911c5d9a7e1acd289df047ed115: Status 404 returned error can't find the container with id a843159467f3af1797e47fcec1255f25b565f911c5d9a7e1acd289df047ed115 Feb 20 10:14:58 crc kubenswrapper[4962]: I0220 10:14:58.391543 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-547b9d9588-5gkt7"] Feb 20 10:14:58 crc kubenswrapper[4962]: I0220 10:14:58.405199 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6b8479d945-8wsh9" event={"ID":"7f35bada-015d-4051-9976-d5dfe3a93216","Type":"ContainerStarted","Data":"a34d63171eeb032e506f3c3f6390187d10864d694aff1bd3157c782304896d3f"} Feb 20 10:14:58 crc kubenswrapper[4962]: I0220 10:14:58.407333 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"51d56dfc-4e59-4c3d-b26d-a06301f274c8","Type":"ContainerStarted","Data":"d28e07e3df71ca3ef7da3f055fa5cde6ea2cae0c5e5a865d321a0f8d0fb07b31"} Feb 20 10:14:58 crc kubenswrapper[4962]: I0220 10:14:58.409637 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-569d5979d6-xzr2q" event={"ID":"28437fcd-377a-4b9e-9a28-e01c21e2ad1f","Type":"ContainerStarted","Data":"0bed354fd9a98e89b5d38e5675524156eb0b61c69b251716c3b22a1d0bef6443"} Feb 20 10:14:58 crc kubenswrapper[4962]: I0220 10:14:58.411191 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c69c79c7f-fkwk8" 
event={"ID":"b97f91e3-f497-47ad-8d3d-f9945b3bdc34","Type":"ContainerStarted","Data":"a843159467f3af1797e47fcec1255f25b565f911c5d9a7e1acd289df047ed115"} Feb 20 10:14:58 crc kubenswrapper[4962]: I0220 10:14:58.412696 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-547b9d9588-5gkt7" event={"ID":"2320213f-c3b3-4074-95f9-ad86446193b3","Type":"ContainerStarted","Data":"ce6724e81ded72c0c4f8c258e7503371ea527420f9524e9628d9189a535cf4b5"} Feb 20 10:14:58 crc kubenswrapper[4962]: I0220 10:14:58.415433 4962 generic.go:334] "Generic (PLEG): container finished" podID="8794235c-580c-4874-94c2-3b28620e3fdb" containerID="5711558b18ca02e0aedc727fd6791cedd33bb082c0d7ab4780bc410966410664" exitCode=0 Feb 20 10:14:58 crc kubenswrapper[4962]: I0220 10:14:58.415515 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c549fb5d5-4c9w4" event={"ID":"8794235c-580c-4874-94c2-3b28620e3fdb","Type":"ContainerDied","Data":"5711558b18ca02e0aedc727fd6791cedd33bb082c0d7ab4780bc410966410664"} Feb 20 10:14:58 crc kubenswrapper[4962]: I0220 10:14:58.415538 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c549fb5d5-4c9w4" event={"ID":"8794235c-580c-4874-94c2-3b28620e3fdb","Type":"ContainerStarted","Data":"9a7108c116e44ebdb8f323de21aa0e098d1a9a742001d90947de3d921b977f5d"} Feb 20 10:14:58 crc kubenswrapper[4962]: I0220 10:14:58.418390 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a","Type":"ContainerStarted","Data":"5638e382b35ddc2f5cb2cd42c5a2bc839053a009a82ec0299379e273d8965fd5"} Feb 20 10:14:58 crc kubenswrapper[4962]: I0220 10:14:58.926476 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c549fb5d5-4c9w4" Feb 20 10:14:59 crc kubenswrapper[4962]: I0220 10:14:59.107569 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8794235c-580c-4874-94c2-3b28620e3fdb-dns-swift-storage-0\") pod \"8794235c-580c-4874-94c2-3b28620e3fdb\" (UID: \"8794235c-580c-4874-94c2-3b28620e3fdb\") " Feb 20 10:14:59 crc kubenswrapper[4962]: I0220 10:14:59.107813 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8794235c-580c-4874-94c2-3b28620e3fdb-ovsdbserver-sb\") pod \"8794235c-580c-4874-94c2-3b28620e3fdb\" (UID: \"8794235c-580c-4874-94c2-3b28620e3fdb\") " Feb 20 10:14:59 crc kubenswrapper[4962]: I0220 10:14:59.107879 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8794235c-580c-4874-94c2-3b28620e3fdb-ovsdbserver-nb\") pod \"8794235c-580c-4874-94c2-3b28620e3fdb\" (UID: \"8794235c-580c-4874-94c2-3b28620e3fdb\") " Feb 20 10:14:59 crc kubenswrapper[4962]: I0220 10:14:59.107958 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8794235c-580c-4874-94c2-3b28620e3fdb-config\") pod \"8794235c-580c-4874-94c2-3b28620e3fdb\" (UID: \"8794235c-580c-4874-94c2-3b28620e3fdb\") " Feb 20 10:14:59 crc kubenswrapper[4962]: I0220 10:14:59.108061 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8794235c-580c-4874-94c2-3b28620e3fdb-dns-svc\") pod \"8794235c-580c-4874-94c2-3b28620e3fdb\" (UID: 
\"8794235c-580c-4874-94c2-3b28620e3fdb\") " Feb 20 10:14:59 crc kubenswrapper[4962]: I0220 10:14:59.108150 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwhpz\" (UniqueName: \"kubernetes.io/projected/8794235c-580c-4874-94c2-3b28620e3fdb-kube-api-access-nwhpz\") pod \"8794235c-580c-4874-94c2-3b28620e3fdb\" (UID: \"8794235c-580c-4874-94c2-3b28620e3fdb\") " Feb 20 10:14:59 crc kubenswrapper[4962]: I0220 10:14:59.114753 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8794235c-580c-4874-94c2-3b28620e3fdb-kube-api-access-nwhpz" (OuterVolumeSpecName: "kube-api-access-nwhpz") pod "8794235c-580c-4874-94c2-3b28620e3fdb" (UID: "8794235c-580c-4874-94c2-3b28620e3fdb"). InnerVolumeSpecName "kube-api-access-nwhpz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:14:59 crc kubenswrapper[4962]: I0220 10:14:59.155494 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8794235c-580c-4874-94c2-3b28620e3fdb-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8794235c-580c-4874-94c2-3b28620e3fdb" (UID: "8794235c-580c-4874-94c2-3b28620e3fdb"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:14:59 crc kubenswrapper[4962]: I0220 10:14:59.155518 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8794235c-580c-4874-94c2-3b28620e3fdb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8794235c-580c-4874-94c2-3b28620e3fdb" (UID: "8794235c-580c-4874-94c2-3b28620e3fdb"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:14:59 crc kubenswrapper[4962]: I0220 10:14:59.197471 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8794235c-580c-4874-94c2-3b28620e3fdb-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8794235c-580c-4874-94c2-3b28620e3fdb" (UID: "8794235c-580c-4874-94c2-3b28620e3fdb"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:14:59 crc kubenswrapper[4962]: I0220 10:14:59.197998 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8794235c-580c-4874-94c2-3b28620e3fdb-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "8794235c-580c-4874-94c2-3b28620e3fdb" (UID: "8794235c-580c-4874-94c2-3b28620e3fdb"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:14:59 crc kubenswrapper[4962]: I0220 10:14:59.210414 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8794235c-580c-4874-94c2-3b28620e3fdb-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:59 crc kubenswrapper[4962]: I0220 10:14:59.210453 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8794235c-580c-4874-94c2-3b28620e3fdb-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:59 crc kubenswrapper[4962]: I0220 10:14:59.210464 4962 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8794235c-580c-4874-94c2-3b28620e3fdb-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:59 crc kubenswrapper[4962]: I0220 10:14:59.210475 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwhpz\" (UniqueName: \"kubernetes.io/projected/8794235c-580c-4874-94c2-3b28620e3fdb-kube-api-access-nwhpz\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:59 crc kubenswrapper[4962]: I0220 10:14:59.210489 4962 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8794235c-580c-4874-94c2-3b28620e3fdb-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:59 crc kubenswrapper[4962]: I0220 10:14:59.253531 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8794235c-580c-4874-94c2-3b28620e3fdb-config" (OuterVolumeSpecName: "config") pod "8794235c-580c-4874-94c2-3b28620e3fdb" (UID: "8794235c-580c-4874-94c2-3b28620e3fdb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:14:59 crc kubenswrapper[4962]: I0220 10:14:59.315182 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8794235c-580c-4874-94c2-3b28620e3fdb-config\") on node \"crc\" DevicePath \"\"" Feb 20 10:14:59 crc kubenswrapper[4962]: I0220 10:14:59.471815 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c549fb5d5-4c9w4" event={"ID":"8794235c-580c-4874-94c2-3b28620e3fdb","Type":"ContainerDied","Data":"9a7108c116e44ebdb8f323de21aa0e098d1a9a742001d90947de3d921b977f5d"} Feb 20 10:14:59 crc kubenswrapper[4962]: I0220 10:14:59.472290 4962 scope.go:117] "RemoveContainer" containerID="5711558b18ca02e0aedc727fd6791cedd33bb082c0d7ab4780bc410966410664" Feb 20 10:14:59 crc kubenswrapper[4962]: I0220 10:14:59.471877 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c549fb5d5-4c9w4" Feb 20 10:14:59 crc kubenswrapper[4962]: I0220 10:14:59.479135 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a","Type":"ContainerStarted","Data":"9b384b310279a03d71974057d02907624e67626affe4012340729465b691bd7f"} Feb 20 10:14:59 crc kubenswrapper[4962]: I0220 10:14:59.484920 4962 generic.go:334] "Generic (PLEG): container finished" podID="b97f91e3-f497-47ad-8d3d-f9945b3bdc34" containerID="b058ae21e0210458ea10ea644bf00ea0438ea36899818d84b992ed449c70fc86" exitCode=0 Feb 20 10:14:59 crc kubenswrapper[4962]: I0220 10:14:59.485019 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c69c79c7f-fkwk8" event={"ID":"b97f91e3-f497-47ad-8d3d-f9945b3bdc34","Type":"ContainerDied","Data":"b058ae21e0210458ea10ea644bf00ea0438ea36899818d84b992ed449c70fc86"} Feb 20 10:14:59 crc kubenswrapper[4962]: I0220 10:14:59.488129 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-547b9d9588-5gkt7" event={"ID":"2320213f-c3b3-4074-95f9-ad86446193b3","Type":"ContainerStarted","Data":"d4bd6e492dd1bc7584580c3e1bc6a4f7a66a1d7156602f3795865e0b336514ec"} Feb 20 10:14:59 crc kubenswrapper[4962]: I0220 10:14:59.488180 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-547b9d9588-5gkt7" event={"ID":"2320213f-c3b3-4074-95f9-ad86446193b3","Type":"ContainerStarted","Data":"55486fc199e4a3a4fb67874630324c07abf5ab0280be238e62e377607c20060a"} Feb 20 10:14:59 crc kubenswrapper[4962]: I0220 10:14:59.488687 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-547b9d9588-5gkt7" Feb 20 10:14:59 crc kubenswrapper[4962]: I0220 10:14:59.489208 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-547b9d9588-5gkt7" Feb 20 10:14:59 crc kubenswrapper[4962]: I0220 10:14:59.573664 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c549fb5d5-4c9w4"] Feb 20 10:14:59 crc kubenswrapper[4962]: I0220 10:14:59.589794 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6c549fb5d5-4c9w4"] Feb 20 10:14:59 crc kubenswrapper[4962]: I0220 10:14:59.596010 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-547b9d9588-5gkt7" podStartSLOduration=2.595985748 podStartE2EDuration="2.595985748s" podCreationTimestamp="2026-02-20 10:14:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:14:59.555959404 +0000 UTC m=+1191.138431250" watchObservedRunningTime="2026-02-20 10:14:59.595985748 +0000 UTC m=+1191.178457594" Feb 20 10:14:59 crc kubenswrapper[4962]: I0220 10:14:59.698646 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 20 10:15:00 crc kubenswrapper[4962]: I0220 10:15:00.141188 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526375-bzqgc"] Feb 20 10:15:00 crc kubenswrapper[4962]: E0220 10:15:00.145132 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8794235c-580c-4874-94c2-3b28620e3fdb" containerName="init" Feb 20 10:15:00 crc kubenswrapper[4962]: I0220 10:15:00.145275 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="8794235c-580c-4874-94c2-3b28620e3fdb" containerName="init" Feb 20 10:15:00 
crc kubenswrapper[4962]: I0220 10:15:00.147232 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="8794235c-580c-4874-94c2-3b28620e3fdb" containerName="init" Feb 20 10:15:00 crc kubenswrapper[4962]: I0220 10:15:00.148271 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526375-bzqgc" Feb 20 10:15:00 crc kubenswrapper[4962]: I0220 10:15:00.152742 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 20 10:15:00 crc kubenswrapper[4962]: I0220 10:15:00.162816 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526375-bzqgc"] Feb 20 10:15:00 crc kubenswrapper[4962]: I0220 10:15:00.197513 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 20 10:15:00 crc kubenswrapper[4962]: I0220 10:15:00.243833 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnnhk\" (UniqueName: \"kubernetes.io/projected/fc9c6a80-7747-461e-8f29-f371984a8c95-kube-api-access-lnnhk\") pod \"collect-profiles-29526375-bzqgc\" (UID: \"fc9c6a80-7747-461e-8f29-f371984a8c95\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526375-bzqgc" Feb 20 10:15:00 crc kubenswrapper[4962]: I0220 10:15:00.243957 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fc9c6a80-7747-461e-8f29-f371984a8c95-secret-volume\") pod \"collect-profiles-29526375-bzqgc\" (UID: \"fc9c6a80-7747-461e-8f29-f371984a8c95\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526375-bzqgc" Feb 20 10:15:00 crc kubenswrapper[4962]: I0220 10:15:00.244066 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fc9c6a80-7747-461e-8f29-f371984a8c95-config-volume\") pod \"collect-profiles-29526375-bzqgc\" (UID: \"fc9c6a80-7747-461e-8f29-f371984a8c95\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526375-bzqgc" Feb 20 10:15:00 crc kubenswrapper[4962]: I0220 10:15:00.345714 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fc9c6a80-7747-461e-8f29-f371984a8c95-secret-volume\") pod \"collect-profiles-29526375-bzqgc\" (UID: \"fc9c6a80-7747-461e-8f29-f371984a8c95\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526375-bzqgc" Feb 20 10:15:00 crc kubenswrapper[4962]: I0220 10:15:00.345809 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fc9c6a80-7747-461e-8f29-f371984a8c95-config-volume\") pod \"collect-profiles-29526375-bzqgc\" (UID: \"fc9c6a80-7747-461e-8f29-f371984a8c95\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526375-bzqgc" Feb 20 10:15:00 crc kubenswrapper[4962]: I0220 10:15:00.345865 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnnhk\" (UniqueName: \"kubernetes.io/projected/fc9c6a80-7747-461e-8f29-f371984a8c95-kube-api-access-lnnhk\") pod \"collect-profiles-29526375-bzqgc\" (UID: \"fc9c6a80-7747-461e-8f29-f371984a8c95\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29526375-bzqgc" Feb 20 10:15:00 crc kubenswrapper[4962]: I0220 10:15:00.347216 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fc9c6a80-7747-461e-8f29-f371984a8c95-config-volume\") pod \"collect-profiles-29526375-bzqgc\" (UID: \"fc9c6a80-7747-461e-8f29-f371984a8c95\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526375-bzqgc" Feb 20 10:15:00 crc kubenswrapper[4962]: I0220 10:15:00.351525 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fc9c6a80-7747-461e-8f29-f371984a8c95-secret-volume\") pod \"collect-profiles-29526375-bzqgc\" (UID: \"fc9c6a80-7747-461e-8f29-f371984a8c95\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526375-bzqgc" Feb 20 10:15:00 crc kubenswrapper[4962]: I0220 10:15:00.366372 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnnhk\" (UniqueName: \"kubernetes.io/projected/fc9c6a80-7747-461e-8f29-f371984a8c95-kube-api-access-lnnhk\") pod \"collect-profiles-29526375-bzqgc\" (UID: \"fc9c6a80-7747-461e-8f29-f371984a8c95\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526375-bzqgc" Feb 20 10:15:00 crc kubenswrapper[4962]: I0220 10:15:00.503301 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"51d56dfc-4e59-4c3d-b26d-a06301f274c8","Type":"ContainerStarted","Data":"f526ad02da5a4878737da129e8a712cb1847bfbea7fa32b3816c5e883fd3a61f"} Feb 20 10:15:00 crc kubenswrapper[4962]: I0220 10:15:00.519885 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526375-bzqgc" Feb 20 10:15:01 crc kubenswrapper[4962]: I0220 10:15:01.088796 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526375-bzqgc"] Feb 20 10:15:01 crc kubenswrapper[4962]: W0220 10:15:01.120608 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc9c6a80_7747_461e_8f29_f371984a8c95.slice/crio-2119c2480595ca04113ba775f3754c44199c13718d123a19bb5c127f2e2e0ad2 WatchSource:0}: Error finding container 2119c2480595ca04113ba775f3754c44199c13718d123a19bb5c127f2e2e0ad2: Status 404 returned error can't find the container with id 2119c2480595ca04113ba775f3754c44199c13718d123a19bb5c127f2e2e0ad2 Feb 20 10:15:01 crc kubenswrapper[4962]: I0220 10:15:01.152071 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8794235c-580c-4874-94c2-3b28620e3fdb" path="/var/lib/kubelet/pods/8794235c-580c-4874-94c2-3b28620e3fdb/volumes" Feb 20 10:15:01 crc kubenswrapper[4962]: I0220 10:15:01.530323 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c69c79c7f-fkwk8" event={"ID":"b97f91e3-f497-47ad-8d3d-f9945b3bdc34","Type":"ContainerStarted","Data":"8064c9f4f1fa3fedd418dd367b2c5bee617312041e75d25a0c61bc72b1e5d8dc"} Feb 20 10:15:01 crc kubenswrapper[4962]: I0220 10:15:01.530986 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6c69c79c7f-fkwk8" Feb 20 10:15:01 crc kubenswrapper[4962]: I0220 10:15:01.544043 4962 generic.go:334] "Generic (PLEG): container finished" podID="f95ae2eb-8d20-4549-896d-e6991bfd1e06" 
containerID="c820ae4ce289a934f94f300bfd5be2c53a94a21d5b0d1a615bd01eca018a9cca" exitCode=0 Feb 20 10:15:01 crc kubenswrapper[4962]: I0220 10:15:01.544099 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f95ae2eb-8d20-4549-896d-e6991bfd1e06","Type":"ContainerDied","Data":"c820ae4ce289a934f94f300bfd5be2c53a94a21d5b0d1a615bd01eca018a9cca"} Feb 20 10:15:01 crc kubenswrapper[4962]: I0220 10:15:01.558577 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6c69c79c7f-fkwk8" podStartSLOduration=4.558562283 podStartE2EDuration="4.558562283s" podCreationTimestamp="2026-02-20 10:14:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:15:01.554959612 +0000 UTC m=+1193.137431458" watchObservedRunningTime="2026-02-20 10:15:01.558562283 +0000 UTC m=+1193.141034129" Feb 20 10:15:01 crc kubenswrapper[4962]: I0220 10:15:01.569668 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526375-bzqgc" event={"ID":"fc9c6a80-7747-461e-8f29-f371984a8c95","Type":"ContainerStarted","Data":"c54639681debdeffda54130d89e4883eb7658c42414168fa95eba4479a2f093f"} Feb 20 10:15:01 crc kubenswrapper[4962]: I0220 10:15:01.569727 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526375-bzqgc" event={"ID":"fc9c6a80-7747-461e-8f29-f371984a8c95","Type":"ContainerStarted","Data":"2119c2480595ca04113ba775f3754c44199c13718d123a19bb5c127f2e2e0ad2"} Feb 20 10:15:01 crc kubenswrapper[4962]: I0220 10:15:01.579267 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6b8479d945-8wsh9" event={"ID":"7f35bada-015d-4051-9976-d5dfe3a93216","Type":"ContainerStarted","Data":"d1cb3b1837bc14d4bc8b54604fff4b13e755f0b6500bf206c46f6f5569e5c26a"} Feb 20 10:15:01 crc kubenswrapper[4962]: I0220 10:15:01.579313 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6b8479d945-8wsh9" event={"ID":"7f35bada-015d-4051-9976-d5dfe3a93216","Type":"ContainerStarted","Data":"1a117c325a572e0a4fee70e6f72cca84b0d93bdf09ce042ac50994ca64fd3520"} Feb 20 10:15:01 crc kubenswrapper[4962]: I0220 10:15:01.581679 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 20 10:15:01 crc kubenswrapper[4962]: I0220 10:15:01.589393 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29526375-bzqgc" podStartSLOduration=1.589371111 podStartE2EDuration="1.589371111s" podCreationTimestamp="2026-02-20 10:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:15:01.587687369 +0000 UTC m=+1193.170159225" watchObservedRunningTime="2026-02-20 10:15:01.589371111 +0000 UTC m=+1193.171842957" Feb 20 10:15:01 crc kubenswrapper[4962]: I0220 10:15:01.590446 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-569d5979d6-xzr2q" event={"ID":"28437fcd-377a-4b9e-9a28-e01c21e2ad1f","Type":"ContainerStarted","Data":"5debc339fcb891cc07e7fa0a7db99fb7f297c28473a143743938f4792107d27c"} Feb 20 10:15:01 crc kubenswrapper[4962]: I0220 10:15:01.590506 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-569d5979d6-xzr2q" event={"ID":"28437fcd-377a-4b9e-9a28-e01c21e2ad1f","Type":"ContainerStarted","Data":"6cbbafaf6ad06d0f58cf79b2da64a294b16c2b2e6931344860d8ecda539fe7b2"} Feb 20 10:15:01 crc kubenswrapper[4962]: I0220 10:15:01.647173 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-6b8479d945-8wsh9" podStartSLOduration=3.061603032 podStartE2EDuration="5.647153159s" podCreationTimestamp="2026-02-20 10:14:56 +0000 UTC" firstStartedPulling="2026-02-20 10:14:58.01507318 +0000 UTC m=+1189.597545026" lastFinishedPulling="2026-02-20 10:15:00.600623287 +0000 UTC m=+1192.183095153" observedRunningTime="2026-02-20 10:15:01.642555125 +0000 UTC m=+1193.225026971" watchObservedRunningTime="2026-02-20 10:15:01.647153159 +0000 UTC m=+1193.229625005" Feb 20 10:15:01 crc kubenswrapper[4962]: I0220 10:15:01.671563 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-569d5979d6-xzr2q" podStartSLOduration=3.22683698 podStartE2EDuration="5.671544567s" podCreationTimestamp="2026-02-20 10:14:56 +0000 UTC" firstStartedPulling="2026-02-20 10:14:58.098518665 +0000 UTC m=+1189.680990511" lastFinishedPulling="2026-02-20 10:15:00.543226252 +0000 UTC m=+1192.125698098" observedRunningTime="2026-02-20 10:15:01.664635952 +0000 UTC m=+1193.247107798" watchObservedRunningTime="2026-02-20 10:15:01.671544567 +0000 UTC m=+1193.254016413" Feb 20 10:15:01 crc kubenswrapper[4962]: I0220 10:15:01.686975 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f95ae2eb-8d20-4549-896d-e6991bfd1e06-run-httpd\") pod \"f95ae2eb-8d20-4549-896d-e6991bfd1e06\" (UID: \"f95ae2eb-8d20-4549-896d-e6991bfd1e06\") " Feb 20 10:15:01 crc kubenswrapper[4962]: I0220 10:15:01.687109 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bdggc\" (UniqueName: \"kubernetes.io/projected/f95ae2eb-8d20-4549-896d-e6991bfd1e06-kube-api-access-bdggc\") pod \"f95ae2eb-8d20-4549-896d-e6991bfd1e06\" (UID: \"f95ae2eb-8d20-4549-896d-e6991bfd1e06\") " Feb 20 10:15:01 crc kubenswrapper[4962]: I0220 10:15:01.687270 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f95ae2eb-8d20-4549-896d-e6991bfd1e06-scripts\") pod 
\"f95ae2eb-8d20-4549-896d-e6991bfd1e06\" (UID: \"f95ae2eb-8d20-4549-896d-e6991bfd1e06\") " Feb 20 10:15:01 crc kubenswrapper[4962]: I0220 10:15:01.687352 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f95ae2eb-8d20-4549-896d-e6991bfd1e06-combined-ca-bundle\") pod \"f95ae2eb-8d20-4549-896d-e6991bfd1e06\" (UID: \"f95ae2eb-8d20-4549-896d-e6991bfd1e06\") " Feb 20 10:15:01 crc kubenswrapper[4962]: I0220 10:15:01.687390 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f95ae2eb-8d20-4549-896d-e6991bfd1e06-sg-core-conf-yaml\") pod \"f95ae2eb-8d20-4549-896d-e6991bfd1e06\" (UID: \"f95ae2eb-8d20-4549-896d-e6991bfd1e06\") " Feb 20 10:15:01 crc kubenswrapper[4962]: I0220 10:15:01.687423 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f95ae2eb-8d20-4549-896d-e6991bfd1e06-log-httpd\") pod \"f95ae2eb-8d20-4549-896d-e6991bfd1e06\" (UID: \"f95ae2eb-8d20-4549-896d-e6991bfd1e06\") " Feb 20 10:15:01 crc kubenswrapper[4962]: I0220 10:15:01.687488 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f95ae2eb-8d20-4549-896d-e6991bfd1e06-config-data\") pod \"f95ae2eb-8d20-4549-896d-e6991bfd1e06\" (UID: \"f95ae2eb-8d20-4549-896d-e6991bfd1e06\") " Feb 20 10:15:01 crc kubenswrapper[4962]: I0220 10:15:01.688617 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f95ae2eb-8d20-4549-896d-e6991bfd1e06-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f95ae2eb-8d20-4549-896d-e6991bfd1e06" (UID: "f95ae2eb-8d20-4549-896d-e6991bfd1e06"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:15:01 crc kubenswrapper[4962]: I0220 10:15:01.689169 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f95ae2eb-8d20-4549-896d-e6991bfd1e06-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f95ae2eb-8d20-4549-896d-e6991bfd1e06" (UID: "f95ae2eb-8d20-4549-896d-e6991bfd1e06"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:15:01 crc kubenswrapper[4962]: I0220 10:15:01.689752 4962 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f95ae2eb-8d20-4549-896d-e6991bfd1e06-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:01 crc kubenswrapper[4962]: I0220 10:15:01.689774 4962 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f95ae2eb-8d20-4549-896d-e6991bfd1e06-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:01 crc kubenswrapper[4962]: I0220 10:15:01.706401 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f95ae2eb-8d20-4549-896d-e6991bfd1e06-scripts" (OuterVolumeSpecName: "scripts") pod "f95ae2eb-8d20-4549-896d-e6991bfd1e06" (UID: "f95ae2eb-8d20-4549-896d-e6991bfd1e06"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:15:01 crc kubenswrapper[4962]: I0220 10:15:01.707489 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f95ae2eb-8d20-4549-896d-e6991bfd1e06-kube-api-access-bdggc" (OuterVolumeSpecName: "kube-api-access-bdggc") pod "f95ae2eb-8d20-4549-896d-e6991bfd1e06" (UID: "f95ae2eb-8d20-4549-896d-e6991bfd1e06"). InnerVolumeSpecName "kube-api-access-bdggc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:15:01 crc kubenswrapper[4962]: I0220 10:15:01.730668 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f95ae2eb-8d20-4549-896d-e6991bfd1e06-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f95ae2eb-8d20-4549-896d-e6991bfd1e06" (UID: "f95ae2eb-8d20-4549-896d-e6991bfd1e06"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:15:01 crc kubenswrapper[4962]: I0220 10:15:01.749381 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f95ae2eb-8d20-4549-896d-e6991bfd1e06-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f95ae2eb-8d20-4549-896d-e6991bfd1e06" (UID: "f95ae2eb-8d20-4549-896d-e6991bfd1e06"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:15:01 crc kubenswrapper[4962]: I0220 10:15:01.795335 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f95ae2eb-8d20-4549-896d-e6991bfd1e06-config-data" (OuterVolumeSpecName: "config-data") pod "f95ae2eb-8d20-4549-896d-e6991bfd1e06" (UID: "f95ae2eb-8d20-4549-896d-e6991bfd1e06"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:15:01 crc kubenswrapper[4962]: I0220 10:15:01.795754 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f95ae2eb-8d20-4549-896d-e6991bfd1e06-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:01 crc kubenswrapper[4962]: I0220 10:15:01.795865 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f95ae2eb-8d20-4549-896d-e6991bfd1e06-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:01 crc kubenswrapper[4962]: I0220 10:15:01.795881 4962 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f95ae2eb-8d20-4549-896d-e6991bfd1e06-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:01 crc kubenswrapper[4962]: I0220 10:15:01.795893 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bdggc\" (UniqueName: \"kubernetes.io/projected/f95ae2eb-8d20-4549-896d-e6991bfd1e06-kube-api-access-bdggc\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:01 crc kubenswrapper[4962]: I0220 10:15:01.897794 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f95ae2eb-8d20-4549-896d-e6991bfd1e06-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:02 crc kubenswrapper[4962]: I0220 10:15:02.605781 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f95ae2eb-8d20-4549-896d-e6991bfd1e06","Type":"ContainerDied","Data":"15a399f778c9e29fd2309841da0d1240c31760f2d4754f8a287a27c0d443a8fb"} Feb 20 10:15:02 crc kubenswrapper[4962]: I0220 10:15:02.605850 4962 scope.go:117] "RemoveContainer" containerID="64b0d198767190eec74d90a8a078975196c61e7b45117096eb7ad0a73bc18e8b" Feb 20 10:15:02 crc kubenswrapper[4962]: I0220 10:15:02.606053 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 20 10:15:02 crc kubenswrapper[4962]: I0220 10:15:02.616274 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a","Type":"ContainerStarted","Data":"e1875a294970ba9c5eec28ab756d0f2eb5f8d71368f1ffb2266761ff9bd0fad5"} Feb 20 10:15:02 crc kubenswrapper[4962]: I0220 10:15:02.616469 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a" containerName="cinder-api-log" containerID="cri-o://9b384b310279a03d71974057d02907624e67626affe4012340729465b691bd7f" gracePeriod=30 Feb 20 10:15:02 crc kubenswrapper[4962]: I0220 10:15:02.616862 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 20 10:15:02 crc kubenswrapper[4962]: I0220 10:15:02.617213 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a" containerName="cinder-api" containerID="cri-o://e1875a294970ba9c5eec28ab756d0f2eb5f8d71368f1ffb2266761ff9bd0fad5" gracePeriod=30 Feb 20 10:15:02 crc kubenswrapper[4962]: I0220 10:15:02.642517 4962 generic.go:334] "Generic (PLEG): container finished" podID="fc9c6a80-7747-461e-8f29-f371984a8c95" containerID="c54639681debdeffda54130d89e4883eb7658c42414168fa95eba4479a2f093f" exitCode=0 Feb 20 10:15:02 crc kubenswrapper[4962]: I0220 10:15:02.642635 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526375-bzqgc" event={"ID":"fc9c6a80-7747-461e-8f29-f371984a8c95","Type":"ContainerDied","Data":"c54639681debdeffda54130d89e4883eb7658c42414168fa95eba4479a2f093f"} Feb 20 10:15:02 crc kubenswrapper[4962]: I0220 10:15:02.646274 4962 scope.go:117] "RemoveContainer" containerID="3c6ea7a933fafcb8f42d0dff736a28242f018ae0cea296d99186f98ada602b85" Feb 20 10:15:02 crc kubenswrapper[4962]: I0220 10:15:02.651666 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"51d56dfc-4e59-4c3d-b26d-a06301f274c8","Type":"ContainerStarted","Data":"565233d9beb95735adc25c4668daa315369653b7a2cdf6734ddda82b44ff3501"} Feb 20 10:15:02 crc kubenswrapper[4962]: I0220 10:15:02.675994 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.675970219 podStartE2EDuration="5.675970219s" podCreationTimestamp="2026-02-20 10:14:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:15:02.661959263 +0000 UTC m=+1194.244431109" watchObservedRunningTime="2026-02-20 10:15:02.675970219 +0000 UTC m=+1194.258442065" Feb 20 10:15:02 crc kubenswrapper[4962]: I0220 10:15:02.679798 4962 scope.go:117] "RemoveContainer" containerID="c820ae4ce289a934f94f300bfd5be2c53a94a21d5b0d1a615bd01eca018a9cca" Feb 20 10:15:02 crc kubenswrapper[4962]: I0220 10:15:02.777287 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 20 10:15:02 crc kubenswrapper[4962]: I0220 10:15:02.840170 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 20 10:15:02 crc kubenswrapper[4962]: I0220 10:15:02.902811 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 20 10:15:02 crc kubenswrapper[4962]: E0220 10:15:02.903676 4962 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="f95ae2eb-8d20-4549-896d-e6991bfd1e06" containerName="sg-core" Feb 20 10:15:02 crc kubenswrapper[4962]: I0220 10:15:02.903721 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="f95ae2eb-8d20-4549-896d-e6991bfd1e06" containerName="sg-core" Feb 20 10:15:02 crc kubenswrapper[4962]: E0220 10:15:02.903759 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f95ae2eb-8d20-4549-896d-e6991bfd1e06" containerName="ceilometer-notification-agent" Feb 20 10:15:02 crc kubenswrapper[4962]: I0220 10:15:02.903769 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="f95ae2eb-8d20-4549-896d-e6991bfd1e06" containerName="ceilometer-notification-agent" Feb 20 10:15:02 crc kubenswrapper[4962]: E0220 10:15:02.903807 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f95ae2eb-8d20-4549-896d-e6991bfd1e06" containerName="proxy-httpd" Feb 20 10:15:02 crc kubenswrapper[4962]: I0220 10:15:02.903817 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="f95ae2eb-8d20-4549-896d-e6991bfd1e06" containerName="proxy-httpd" Feb 20 10:15:02 crc kubenswrapper[4962]: I0220 10:15:02.904080 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="f95ae2eb-8d20-4549-896d-e6991bfd1e06" containerName="sg-core" Feb 20 10:15:02 crc kubenswrapper[4962]: I0220 10:15:02.904123 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="f95ae2eb-8d20-4549-896d-e6991bfd1e06" containerName="proxy-httpd" Feb 20 10:15:02 crc kubenswrapper[4962]: I0220 10:15:02.904151 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="f95ae2eb-8d20-4549-896d-e6991bfd1e06" containerName="ceilometer-notification-agent" Feb 20 10:15:02 crc kubenswrapper[4962]: I0220 10:15:02.906916 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=6.035633728 podStartE2EDuration="6.9068921s" podCreationTimestamp="2026-02-20 10:14:56 +0000 UTC" firstStartedPulling="2026-02-20 10:14:57.85750593 +0000 UTC m=+1189.439977786" lastFinishedPulling="2026-02-20 10:14:58.728764302 +0000 UTC m=+1190.311236158" observedRunningTime="2026-02-20 10:15:02.77761738 +0000 UTC m=+1194.360089226" watchObservedRunningTime="2026-02-20 10:15:02.9068921 +0000 UTC m=+1194.489363946" Feb 20 10:15:02 crc kubenswrapper[4962]: I0220 10:15:02.907409 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 20 10:15:02 crc kubenswrapper[4962]: I0220 10:15:02.910147 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 20 10:15:02 crc kubenswrapper[4962]: I0220 10:15:02.910806 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 20 10:15:02 crc kubenswrapper[4962]: I0220 10:15:02.927994 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.027827 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18d02cdb-5de5-457e-9f17-1cc3ba51ca55-run-httpd\") pod \"ceilometer-0\" (UID: \"18d02cdb-5de5-457e-9f17-1cc3ba51ca55\") " pod="openstack/ceilometer-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.027915 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18d02cdb-5de5-457e-9f17-1cc3ba51ca55-log-httpd\") pod \"ceilometer-0\" (UID: \"18d02cdb-5de5-457e-9f17-1cc3ba51ca55\") " pod="openstack/ceilometer-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.027974 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/18d02cdb-5de5-457e-9f17-1cc3ba51ca55-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"18d02cdb-5de5-457e-9f17-1cc3ba51ca55\") " pod="openstack/ceilometer-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.027999 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18d02cdb-5de5-457e-9f17-1cc3ba51ca55-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"18d02cdb-5de5-457e-9f17-1cc3ba51ca55\") " pod="openstack/ceilometer-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.029734 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18d02cdb-5de5-457e-9f17-1cc3ba51ca55-config-data\") pod \"ceilometer-0\" (UID: \"18d02cdb-5de5-457e-9f17-1cc3ba51ca55\") " pod="openstack/ceilometer-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.029813 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mg88s\" (UniqueName: \"kubernetes.io/projected/18d02cdb-5de5-457e-9f17-1cc3ba51ca55-kube-api-access-mg88s\") pod \"ceilometer-0\" (UID: \"18d02cdb-5de5-457e-9f17-1cc3ba51ca55\") " pod="openstack/ceilometer-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.029854 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18d02cdb-5de5-457e-9f17-1cc3ba51ca55-scripts\") pod \"ceilometer-0\" (UID: \"18d02cdb-5de5-457e-9f17-1cc3ba51ca55\") " pod="openstack/ceilometer-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.131198 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/18d02cdb-5de5-457e-9f17-1cc3ba51ca55-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"18d02cdb-5de5-457e-9f17-1cc3ba51ca55\") " pod="openstack/ceilometer-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 
10:15:03.131258 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18d02cdb-5de5-457e-9f17-1cc3ba51ca55-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"18d02cdb-5de5-457e-9f17-1cc3ba51ca55\") " pod="openstack/ceilometer-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.131294 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18d02cdb-5de5-457e-9f17-1cc3ba51ca55-config-data\") pod \"ceilometer-0\" (UID: \"18d02cdb-5de5-457e-9f17-1cc3ba51ca55\") " pod="openstack/ceilometer-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.131329 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mg88s\" (UniqueName: \"kubernetes.io/projected/18d02cdb-5de5-457e-9f17-1cc3ba51ca55-kube-api-access-mg88s\") pod \"ceilometer-0\" (UID: \"18d02cdb-5de5-457e-9f17-1cc3ba51ca55\") " pod="openstack/ceilometer-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.131352 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18d02cdb-5de5-457e-9f17-1cc3ba51ca55-scripts\") pod \"ceilometer-0\" (UID: \"18d02cdb-5de5-457e-9f17-1cc3ba51ca55\") " pod="openstack/ceilometer-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.131406 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18d02cdb-5de5-457e-9f17-1cc3ba51ca55-run-httpd\") pod \"ceilometer-0\" (UID: \"18d02cdb-5de5-457e-9f17-1cc3ba51ca55\") " pod="openstack/ceilometer-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.131473 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18d02cdb-5de5-457e-9f17-1cc3ba51ca55-log-httpd\") pod \"ceilometer-0\" (UID: \"18d02cdb-5de5-457e-9f17-1cc3ba51ca55\") " pod="openstack/ceilometer-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.131986 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18d02cdb-5de5-457e-9f17-1cc3ba51ca55-log-httpd\") pod \"ceilometer-0\" (UID: \"18d02cdb-5de5-457e-9f17-1cc3ba51ca55\") " pod="openstack/ceilometer-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.132235 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18d02cdb-5de5-457e-9f17-1cc3ba51ca55-run-httpd\") pod \"ceilometer-0\" (UID: \"18d02cdb-5de5-457e-9f17-1cc3ba51ca55\") " pod="openstack/ceilometer-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.138548 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/18d02cdb-5de5-457e-9f17-1cc3ba51ca55-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"18d02cdb-5de5-457e-9f17-1cc3ba51ca55\") " pod="openstack/ceilometer-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.139054 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18d02cdb-5de5-457e-9f17-1cc3ba51ca55-config-data\") pod \"ceilometer-0\" (UID: \"18d02cdb-5de5-457e-9f17-1cc3ba51ca55\") " pod="openstack/ceilometer-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.139683 4962 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18d02cdb-5de5-457e-9f17-1cc3ba51ca55-scripts\") pod \"ceilometer-0\" (UID: \"18d02cdb-5de5-457e-9f17-1cc3ba51ca55\") " pod="openstack/ceilometer-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.142955 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18d02cdb-5de5-457e-9f17-1cc3ba51ca55-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"18d02cdb-5de5-457e-9f17-1cc3ba51ca55\") " pod="openstack/ceilometer-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.161150 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mg88s\" (UniqueName: \"kubernetes.io/projected/18d02cdb-5de5-457e-9f17-1cc3ba51ca55-kube-api-access-mg88s\") pod \"ceilometer-0\" (UID: \"18d02cdb-5de5-457e-9f17-1cc3ba51ca55\") " pod="openstack/ceilometer-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.168039 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f95ae2eb-8d20-4549-896d-e6991bfd1e06" path="/var/lib/kubelet/pods/f95ae2eb-8d20-4549-896d-e6991bfd1e06/volumes" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.276496 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.469832 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-84464996cb-fhnvz"] Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.473110 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-84464996cb-fhnvz" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.481456 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.481801 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.516322 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-84464996cb-fhnvz"] Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.531753 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.546000 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a-config-data-custom\") pod \"ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a\" (UID: \"ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a\") " Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.546036 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a-etc-machine-id\") pod \"ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a\" (UID: \"ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a\") " Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.546079 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a-config-data\") pod \"ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a\" (UID: \"ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a\") " Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.546099 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fpkv2\" (UniqueName: \"kubernetes.io/projected/ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a-kube-api-access-fpkv2\") pod \"ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a\" (UID: \"ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a\") " Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.546177 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a-logs\") pod \"ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a\" (UID: \"ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a\") " Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.546259 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a-combined-ca-bundle\") pod \"ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a\" (UID: \"ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a\") " Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.546287 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a-scripts\") pod \"ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a\" (UID: \"ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a\") " Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.546607 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zw2np\" (UniqueName: \"kubernetes.io/projected/10c1a487-1a74-4994-9b39-f05cbe0fa5c7-kube-api-access-zw2np\") pod \"barbican-api-84464996cb-fhnvz\" (UID: \"10c1a487-1a74-4994-9b39-f05cbe0fa5c7\") " pod="openstack/barbican-api-84464996cb-fhnvz" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.546658 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10c1a487-1a74-4994-9b39-f05cbe0fa5c7-combined-ca-bundle\") pod \"barbican-api-84464996cb-fhnvz\" (UID: \"10c1a487-1a74-4994-9b39-f05cbe0fa5c7\") " pod="openstack/barbican-api-84464996cb-fhnvz" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.546705 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/10c1a487-1a74-4994-9b39-f05cbe0fa5c7-internal-tls-certs\") pod \"barbican-api-84464996cb-fhnvz\" (UID: \"10c1a487-1a74-4994-9b39-f05cbe0fa5c7\") " pod="openstack/barbican-api-84464996cb-fhnvz" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.546727 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/10c1a487-1a74-4994-9b39-f05cbe0fa5c7-public-tls-certs\") pod \"barbican-api-84464996cb-fhnvz\" (UID: \"10c1a487-1a74-4994-9b39-f05cbe0fa5c7\") " pod="openstack/barbican-api-84464996cb-fhnvz" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.546745 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/10c1a487-1a74-4994-9b39-f05cbe0fa5c7-config-data-custom\") pod \"barbican-api-84464996cb-fhnvz\" (UID: \"10c1a487-1a74-4994-9b39-f05cbe0fa5c7\") " pod="openstack/barbican-api-84464996cb-fhnvz" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.546779 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10c1a487-1a74-4994-9b39-f05cbe0fa5c7-config-data\") pod \"barbican-api-84464996cb-fhnvz\" (UID: \"10c1a487-1a74-4994-9b39-f05cbe0fa5c7\") " pod="openstack/barbican-api-84464996cb-fhnvz" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.546807 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10c1a487-1a74-4994-9b39-f05cbe0fa5c7-logs\") pod \"barbican-api-84464996cb-fhnvz\" (UID: \"10c1a487-1a74-4994-9b39-f05cbe0fa5c7\") " pod="openstack/barbican-api-84464996cb-fhnvz" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.546910 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a" (UID: "ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.557923 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a" (UID: "ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.558115 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a-logs" (OuterVolumeSpecName: "logs") pod "ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a" (UID: "ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.560955 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a-kube-api-access-fpkv2" (OuterVolumeSpecName: "kube-api-access-fpkv2") pod "ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a" (UID: "ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a"). InnerVolumeSpecName "kube-api-access-fpkv2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.572036 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a-scripts" (OuterVolumeSpecName: "scripts") pod "ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a" (UID: "ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.606923 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a" (UID: "ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.621142 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a-config-data" (OuterVolumeSpecName: "config-data") pod "ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a" (UID: "ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.647300 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10c1a487-1a74-4994-9b39-f05cbe0fa5c7-combined-ca-bundle\") pod \"barbican-api-84464996cb-fhnvz\" (UID: \"10c1a487-1a74-4994-9b39-f05cbe0fa5c7\") " pod="openstack/barbican-api-84464996cb-fhnvz" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.647367 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/10c1a487-1a74-4994-9b39-f05cbe0fa5c7-internal-tls-certs\") pod \"barbican-api-84464996cb-fhnvz\" (UID: \"10c1a487-1a74-4994-9b39-f05cbe0fa5c7\") " pod="openstack/barbican-api-84464996cb-fhnvz" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.647388 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/10c1a487-1a74-4994-9b39-f05cbe0fa5c7-public-tls-certs\") pod \"barbican-api-84464996cb-fhnvz\" (UID: \"10c1a487-1a74-4994-9b39-f05cbe0fa5c7\") " pod="openstack/barbican-api-84464996cb-fhnvz" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.647405 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/10c1a487-1a74-4994-9b39-f05cbe0fa5c7-config-data-custom\") pod \"barbican-api-84464996cb-fhnvz\" (UID: \"10c1a487-1a74-4994-9b39-f05cbe0fa5c7\") " pod="openstack/barbican-api-84464996cb-fhnvz" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.647438 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10c1a487-1a74-4994-9b39-f05cbe0fa5c7-config-data\") pod \"barbican-api-84464996cb-fhnvz\" (UID: \"10c1a487-1a74-4994-9b39-f05cbe0fa5c7\") " pod="openstack/barbican-api-84464996cb-fhnvz" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.647466 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10c1a487-1a74-4994-9b39-f05cbe0fa5c7-logs\") pod 
\"barbican-api-84464996cb-fhnvz\" (UID: \"10c1a487-1a74-4994-9b39-f05cbe0fa5c7\") " pod="openstack/barbican-api-84464996cb-fhnvz" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.647523 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zw2np\" (UniqueName: \"kubernetes.io/projected/10c1a487-1a74-4994-9b39-f05cbe0fa5c7-kube-api-access-zw2np\") pod \"barbican-api-84464996cb-fhnvz\" (UID: \"10c1a487-1a74-4994-9b39-f05cbe0fa5c7\") " pod="openstack/barbican-api-84464996cb-fhnvz" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.647579 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a-logs\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.647608 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.647619 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.647629 4962 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.647638 4962 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.647646 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.647654 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fpkv2\" (UniqueName: \"kubernetes.io/projected/ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a-kube-api-access-fpkv2\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.648370 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10c1a487-1a74-4994-9b39-f05cbe0fa5c7-logs\") pod \"barbican-api-84464996cb-fhnvz\" (UID: \"10c1a487-1a74-4994-9b39-f05cbe0fa5c7\") " pod="openstack/barbican-api-84464996cb-fhnvz" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.652326 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/10c1a487-1a74-4994-9b39-f05cbe0fa5c7-internal-tls-certs\") pod \"barbican-api-84464996cb-fhnvz\" (UID: \"10c1a487-1a74-4994-9b39-f05cbe0fa5c7\") " pod="openstack/barbican-api-84464996cb-fhnvz" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.652464 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/10c1a487-1a74-4994-9b39-f05cbe0fa5c7-public-tls-certs\") pod \"barbican-api-84464996cb-fhnvz\" (UID: \"10c1a487-1a74-4994-9b39-f05cbe0fa5c7\") " pod="openstack/barbican-api-84464996cb-fhnvz" Feb 20 10:15:03 crc 
kubenswrapper[4962]: I0220 10:15:03.652497 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/10c1a487-1a74-4994-9b39-f05cbe0fa5c7-config-data-custom\") pod \"barbican-api-84464996cb-fhnvz\" (UID: \"10c1a487-1a74-4994-9b39-f05cbe0fa5c7\") " pod="openstack/barbican-api-84464996cb-fhnvz" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.652681 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10c1a487-1a74-4994-9b39-f05cbe0fa5c7-config-data\") pod \"barbican-api-84464996cb-fhnvz\" (UID: \"10c1a487-1a74-4994-9b39-f05cbe0fa5c7\") " pod="openstack/barbican-api-84464996cb-fhnvz" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.653480 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10c1a487-1a74-4994-9b39-f05cbe0fa5c7-combined-ca-bundle\") pod \"barbican-api-84464996cb-fhnvz\" (UID: \"10c1a487-1a74-4994-9b39-f05cbe0fa5c7\") " pod="openstack/barbican-api-84464996cb-fhnvz" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.665710 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zw2np\" (UniqueName: \"kubernetes.io/projected/10c1a487-1a74-4994-9b39-f05cbe0fa5c7-kube-api-access-zw2np\") pod \"barbican-api-84464996cb-fhnvz\" (UID: \"10c1a487-1a74-4994-9b39-f05cbe0fa5c7\") " pod="openstack/barbican-api-84464996cb-fhnvz" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.667041 4962 generic.go:334] "Generic (PLEG): container finished" podID="ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a" containerID="e1875a294970ba9c5eec28ab756d0f2eb5f8d71368f1ffb2266761ff9bd0fad5" exitCode=0 Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.667069 4962 generic.go:334] "Generic (PLEG): container finished" podID="ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a" containerID="9b384b310279a03d71974057d02907624e67626affe4012340729465b691bd7f" exitCode=143 Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.667303 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.667680 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a","Type":"ContainerDied","Data":"e1875a294970ba9c5eec28ab756d0f2eb5f8d71368f1ffb2266761ff9bd0fad5"} Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.667734 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a","Type":"ContainerDied","Data":"9b384b310279a03d71974057d02907624e67626affe4012340729465b691bd7f"} Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.667748 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a","Type":"ContainerDied","Data":"5638e382b35ddc2f5cb2cd42c5a2bc839053a009a82ec0299379e273d8965fd5"} Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.667767 4962 scope.go:117] "RemoveContainer" containerID="e1875a294970ba9c5eec28ab756d0f2eb5f8d71368f1ffb2266761ff9bd0fad5" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.706382 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.712760 4962 scope.go:117] "RemoveContainer" containerID="9b384b310279a03d71974057d02907624e67626affe4012340729465b691bd7f" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.716858 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.731232 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 20 10:15:03 crc kubenswrapper[4962]: E0220 10:15:03.731632 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a" containerName="cinder-api" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.731649 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a" containerName="cinder-api" Feb 20 10:15:03 crc kubenswrapper[4962]: E0220 10:15:03.731710 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a" containerName="cinder-api-log" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.731720 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a" containerName="cinder-api-log" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.741711 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a" containerName="cinder-api-log" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.741767 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a" containerName="cinder-api" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.748791 4962 scope.go:117] "RemoveContainer" containerID="e1875a294970ba9c5eec28ab756d0f2eb5f8d71368f1ffb2266761ff9bd0fad5" Feb 20 10:15:03 crc kubenswrapper[4962]: E0220 10:15:03.749669 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1875a294970ba9c5eec28ab756d0f2eb5f8d71368f1ffb2266761ff9bd0fad5\": container with ID starting with e1875a294970ba9c5eec28ab756d0f2eb5f8d71368f1ffb2266761ff9bd0fad5 not found: ID does not exist" 
containerID="e1875a294970ba9c5eec28ab756d0f2eb5f8d71368f1ffb2266761ff9bd0fad5" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.749706 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1875a294970ba9c5eec28ab756d0f2eb5f8d71368f1ffb2266761ff9bd0fad5"} err="failed to get container status \"e1875a294970ba9c5eec28ab756d0f2eb5f8d71368f1ffb2266761ff9bd0fad5\": rpc error: code = NotFound desc = could not find container \"e1875a294970ba9c5eec28ab756d0f2eb5f8d71368f1ffb2266761ff9bd0fad5\": container with ID starting with e1875a294970ba9c5eec28ab756d0f2eb5f8d71368f1ffb2266761ff9bd0fad5 not found: ID does not exist" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.749728 4962 scope.go:117] "RemoveContainer" containerID="9b384b310279a03d71974057d02907624e67626affe4012340729465b691bd7f" Feb 20 10:15:03 crc kubenswrapper[4962]: E0220 10:15:03.750422 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b384b310279a03d71974057d02907624e67626affe4012340729465b691bd7f\": container with ID starting with 9b384b310279a03d71974057d02907624e67626affe4012340729465b691bd7f not found: ID does not exist" containerID="9b384b310279a03d71974057d02907624e67626affe4012340729465b691bd7f" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.750454 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b384b310279a03d71974057d02907624e67626affe4012340729465b691bd7f"} err="failed to get container status \"9b384b310279a03d71974057d02907624e67626affe4012340729465b691bd7f\": rpc error: code = NotFound desc = could not find container \"9b384b310279a03d71974057d02907624e67626affe4012340729465b691bd7f\": container with ID starting with 9b384b310279a03d71974057d02907624e67626affe4012340729465b691bd7f not found: ID does not exist" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.750473 4962 scope.go:117] "RemoveContainer" containerID="e1875a294970ba9c5eec28ab756d0f2eb5f8d71368f1ffb2266761ff9bd0fad5" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.750830 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1875a294970ba9c5eec28ab756d0f2eb5f8d71368f1ffb2266761ff9bd0fad5"} err="failed to get container status \"e1875a294970ba9c5eec28ab756d0f2eb5f8d71368f1ffb2266761ff9bd0fad5\": rpc error: code = NotFound desc = could not find container \"e1875a294970ba9c5eec28ab756d0f2eb5f8d71368f1ffb2266761ff9bd0fad5\": container with ID starting with e1875a294970ba9c5eec28ab756d0f2eb5f8d71368f1ffb2266761ff9bd0fad5 not found: ID does not exist" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.750853 4962 scope.go:117] "RemoveContainer" containerID="9b384b310279a03d71974057d02907624e67626affe4012340729465b691bd7f" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.751824 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b384b310279a03d71974057d02907624e67626affe4012340729465b691bd7f"} err="failed to get container status \"9b384b310279a03d71974057d02907624e67626affe4012340729465b691bd7f\": rpc error: code = NotFound desc = could not find container \"9b384b310279a03d71974057d02907624e67626affe4012340729465b691bd7f\": container with ID starting with 9b384b310279a03d71974057d02907624e67626affe4012340729465b691bd7f not found: ID does not exist" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.752551 4962 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.752706 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.758171 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.758292 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.758293 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.815494 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-84464996cb-fhnvz" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.858774 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-config-data\") pod \"cinder-api-0\" (UID: \"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172\") " pod="openstack/cinder-api-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.858861 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvdkr\" (UniqueName: \"kubernetes.io/projected/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-kube-api-access-gvdkr\") pod \"cinder-api-0\" (UID: \"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172\") " pod="openstack/cinder-api-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.858909 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-logs\") pod \"cinder-api-0\" (UID: \"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172\") " pod="openstack/cinder-api-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.858952 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172\") " pod="openstack/cinder-api-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.858992 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172\") " pod="openstack/cinder-api-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.859035 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-etc-machine-id\") pod \"cinder-api-0\" (UID: \"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172\") " pod="openstack/cinder-api-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.859075 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-public-tls-certs\") pod \"cinder-api-0\" (UID: \"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172\") " pod="openstack/cinder-api-0" Feb 20 10:15:03 
crc kubenswrapper[4962]: I0220 10:15:03.859097 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-config-data-custom\") pod \"cinder-api-0\" (UID: \"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172\") " pod="openstack/cinder-api-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.859152 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-scripts\") pod \"cinder-api-0\" (UID: \"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172\") " pod="openstack/cinder-api-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.929944 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 20 10:15:03 crc kubenswrapper[4962]: W0220 10:15:03.962742 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod18d02cdb_5de5_457e_9f17_1cc3ba51ca55.slice/crio-26bc9af9660938fb18d75442b06d47f88d7b3c3c743cee58b4e39774f040ef5e WatchSource:0}: Error finding container 26bc9af9660938fb18d75442b06d47f88d7b3c3c743cee58b4e39774f040ef5e: Status 404 returned error can't find the container with id 26bc9af9660938fb18d75442b06d47f88d7b3c3c743cee58b4e39774f040ef5e Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.964453 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvdkr\" (UniqueName: \"kubernetes.io/projected/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-kube-api-access-gvdkr\") pod \"cinder-api-0\" (UID: \"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172\") " pod="openstack/cinder-api-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.964524 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-logs\") pod \"cinder-api-0\" (UID: \"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172\") " pod="openstack/cinder-api-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.964575 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172\") " pod="openstack/cinder-api-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.964626 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172\") " pod="openstack/cinder-api-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.964671 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-etc-machine-id\") pod \"cinder-api-0\" (UID: \"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172\") " pod="openstack/cinder-api-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.964714 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-public-tls-certs\") pod \"cinder-api-0\" (UID: \"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172\") " 
pod="openstack/cinder-api-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.964733 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-config-data-custom\") pod \"cinder-api-0\" (UID: \"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172\") " pod="openstack/cinder-api-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.964783 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-scripts\") pod \"cinder-api-0\" (UID: \"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172\") " pod="openstack/cinder-api-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.964817 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-config-data\") pod \"cinder-api-0\" (UID: \"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172\") " pod="openstack/cinder-api-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.967377 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-etc-machine-id\") pod \"cinder-api-0\" (UID: \"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172\") " pod="openstack/cinder-api-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.967787 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-logs\") pod \"cinder-api-0\" (UID: \"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172\") " pod="openstack/cinder-api-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.974054 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-config-data\") pod \"cinder-api-0\" (UID: \"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172\") " pod="openstack/cinder-api-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.974176 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-public-tls-certs\") pod \"cinder-api-0\" (UID: \"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172\") " pod="openstack/cinder-api-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.976087 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172\") " pod="openstack/cinder-api-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.976727 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-config-data-custom\") pod \"cinder-api-0\" (UID: \"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172\") " pod="openstack/cinder-api-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.978003 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-scripts\") pod \"cinder-api-0\" (UID: \"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172\") " pod="openstack/cinder-api-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.981477 4962 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172\") " pod="openstack/cinder-api-0" Feb 20 10:15:03 crc kubenswrapper[4962]: I0220 10:15:03.985844 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvdkr\" (UniqueName: \"kubernetes.io/projected/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-kube-api-access-gvdkr\") pod \"cinder-api-0\" (UID: \"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172\") " pod="openstack/cinder-api-0" Feb 20 10:15:04 crc kubenswrapper[4962]: I0220 10:15:04.083283 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526375-bzqgc" Feb 20 10:15:04 crc kubenswrapper[4962]: I0220 10:15:04.143858 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 20 10:15:04 crc kubenswrapper[4962]: I0220 10:15:04.168086 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fc9c6a80-7747-461e-8f29-f371984a8c95-config-volume\") pod \"fc9c6a80-7747-461e-8f29-f371984a8c95\" (UID: \"fc9c6a80-7747-461e-8f29-f371984a8c95\") " Feb 20 10:15:04 crc kubenswrapper[4962]: I0220 10:15:04.168153 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lnnhk\" (UniqueName: \"kubernetes.io/projected/fc9c6a80-7747-461e-8f29-f371984a8c95-kube-api-access-lnnhk\") pod \"fc9c6a80-7747-461e-8f29-f371984a8c95\" (UID: \"fc9c6a80-7747-461e-8f29-f371984a8c95\") " Feb 20 10:15:04 crc kubenswrapper[4962]: I0220 10:15:04.168253 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fc9c6a80-7747-461e-8f29-f371984a8c95-secret-volume\") pod \"fc9c6a80-7747-461e-8f29-f371984a8c95\" (UID: \"fc9c6a80-7747-461e-8f29-f371984a8c95\") " Feb 20 10:15:04 crc kubenswrapper[4962]: I0220 10:15:04.169516 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc9c6a80-7747-461e-8f29-f371984a8c95-config-volume" (OuterVolumeSpecName: "config-volume") pod "fc9c6a80-7747-461e-8f29-f371984a8c95" (UID: "fc9c6a80-7747-461e-8f29-f371984a8c95"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:15:04 crc kubenswrapper[4962]: I0220 10:15:04.173575 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc9c6a80-7747-461e-8f29-f371984a8c95-kube-api-access-lnnhk" (OuterVolumeSpecName: "kube-api-access-lnnhk") pod "fc9c6a80-7747-461e-8f29-f371984a8c95" (UID: "fc9c6a80-7747-461e-8f29-f371984a8c95"). InnerVolumeSpecName "kube-api-access-lnnhk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:15:04 crc kubenswrapper[4962]: I0220 10:15:04.174496 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc9c6a80-7747-461e-8f29-f371984a8c95-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "fc9c6a80-7747-461e-8f29-f371984a8c95" (UID: "fc9c6a80-7747-461e-8f29-f371984a8c95"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:15:04 crc kubenswrapper[4962]: I0220 10:15:04.273391 4962 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fc9c6a80-7747-461e-8f29-f371984a8c95-config-volume\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:04 crc kubenswrapper[4962]: I0220 10:15:04.273438 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lnnhk\" (UniqueName: \"kubernetes.io/projected/fc9c6a80-7747-461e-8f29-f371984a8c95-kube-api-access-lnnhk\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:04 crc kubenswrapper[4962]: I0220 10:15:04.273455 4962 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fc9c6a80-7747-461e-8f29-f371984a8c95-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:04 crc kubenswrapper[4962]: I0220 10:15:04.351896 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-84464996cb-fhnvz"] Feb 20 10:15:04 crc kubenswrapper[4962]: W0220 10:15:04.357547 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod10c1a487_1a74_4994_9b39_f05cbe0fa5c7.slice/crio-3a9d85e1ad92d2243530d4e2efdb0f3c712197cf6ab61af23aeb5feca6269a13 WatchSource:0}: Error finding container 3a9d85e1ad92d2243530d4e2efdb0f3c712197cf6ab61af23aeb5feca6269a13: Status 404 returned error can't find the container with id 3a9d85e1ad92d2243530d4e2efdb0f3c712197cf6ab61af23aeb5feca6269a13 Feb 20 10:15:04 crc kubenswrapper[4962]: I0220 10:15:04.627540 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 20 10:15:04 crc kubenswrapper[4962]: W0220 10:15:04.676943 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod89dbdc4c_bf31_402e_b5bf_e8bbb8c16172.slice/crio-6493293a11e7a20494076438c227d47d6ea680b9e8bbd314969ad609945e742d WatchSource:0}: Error finding container 6493293a11e7a20494076438c227d47d6ea680b9e8bbd314969ad609945e742d: Status 404 returned error can't find the container with id 6493293a11e7a20494076438c227d47d6ea680b9e8bbd314969ad609945e742d Feb 20 10:15:04 crc kubenswrapper[4962]: I0220 10:15:04.705700 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-84464996cb-fhnvz" event={"ID":"10c1a487-1a74-4994-9b39-f05cbe0fa5c7","Type":"ContainerStarted","Data":"a294bb381baa23da8817ec86f599da2c728e47f08a22b5ce88cb75ec5dd531c2"} Feb 20 10:15:04 crc kubenswrapper[4962]: I0220 10:15:04.705754 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-84464996cb-fhnvz" event={"ID":"10c1a487-1a74-4994-9b39-f05cbe0fa5c7","Type":"ContainerStarted","Data":"3a9d85e1ad92d2243530d4e2efdb0f3c712197cf6ab61af23aeb5feca6269a13"} Feb 20 10:15:04 crc kubenswrapper[4962]: I0220 10:15:04.713379 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"18d02cdb-5de5-457e-9f17-1cc3ba51ca55","Type":"ContainerStarted","Data":"eadd6481e402308012bf3cb666163e04ad34b5f97c3780834cf912ecffa0bc84"} Feb 20 10:15:04 crc kubenswrapper[4962]: I0220 10:15:04.713410 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"18d02cdb-5de5-457e-9f17-1cc3ba51ca55","Type":"ContainerStarted","Data":"26bc9af9660938fb18d75442b06d47f88d7b3c3c743cee58b4e39774f040ef5e"} Feb 20 10:15:04 crc kubenswrapper[4962]: I0220 10:15:04.718917 
4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526375-bzqgc" event={"ID":"fc9c6a80-7747-461e-8f29-f371984a8c95","Type":"ContainerDied","Data":"2119c2480595ca04113ba775f3754c44199c13718d123a19bb5c127f2e2e0ad2"} Feb 20 10:15:04 crc kubenswrapper[4962]: I0220 10:15:04.718967 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2119c2480595ca04113ba775f3754c44199c13718d123a19bb5c127f2e2e0ad2" Feb 20 10:15:04 crc kubenswrapper[4962]: I0220 10:15:04.719029 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526375-bzqgc" Feb 20 10:15:05 crc kubenswrapper[4962]: I0220 10:15:05.161275 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a" path="/var/lib/kubelet/pods/ad24b407-dbbd-4177-9d4f-1f02ef5f7a4a/volumes" Feb 20 10:15:05 crc kubenswrapper[4962]: I0220 10:15:05.275119 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-ffdf447d4-qtmvr" Feb 20 10:15:05 crc kubenswrapper[4962]: I0220 10:15:05.719228 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-747dfbc745-ndpzt"] Feb 20 10:15:05 crc kubenswrapper[4962]: I0220 10:15:05.720075 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-747dfbc745-ndpzt" podUID="4839dc9e-3bbd-48e3-b839-40929e67ce7a" containerName="neutron-api" containerID="cri-o://ef8879302adbb806976eb02b6043e51e7d9091d75bf4488fcea19123656b2441" gracePeriod=30 Feb 20 10:15:05 crc kubenswrapper[4962]: I0220 10:15:05.721152 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-747dfbc745-ndpzt" podUID="4839dc9e-3bbd-48e3-b839-40929e67ce7a" containerName="neutron-httpd" containerID="cri-o://bae43663ef81835d7b00f29e6ed99c794aa21733e839a0ebe0e90aee6573888f" gracePeriod=30 Feb 20 10:15:05 crc kubenswrapper[4962]: I0220 10:15:05.729064 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-747dfbc745-ndpzt" podUID="4839dc9e-3bbd-48e3-b839-40929e67ce7a" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.152:9696/\": EOF" Feb 20 10:15:05 crc kubenswrapper[4962]: I0220 10:15:05.744683 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5dfd6b5f7f-dkfsl"] Feb 20 10:15:05 crc kubenswrapper[4962]: E0220 10:15:05.745253 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc9c6a80-7747-461e-8f29-f371984a8c95" containerName="collect-profiles" Feb 20 10:15:05 crc kubenswrapper[4962]: I0220 10:15:05.745275 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc9c6a80-7747-461e-8f29-f371984a8c95" containerName="collect-profiles" Feb 20 10:15:05 crc kubenswrapper[4962]: I0220 10:15:05.745471 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc9c6a80-7747-461e-8f29-f371984a8c95" containerName="collect-profiles" Feb 20 10:15:05 crc kubenswrapper[4962]: I0220 10:15:05.746815 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5dfd6b5f7f-dkfsl" Feb 20 10:15:05 crc kubenswrapper[4962]: I0220 10:15:05.752697 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5dfd6b5f7f-dkfsl"] Feb 20 10:15:05 crc kubenswrapper[4962]: I0220 10:15:05.787690 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"18d02cdb-5de5-457e-9f17-1cc3ba51ca55","Type":"ContainerStarted","Data":"884bc1c8181b4c262029886834ab81b7b4d9ec6c2cb240fa256637239957413f"} Feb 20 10:15:05 crc kubenswrapper[4962]: I0220 10:15:05.801013 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172","Type":"ContainerStarted","Data":"aa52f40e409ac825205d183f70f7cf56df81e106f777a2fe46a3166fb938361b"} Feb 20 10:15:05 crc kubenswrapper[4962]: I0220 10:15:05.801068 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172","Type":"ContainerStarted","Data":"6493293a11e7a20494076438c227d47d6ea680b9e8bbd314969ad609945e742d"} Feb 20 10:15:05 crc kubenswrapper[4962]: I0220 10:15:05.815142 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-84464996cb-fhnvz" event={"ID":"10c1a487-1a74-4994-9b39-f05cbe0fa5c7","Type":"ContainerStarted","Data":"e01ecca1dc871afc69109d7af822c9e7b8f02440c8c8b1e92b5ae942c411e515"} Feb 20 10:15:05 crc kubenswrapper[4962]: I0220 10:15:05.815624 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-84464996cb-fhnvz" Feb 20 10:15:05 crc kubenswrapper[4962]: I0220 10:15:05.815710 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-84464996cb-fhnvz" Feb 20 10:15:05 crc kubenswrapper[4962]: I0220 10:15:05.855460 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-84464996cb-fhnvz" podStartSLOduration=2.855440544 podStartE2EDuration="2.855440544s" podCreationTimestamp="2026-02-20 10:15:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:15:05.844361419 +0000 UTC m=+1197.426833265" watchObservedRunningTime="2026-02-20 10:15:05.855440544 +0000 UTC m=+1197.437912390" Feb 20 10:15:05 crc kubenswrapper[4962]: I0220 10:15:05.920362 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3-ovndb-tls-certs\") pod \"neutron-5dfd6b5f7f-dkfsl\" (UID: \"a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3\") " pod="openstack/neutron-5dfd6b5f7f-dkfsl" Feb 20 10:15:05 crc kubenswrapper[4962]: I0220 10:15:05.920471 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3-public-tls-certs\") pod \"neutron-5dfd6b5f7f-dkfsl\" (UID: \"a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3\") " pod="openstack/neutron-5dfd6b5f7f-dkfsl" Feb 20 10:15:05 crc kubenswrapper[4962]: I0220 10:15:05.920563 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3-httpd-config\") pod \"neutron-5dfd6b5f7f-dkfsl\" (UID: \"a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3\") " 
pod="openstack/neutron-5dfd6b5f7f-dkfsl" Feb 20 10:15:05 crc kubenswrapper[4962]: I0220 10:15:05.920638 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3-internal-tls-certs\") pod \"neutron-5dfd6b5f7f-dkfsl\" (UID: \"a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3\") " pod="openstack/neutron-5dfd6b5f7f-dkfsl" Feb 20 10:15:05 crc kubenswrapper[4962]: I0220 10:15:05.920664 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3-combined-ca-bundle\") pod \"neutron-5dfd6b5f7f-dkfsl\" (UID: \"a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3\") " pod="openstack/neutron-5dfd6b5f7f-dkfsl" Feb 20 10:15:05 crc kubenswrapper[4962]: I0220 10:15:05.920741 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kdsn\" (UniqueName: \"kubernetes.io/projected/a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3-kube-api-access-6kdsn\") pod \"neutron-5dfd6b5f7f-dkfsl\" (UID: \"a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3\") " pod="openstack/neutron-5dfd6b5f7f-dkfsl" Feb 20 10:15:05 crc kubenswrapper[4962]: I0220 10:15:05.920807 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3-config\") pod \"neutron-5dfd6b5f7f-dkfsl\" (UID: \"a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3\") " pod="openstack/neutron-5dfd6b5f7f-dkfsl" Feb 20 10:15:06 crc kubenswrapper[4962]: I0220 10:15:06.023054 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3-public-tls-certs\") pod \"neutron-5dfd6b5f7f-dkfsl\" (UID: \"a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3\") " pod="openstack/neutron-5dfd6b5f7f-dkfsl" Feb 20 10:15:06 crc kubenswrapper[4962]: I0220 10:15:06.023139 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3-httpd-config\") pod \"neutron-5dfd6b5f7f-dkfsl\" (UID: \"a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3\") " pod="openstack/neutron-5dfd6b5f7f-dkfsl" Feb 20 10:15:06 crc kubenswrapper[4962]: I0220 10:15:06.023175 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3-internal-tls-certs\") pod \"neutron-5dfd6b5f7f-dkfsl\" (UID: \"a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3\") " pod="openstack/neutron-5dfd6b5f7f-dkfsl" Feb 20 10:15:06 crc kubenswrapper[4962]: I0220 10:15:06.023192 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3-combined-ca-bundle\") pod \"neutron-5dfd6b5f7f-dkfsl\" (UID: \"a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3\") " pod="openstack/neutron-5dfd6b5f7f-dkfsl" Feb 20 10:15:06 crc kubenswrapper[4962]: I0220 10:15:06.023239 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kdsn\" (UniqueName: \"kubernetes.io/projected/a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3-kube-api-access-6kdsn\") pod \"neutron-5dfd6b5f7f-dkfsl\" (UID: \"a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3\") " 
pod="openstack/neutron-5dfd6b5f7f-dkfsl" Feb 20 10:15:06 crc kubenswrapper[4962]: I0220 10:15:06.023264 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3-config\") pod \"neutron-5dfd6b5f7f-dkfsl\" (UID: \"a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3\") " pod="openstack/neutron-5dfd6b5f7f-dkfsl" Feb 20 10:15:06 crc kubenswrapper[4962]: I0220 10:15:06.023303 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3-ovndb-tls-certs\") pod \"neutron-5dfd6b5f7f-dkfsl\" (UID: \"a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3\") " pod="openstack/neutron-5dfd6b5f7f-dkfsl" Feb 20 10:15:06 crc kubenswrapper[4962]: I0220 10:15:06.029484 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3-internal-tls-certs\") pod \"neutron-5dfd6b5f7f-dkfsl\" (UID: \"a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3\") " pod="openstack/neutron-5dfd6b5f7f-dkfsl" Feb 20 10:15:06 crc kubenswrapper[4962]: I0220 10:15:06.030785 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3-httpd-config\") pod \"neutron-5dfd6b5f7f-dkfsl\" (UID: \"a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3\") " pod="openstack/neutron-5dfd6b5f7f-dkfsl" Feb 20 10:15:06 crc kubenswrapper[4962]: I0220 10:15:06.031088 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3-ovndb-tls-certs\") pod \"neutron-5dfd6b5f7f-dkfsl\" (UID: \"a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3\") " pod="openstack/neutron-5dfd6b5f7f-dkfsl" Feb 20 10:15:06 crc kubenswrapper[4962]: I0220 10:15:06.031276 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3-public-tls-certs\") pod \"neutron-5dfd6b5f7f-dkfsl\" (UID: \"a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3\") " pod="openstack/neutron-5dfd6b5f7f-dkfsl" Feb 20 10:15:06 crc kubenswrapper[4962]: I0220 10:15:06.034155 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3-config\") pod \"neutron-5dfd6b5f7f-dkfsl\" (UID: \"a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3\") " pod="openstack/neutron-5dfd6b5f7f-dkfsl" Feb 20 10:15:06 crc kubenswrapper[4962]: I0220 10:15:06.049471 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kdsn\" (UniqueName: \"kubernetes.io/projected/a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3-kube-api-access-6kdsn\") pod \"neutron-5dfd6b5f7f-dkfsl\" (UID: \"a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3\") " pod="openstack/neutron-5dfd6b5f7f-dkfsl" Feb 20 10:15:06 crc kubenswrapper[4962]: I0220 10:15:06.056178 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3-combined-ca-bundle\") pod \"neutron-5dfd6b5f7f-dkfsl\" (UID: \"a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3\") " pod="openstack/neutron-5dfd6b5f7f-dkfsl" Feb 20 10:15:06 crc kubenswrapper[4962]: I0220 10:15:06.146007 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5dfd6b5f7f-dkfsl" Feb 20 10:15:06 crc kubenswrapper[4962]: I0220 10:15:06.827206 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"18d02cdb-5de5-457e-9f17-1cc3ba51ca55","Type":"ContainerStarted","Data":"3dfd1ca1a6fa866b05ee6afe9233f010951bb6a3a4b6711a05320da1e6e882be"} Feb 20 10:15:06 crc kubenswrapper[4962]: I0220 10:15:06.831394 4962 generic.go:334] "Generic (PLEG): container finished" podID="4839dc9e-3bbd-48e3-b839-40929e67ce7a" containerID="bae43663ef81835d7b00f29e6ed99c794aa21733e839a0ebe0e90aee6573888f" exitCode=0 Feb 20 10:15:06 crc kubenswrapper[4962]: I0220 10:15:06.831489 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-747dfbc745-ndpzt" event={"ID":"4839dc9e-3bbd-48e3-b839-40929e67ce7a","Type":"ContainerDied","Data":"bae43663ef81835d7b00f29e6ed99c794aa21733e839a0ebe0e90aee6573888f"} Feb 20 10:15:06 crc kubenswrapper[4962]: I0220 10:15:06.850484 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172","Type":"ContainerStarted","Data":"7c19f6ab819e8b088592bd7831817812900bca1c0cc3649a9662bfcc1aa1ae48"} Feb 20 10:15:06 crc kubenswrapper[4962]: I0220 10:15:06.897683 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.897661221 podStartE2EDuration="3.897661221s" podCreationTimestamp="2026-02-20 10:15:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:15:06.872638813 +0000 UTC m=+1198.455110669" watchObservedRunningTime="2026-02-20 10:15:06.897661221 +0000 UTC m=+1198.480133067" Feb 20 10:15:06 crc kubenswrapper[4962]: W0220 10:15:06.983556 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda6ce3f9c_b8d2_4c53_a494_3aa01ec4f9b3.slice/crio-4d1717c6f2d95b6886c02fd175b76f1f1a5915a4672d75c1da15401a0d992411 WatchSource:0}: Error finding container 4d1717c6f2d95b6886c02fd175b76f1f1a5915a4672d75c1da15401a0d992411: Status 404 returned error can't find the container with id 4d1717c6f2d95b6886c02fd175b76f1f1a5915a4672d75c1da15401a0d992411 Feb 20 10:15:06 crc kubenswrapper[4962]: I0220 10:15:06.986448 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5dfd6b5f7f-dkfsl"] Feb 20 10:15:07 crc kubenswrapper[4962]: I0220 10:15:07.022633 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 20 10:15:07 crc kubenswrapper[4962]: I0220 10:15:07.287631 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 20 10:15:07 crc kubenswrapper[4962]: I0220 10:15:07.729936 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6c69c79c7f-fkwk8" Feb 20 10:15:07 crc kubenswrapper[4962]: I0220 10:15:07.800703 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77d55b9c69-9hhv4"] Feb 20 10:15:07 crc kubenswrapper[4962]: I0220 10:15:07.801023 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-77d55b9c69-9hhv4" podUID="54637a9a-7f3e-439e-adf0-ba5b33a539d3" containerName="dnsmasq-dns" containerID="cri-o://04e63c3619f61031cb4ae8e56eba07f9d2e30a9dee2c65fe9e821777ffa0e563" gracePeriod=10 Feb 20 10:15:07 crc 
kubenswrapper[4962]: I0220 10:15:07.878103 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5dfd6b5f7f-dkfsl" event={"ID":"a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3","Type":"ContainerStarted","Data":"731c2e1dae94781e12c80ac05ffd0b3634529739ec574c2b3459d53ff4dd175f"} Feb 20 10:15:07 crc kubenswrapper[4962]: I0220 10:15:07.878254 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5dfd6b5f7f-dkfsl" event={"ID":"a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3","Type":"ContainerStarted","Data":"45e715a9f15469232fd9eda659480065c452b6d474e0d50459f16eb16fcf18e3"} Feb 20 10:15:07 crc kubenswrapper[4962]: I0220 10:15:07.878289 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5dfd6b5f7f-dkfsl" event={"ID":"a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3","Type":"ContainerStarted","Data":"4d1717c6f2d95b6886c02fd175b76f1f1a5915a4672d75c1da15401a0d992411"} Feb 20 10:15:07 crc kubenswrapper[4962]: I0220 10:15:07.878424 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5dfd6b5f7f-dkfsl" Feb 20 10:15:07 crc kubenswrapper[4962]: I0220 10:15:07.878903 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 20 10:15:07 crc kubenswrapper[4962]: I0220 10:15:07.934031 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5dfd6b5f7f-dkfsl" podStartSLOduration=2.934014337 podStartE2EDuration="2.934014337s" podCreationTimestamp="2026-02-20 10:15:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:15:07.928420883 +0000 UTC m=+1199.510892729" watchObservedRunningTime="2026-02-20 10:15:07.934014337 +0000 UTC m=+1199.516486183" Feb 20 10:15:07 crc kubenswrapper[4962]: I0220 10:15:07.935760 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 20 10:15:08 crc kubenswrapper[4962]: I0220 10:15:08.421971 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77d55b9c69-9hhv4" Feb 20 10:15:08 crc kubenswrapper[4962]: I0220 10:15:08.589145 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/54637a9a-7f3e-439e-adf0-ba5b33a539d3-dns-svc\") pod \"54637a9a-7f3e-439e-adf0-ba5b33a539d3\" (UID: \"54637a9a-7f3e-439e-adf0-ba5b33a539d3\") " Feb 20 10:15:08 crc kubenswrapper[4962]: I0220 10:15:08.589256 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54637a9a-7f3e-439e-adf0-ba5b33a539d3-config\") pod \"54637a9a-7f3e-439e-adf0-ba5b33a539d3\" (UID: \"54637a9a-7f3e-439e-adf0-ba5b33a539d3\") " Feb 20 10:15:08 crc kubenswrapper[4962]: I0220 10:15:08.589315 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/54637a9a-7f3e-439e-adf0-ba5b33a539d3-ovsdbserver-nb\") pod \"54637a9a-7f3e-439e-adf0-ba5b33a539d3\" (UID: \"54637a9a-7f3e-439e-adf0-ba5b33a539d3\") " Feb 20 10:15:08 crc kubenswrapper[4962]: I0220 10:15:08.589459 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/54637a9a-7f3e-439e-adf0-ba5b33a539d3-dns-swift-storage-0\") pod \"54637a9a-7f3e-439e-adf0-ba5b33a539d3\" (UID: \"54637a9a-7f3e-439e-adf0-ba5b33a539d3\") " Feb 20 10:15:08 crc kubenswrapper[4962]: I0220 10:15:08.589609 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/54637a9a-7f3e-439e-adf0-ba5b33a539d3-ovsdbserver-sb\") pod \"54637a9a-7f3e-439e-adf0-ba5b33a539d3\" (UID: \"54637a9a-7f3e-439e-adf0-ba5b33a539d3\") " Feb 20 10:15:08 crc kubenswrapper[4962]: I0220 10:15:08.589725 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w864r\" (UniqueName: \"kubernetes.io/projected/54637a9a-7f3e-439e-adf0-ba5b33a539d3-kube-api-access-w864r\") pod \"54637a9a-7f3e-439e-adf0-ba5b33a539d3\" (UID: \"54637a9a-7f3e-439e-adf0-ba5b33a539d3\") " Feb 20 10:15:08 crc kubenswrapper[4962]: I0220 10:15:08.602160 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54637a9a-7f3e-439e-adf0-ba5b33a539d3-kube-api-access-w864r" (OuterVolumeSpecName: "kube-api-access-w864r") pod "54637a9a-7f3e-439e-adf0-ba5b33a539d3" (UID: "54637a9a-7f3e-439e-adf0-ba5b33a539d3"). InnerVolumeSpecName "kube-api-access-w864r". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:15:08 crc kubenswrapper[4962]: I0220 10:15:08.649484 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54637a9a-7f3e-439e-adf0-ba5b33a539d3-config" (OuterVolumeSpecName: "config") pod "54637a9a-7f3e-439e-adf0-ba5b33a539d3" (UID: "54637a9a-7f3e-439e-adf0-ba5b33a539d3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:15:08 crc kubenswrapper[4962]: I0220 10:15:08.653398 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54637a9a-7f3e-439e-adf0-ba5b33a539d3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "54637a9a-7f3e-439e-adf0-ba5b33a539d3" (UID: "54637a9a-7f3e-439e-adf0-ba5b33a539d3"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:15:08 crc kubenswrapper[4962]: I0220 10:15:08.659743 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54637a9a-7f3e-439e-adf0-ba5b33a539d3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "54637a9a-7f3e-439e-adf0-ba5b33a539d3" (UID: "54637a9a-7f3e-439e-adf0-ba5b33a539d3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:15:08 crc kubenswrapper[4962]: I0220 10:15:08.666373 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54637a9a-7f3e-439e-adf0-ba5b33a539d3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "54637a9a-7f3e-439e-adf0-ba5b33a539d3" (UID: "54637a9a-7f3e-439e-adf0-ba5b33a539d3"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:15:08 crc kubenswrapper[4962]: I0220 10:15:08.667527 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54637a9a-7f3e-439e-adf0-ba5b33a539d3-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "54637a9a-7f3e-439e-adf0-ba5b33a539d3" (UID: "54637a9a-7f3e-439e-adf0-ba5b33a539d3"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:15:08 crc kubenswrapper[4962]: I0220 10:15:08.695316 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w864r\" (UniqueName: \"kubernetes.io/projected/54637a9a-7f3e-439e-adf0-ba5b33a539d3-kube-api-access-w864r\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:08 crc kubenswrapper[4962]: I0220 10:15:08.695361 4962 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/54637a9a-7f3e-439e-adf0-ba5b33a539d3-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:08 crc kubenswrapper[4962]: I0220 10:15:08.695372 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54637a9a-7f3e-439e-adf0-ba5b33a539d3-config\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:08 crc kubenswrapper[4962]: I0220 10:15:08.695384 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/54637a9a-7f3e-439e-adf0-ba5b33a539d3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:08 crc kubenswrapper[4962]: I0220 10:15:08.695398 4962 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/54637a9a-7f3e-439e-adf0-ba5b33a539d3-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:08 crc kubenswrapper[4962]: I0220 10:15:08.695412 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/54637a9a-7f3e-439e-adf0-ba5b33a539d3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:08 crc kubenswrapper[4962]: I0220 10:15:08.735962 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-747dfbc745-ndpzt" podUID="4839dc9e-3bbd-48e3-b839-40929e67ce7a" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.152:9696/\": dial tcp 10.217.0.152:9696: connect: connection refused" Feb 20 10:15:08 crc kubenswrapper[4962]: I0220 10:15:08.890042 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"18d02cdb-5de5-457e-9f17-1cc3ba51ca55","Type":"ContainerStarted","Data":"eb0e72879a25aa9037cdc6292f11db919678280642193cd3d3aba5fbe6ccd606"} Feb 20 10:15:08 crc kubenswrapper[4962]: I0220 10:15:08.890279 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 20 10:15:08 crc kubenswrapper[4962]: I0220 10:15:08.892259 4962 generic.go:334] "Generic (PLEG): container finished" podID="54637a9a-7f3e-439e-adf0-ba5b33a539d3" containerID="04e63c3619f61031cb4ae8e56eba07f9d2e30a9dee2c65fe9e821777ffa0e563" exitCode=0 Feb 20 10:15:08 crc kubenswrapper[4962]: I0220 10:15:08.892321 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77d55b9c69-9hhv4" Feb 20 10:15:08 crc kubenswrapper[4962]: I0220 10:15:08.892381 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77d55b9c69-9hhv4" event={"ID":"54637a9a-7f3e-439e-adf0-ba5b33a539d3","Type":"ContainerDied","Data":"04e63c3619f61031cb4ae8e56eba07f9d2e30a9dee2c65fe9e821777ffa0e563"} Feb 20 10:15:08 crc kubenswrapper[4962]: I0220 10:15:08.892451 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77d55b9c69-9hhv4" event={"ID":"54637a9a-7f3e-439e-adf0-ba5b33a539d3","Type":"ContainerDied","Data":"48cb63ddc063927b98c0acd0bd8342b9608beaa369b2e5045eda23591cd6bfd4"} Feb 20 10:15:08 crc kubenswrapper[4962]: I0220 10:15:08.892490 4962 scope.go:117] "RemoveContainer" containerID="04e63c3619f61031cb4ae8e56eba07f9d2e30a9dee2c65fe9e821777ffa0e563" Feb 20 10:15:08 crc kubenswrapper[4962]: I0220 10:15:08.893906 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="51d56dfc-4e59-4c3d-b26d-a06301f274c8" containerName="cinder-scheduler" containerID="cri-o://f526ad02da5a4878737da129e8a712cb1847bfbea7fa32b3816c5e883fd3a61f" gracePeriod=30 Feb 20 10:15:08 crc kubenswrapper[4962]: I0220 10:15:08.893971 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="51d56dfc-4e59-4c3d-b26d-a06301f274c8" containerName="probe" containerID="cri-o://565233d9beb95735adc25c4668daa315369653b7a2cdf6734ddda82b44ff3501" gracePeriod=30 Feb 20 10:15:08 crc kubenswrapper[4962]: I0220 10:15:08.921802 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.191846969 podStartE2EDuration="6.92177431s" podCreationTimestamp="2026-02-20 10:15:02 +0000 UTC" firstStartedPulling="2026-02-20 10:15:03.966272201 +0000 UTC m=+1195.548744047" lastFinishedPulling="2026-02-20 10:15:07.696199542 +0000 UTC m=+1199.278671388" observedRunningTime="2026-02-20 10:15:08.919037235 +0000 UTC m=+1200.501509091" watchObservedRunningTime="2026-02-20 10:15:08.92177431 +0000 UTC m=+1200.504246156" Feb 20 10:15:08 crc kubenswrapper[4962]: I0220 10:15:08.929657 4962 scope.go:117] "RemoveContainer" containerID="ed98f94bfdead02504eed65ca99332ec23131a2035d01ac3b9510e63c1aa80cd" Feb 20 10:15:08 crc kubenswrapper[4962]: I0220 10:15:08.959635 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77d55b9c69-9hhv4"] Feb 20 10:15:08 crc kubenswrapper[4962]: I0220 10:15:08.970746 4962 scope.go:117] "RemoveContainer" containerID="04e63c3619f61031cb4ae8e56eba07f9d2e30a9dee2c65fe9e821777ffa0e563" Feb 20 10:15:08 crc kubenswrapper[4962]: E0220 10:15:08.971497 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = 
could not find container \"04e63c3619f61031cb4ae8e56eba07f9d2e30a9dee2c65fe9e821777ffa0e563\": container with ID starting with 04e63c3619f61031cb4ae8e56eba07f9d2e30a9dee2c65fe9e821777ffa0e563 not found: ID does not exist" containerID="04e63c3619f61031cb4ae8e56eba07f9d2e30a9dee2c65fe9e821777ffa0e563" Feb 20 10:15:08 crc kubenswrapper[4962]: I0220 10:15:08.971560 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04e63c3619f61031cb4ae8e56eba07f9d2e30a9dee2c65fe9e821777ffa0e563"} err="failed to get container status \"04e63c3619f61031cb4ae8e56eba07f9d2e30a9dee2c65fe9e821777ffa0e563\": rpc error: code = NotFound desc = could not find container \"04e63c3619f61031cb4ae8e56eba07f9d2e30a9dee2c65fe9e821777ffa0e563\": container with ID starting with 04e63c3619f61031cb4ae8e56eba07f9d2e30a9dee2c65fe9e821777ffa0e563 not found: ID does not exist" Feb 20 10:15:08 crc kubenswrapper[4962]: I0220 10:15:08.971652 4962 scope.go:117] "RemoveContainer" containerID="ed98f94bfdead02504eed65ca99332ec23131a2035d01ac3b9510e63c1aa80cd" Feb 20 10:15:08 crc kubenswrapper[4962]: E0220 10:15:08.972206 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed98f94bfdead02504eed65ca99332ec23131a2035d01ac3b9510e63c1aa80cd\": container with ID starting with ed98f94bfdead02504eed65ca99332ec23131a2035d01ac3b9510e63c1aa80cd not found: ID does not exist" containerID="ed98f94bfdead02504eed65ca99332ec23131a2035d01ac3b9510e63c1aa80cd" Feb 20 10:15:08 crc kubenswrapper[4962]: I0220 10:15:08.972230 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed98f94bfdead02504eed65ca99332ec23131a2035d01ac3b9510e63c1aa80cd"} err="failed to get container status \"ed98f94bfdead02504eed65ca99332ec23131a2035d01ac3b9510e63c1aa80cd\": rpc error: code = NotFound desc = could not find container \"ed98f94bfdead02504eed65ca99332ec23131a2035d01ac3b9510e63c1aa80cd\": container with ID starting with ed98f94bfdead02504eed65ca99332ec23131a2035d01ac3b9510e63c1aa80cd not found: ID does not exist" Feb 20 10:15:08 crc kubenswrapper[4962]: I0220 10:15:08.975079 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-77d55b9c69-9hhv4"] Feb 20 10:15:09 crc kubenswrapper[4962]: I0220 10:15:09.179890 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54637a9a-7f3e-439e-adf0-ba5b33a539d3" path="/var/lib/kubelet/pods/54637a9a-7f3e-439e-adf0-ba5b33a539d3/volumes" Feb 20 10:15:09 crc kubenswrapper[4962]: I0220 10:15:09.582036 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-547b9d9588-5gkt7" Feb 20 10:15:09 crc kubenswrapper[4962]: I0220 10:15:09.628190 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-547b9d9588-5gkt7" Feb 20 10:15:10 crc kubenswrapper[4962]: I0220 10:15:10.973435 4962 generic.go:334] "Generic (PLEG): container finished" podID="4839dc9e-3bbd-48e3-b839-40929e67ce7a" containerID="ef8879302adbb806976eb02b6043e51e7d9091d75bf4488fcea19123656b2441" exitCode=0 Feb 20 10:15:10 crc kubenswrapper[4962]: I0220 10:15:10.973628 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-747dfbc745-ndpzt" event={"ID":"4839dc9e-3bbd-48e3-b839-40929e67ce7a","Type":"ContainerDied","Data":"ef8879302adbb806976eb02b6043e51e7d9091d75bf4488fcea19123656b2441"} Feb 20 10:15:10 crc kubenswrapper[4962]: I0220 10:15:10.981823 4962 
generic.go:334] "Generic (PLEG): container finished" podID="51d56dfc-4e59-4c3d-b26d-a06301f274c8" containerID="565233d9beb95735adc25c4668daa315369653b7a2cdf6734ddda82b44ff3501" exitCode=0 Feb 20 10:15:10 crc kubenswrapper[4962]: I0220 10:15:10.981874 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"51d56dfc-4e59-4c3d-b26d-a06301f274c8","Type":"ContainerDied","Data":"565233d9beb95735adc25c4668daa315369653b7a2cdf6734ddda82b44ff3501"} Feb 20 10:15:11 crc kubenswrapper[4962]: I0220 10:15:11.538033 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-747dfbc745-ndpzt" Feb 20 10:15:11 crc kubenswrapper[4962]: I0220 10:15:11.575339 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4839dc9e-3bbd-48e3-b839-40929e67ce7a-public-tls-certs\") pod \"4839dc9e-3bbd-48e3-b839-40929e67ce7a\" (UID: \"4839dc9e-3bbd-48e3-b839-40929e67ce7a\") " Feb 20 10:15:11 crc kubenswrapper[4962]: I0220 10:15:11.575417 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4839dc9e-3bbd-48e3-b839-40929e67ce7a-combined-ca-bundle\") pod \"4839dc9e-3bbd-48e3-b839-40929e67ce7a\" (UID: \"4839dc9e-3bbd-48e3-b839-40929e67ce7a\") " Feb 20 10:15:11 crc kubenswrapper[4962]: I0220 10:15:11.575509 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4839dc9e-3bbd-48e3-b839-40929e67ce7a-internal-tls-certs\") pod \"4839dc9e-3bbd-48e3-b839-40929e67ce7a\" (UID: \"4839dc9e-3bbd-48e3-b839-40929e67ce7a\") " Feb 20 10:15:11 crc kubenswrapper[4962]: I0220 10:15:11.575558 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4839dc9e-3bbd-48e3-b839-40929e67ce7a-ovndb-tls-certs\") pod \"4839dc9e-3bbd-48e3-b839-40929e67ce7a\" (UID: \"4839dc9e-3bbd-48e3-b839-40929e67ce7a\") " Feb 20 10:15:11 crc kubenswrapper[4962]: I0220 10:15:11.575722 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4839dc9e-3bbd-48e3-b839-40929e67ce7a-httpd-config\") pod \"4839dc9e-3bbd-48e3-b839-40929e67ce7a\" (UID: \"4839dc9e-3bbd-48e3-b839-40929e67ce7a\") " Feb 20 10:15:11 crc kubenswrapper[4962]: I0220 10:15:11.575786 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-62jg5\" (UniqueName: \"kubernetes.io/projected/4839dc9e-3bbd-48e3-b839-40929e67ce7a-kube-api-access-62jg5\") pod \"4839dc9e-3bbd-48e3-b839-40929e67ce7a\" (UID: \"4839dc9e-3bbd-48e3-b839-40929e67ce7a\") " Feb 20 10:15:11 crc kubenswrapper[4962]: I0220 10:15:11.575840 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4839dc9e-3bbd-48e3-b839-40929e67ce7a-config\") pod \"4839dc9e-3bbd-48e3-b839-40929e67ce7a\" (UID: \"4839dc9e-3bbd-48e3-b839-40929e67ce7a\") " Feb 20 10:15:11 crc kubenswrapper[4962]: I0220 10:15:11.585204 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4839dc9e-3bbd-48e3-b839-40929e67ce7a-kube-api-access-62jg5" (OuterVolumeSpecName: "kube-api-access-62jg5") pod "4839dc9e-3bbd-48e3-b839-40929e67ce7a" (UID: "4839dc9e-3bbd-48e3-b839-40929e67ce7a"). 
InnerVolumeSpecName "kube-api-access-62jg5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:15:11 crc kubenswrapper[4962]: I0220 10:15:11.586675 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4839dc9e-3bbd-48e3-b839-40929e67ce7a-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "4839dc9e-3bbd-48e3-b839-40929e67ce7a" (UID: "4839dc9e-3bbd-48e3-b839-40929e67ce7a"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:15:11 crc kubenswrapper[4962]: I0220 10:15:11.633928 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4839dc9e-3bbd-48e3-b839-40929e67ce7a-config" (OuterVolumeSpecName: "config") pod "4839dc9e-3bbd-48e3-b839-40929e67ce7a" (UID: "4839dc9e-3bbd-48e3-b839-40929e67ce7a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:15:11 crc kubenswrapper[4962]: I0220 10:15:11.637155 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4839dc9e-3bbd-48e3-b839-40929e67ce7a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "4839dc9e-3bbd-48e3-b839-40929e67ce7a" (UID: "4839dc9e-3bbd-48e3-b839-40929e67ce7a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:15:11 crc kubenswrapper[4962]: I0220 10:15:11.646219 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4839dc9e-3bbd-48e3-b839-40929e67ce7a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "4839dc9e-3bbd-48e3-b839-40929e67ce7a" (UID: "4839dc9e-3bbd-48e3-b839-40929e67ce7a"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:15:11 crc kubenswrapper[4962]: I0220 10:15:11.663768 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4839dc9e-3bbd-48e3-b839-40929e67ce7a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4839dc9e-3bbd-48e3-b839-40929e67ce7a" (UID: "4839dc9e-3bbd-48e3-b839-40929e67ce7a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:15:11 crc kubenswrapper[4962]: I0220 10:15:11.677285 4962 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4839dc9e-3bbd-48e3-b839-40929e67ce7a-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:11 crc kubenswrapper[4962]: I0220 10:15:11.677345 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4839dc9e-3bbd-48e3-b839-40929e67ce7a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:11 crc kubenswrapper[4962]: I0220 10:15:11.677356 4962 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4839dc9e-3bbd-48e3-b839-40929e67ce7a-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:11 crc kubenswrapper[4962]: I0220 10:15:11.677368 4962 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4839dc9e-3bbd-48e3-b839-40929e67ce7a-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:11 crc kubenswrapper[4962]: I0220 10:15:11.677377 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-62jg5\" (UniqueName: \"kubernetes.io/projected/4839dc9e-3bbd-48e3-b839-40929e67ce7a-kube-api-access-62jg5\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:11 crc kubenswrapper[4962]: I0220 10:15:11.677391 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/4839dc9e-3bbd-48e3-b839-40929e67ce7a-config\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:11 crc kubenswrapper[4962]: I0220 10:15:11.679563 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4839dc9e-3bbd-48e3-b839-40929e67ce7a-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "4839dc9e-3bbd-48e3-b839-40929e67ce7a" (UID: "4839dc9e-3bbd-48e3-b839-40929e67ce7a"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:15:11 crc kubenswrapper[4962]: I0220 10:15:11.779023 4962 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4839dc9e-3bbd-48e3-b839-40929e67ce7a-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:12 crc kubenswrapper[4962]: I0220 10:15:12.043231 4962 generic.go:334] "Generic (PLEG): container finished" podID="51d56dfc-4e59-4c3d-b26d-a06301f274c8" containerID="f526ad02da5a4878737da129e8a712cb1847bfbea7fa32b3816c5e883fd3a61f" exitCode=0 Feb 20 10:15:12 crc kubenswrapper[4962]: I0220 10:15:12.043384 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"51d56dfc-4e59-4c3d-b26d-a06301f274c8","Type":"ContainerDied","Data":"f526ad02da5a4878737da129e8a712cb1847bfbea7fa32b3816c5e883fd3a61f"} Feb 20 10:15:12 crc kubenswrapper[4962]: I0220 10:15:12.049181 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-747dfbc745-ndpzt" event={"ID":"4839dc9e-3bbd-48e3-b839-40929e67ce7a","Type":"ContainerDied","Data":"9861804d9a2a53f03b608fd61261417b654233e3abb2bbc1678d9e37df3e329e"} Feb 20 10:15:12 crc kubenswrapper[4962]: I0220 10:15:12.049314 4962 scope.go:117] "RemoveContainer" containerID="bae43663ef81835d7b00f29e6ed99c794aa21733e839a0ebe0e90aee6573888f" Feb 20 10:15:12 crc kubenswrapper[4962]: I0220 10:15:12.049559 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-747dfbc745-ndpzt" Feb 20 10:15:12 crc kubenswrapper[4962]: I0220 10:15:12.107482 4962 scope.go:117] "RemoveContainer" containerID="ef8879302adbb806976eb02b6043e51e7d9091d75bf4488fcea19123656b2441" Feb 20 10:15:12 crc kubenswrapper[4962]: I0220 10:15:12.120160 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-747dfbc745-ndpzt"] Feb 20 10:15:12 crc kubenswrapper[4962]: I0220 10:15:12.127352 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-747dfbc745-ndpzt"] Feb 20 10:15:12 crc kubenswrapper[4962]: I0220 10:15:12.339182 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 20 10:15:12 crc kubenswrapper[4962]: I0220 10:15:12.496146 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51d56dfc-4e59-4c3d-b26d-a06301f274c8-scripts\") pod \"51d56dfc-4e59-4c3d-b26d-a06301f274c8\" (UID: \"51d56dfc-4e59-4c3d-b26d-a06301f274c8\") " Feb 20 10:15:12 crc kubenswrapper[4962]: I0220 10:15:12.496737 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51d56dfc-4e59-4c3d-b26d-a06301f274c8-config-data\") pod \"51d56dfc-4e59-4c3d-b26d-a06301f274c8\" (UID: \"51d56dfc-4e59-4c3d-b26d-a06301f274c8\") " Feb 20 10:15:12 crc kubenswrapper[4962]: I0220 10:15:12.497070 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mrsbl\" (UniqueName: \"kubernetes.io/projected/51d56dfc-4e59-4c3d-b26d-a06301f274c8-kube-api-access-mrsbl\") pod \"51d56dfc-4e59-4c3d-b26d-a06301f274c8\" (UID: \"51d56dfc-4e59-4c3d-b26d-a06301f274c8\") " Feb 20 10:15:12 crc kubenswrapper[4962]: I0220 10:15:12.497303 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51d56dfc-4e59-4c3d-b26d-a06301f274c8-combined-ca-bundle\") pod \"51d56dfc-4e59-4c3d-b26d-a06301f274c8\" (UID: \"51d56dfc-4e59-4c3d-b26d-a06301f274c8\") " Feb 20 10:15:12 crc kubenswrapper[4962]: I0220 10:15:12.497544 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/51d56dfc-4e59-4c3d-b26d-a06301f274c8-etc-machine-id\") pod \"51d56dfc-4e59-4c3d-b26d-a06301f274c8\" (UID: \"51d56dfc-4e59-4c3d-b26d-a06301f274c8\") " Feb 20 10:15:12 crc kubenswrapper[4962]: I0220 10:15:12.497628 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/51d56dfc-4e59-4c3d-b26d-a06301f274c8-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "51d56dfc-4e59-4c3d-b26d-a06301f274c8" (UID: "51d56dfc-4e59-4c3d-b26d-a06301f274c8"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 10:15:12 crc kubenswrapper[4962]: I0220 10:15:12.497772 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/51d56dfc-4e59-4c3d-b26d-a06301f274c8-config-data-custom\") pod \"51d56dfc-4e59-4c3d-b26d-a06301f274c8\" (UID: \"51d56dfc-4e59-4c3d-b26d-a06301f274c8\") " Feb 20 10:15:12 crc kubenswrapper[4962]: I0220 10:15:12.498975 4962 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/51d56dfc-4e59-4c3d-b26d-a06301f274c8-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:12 crc kubenswrapper[4962]: I0220 10:15:12.504757 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51d56dfc-4e59-4c3d-b26d-a06301f274c8-scripts" (OuterVolumeSpecName: "scripts") pod "51d56dfc-4e59-4c3d-b26d-a06301f274c8" (UID: "51d56dfc-4e59-4c3d-b26d-a06301f274c8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:15:12 crc kubenswrapper[4962]: I0220 10:15:12.504920 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51d56dfc-4e59-4c3d-b26d-a06301f274c8-kube-api-access-mrsbl" (OuterVolumeSpecName: "kube-api-access-mrsbl") pod "51d56dfc-4e59-4c3d-b26d-a06301f274c8" (UID: "51d56dfc-4e59-4c3d-b26d-a06301f274c8"). InnerVolumeSpecName "kube-api-access-mrsbl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:15:12 crc kubenswrapper[4962]: I0220 10:15:12.505837 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51d56dfc-4e59-4c3d-b26d-a06301f274c8-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "51d56dfc-4e59-4c3d-b26d-a06301f274c8" (UID: "51d56dfc-4e59-4c3d-b26d-a06301f274c8"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:15:12 crc kubenswrapper[4962]: I0220 10:15:12.575430 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51d56dfc-4e59-4c3d-b26d-a06301f274c8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "51d56dfc-4e59-4c3d-b26d-a06301f274c8" (UID: "51d56dfc-4e59-4c3d-b26d-a06301f274c8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:15:12 crc kubenswrapper[4962]: I0220 10:15:12.601426 4962 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/51d56dfc-4e59-4c3d-b26d-a06301f274c8-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:12 crc kubenswrapper[4962]: I0220 10:15:12.601484 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51d56dfc-4e59-4c3d-b26d-a06301f274c8-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:12 crc kubenswrapper[4962]: I0220 10:15:12.601496 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mrsbl\" (UniqueName: \"kubernetes.io/projected/51d56dfc-4e59-4c3d-b26d-a06301f274c8-kube-api-access-mrsbl\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:12 crc kubenswrapper[4962]: I0220 10:15:12.601510 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51d56dfc-4e59-4c3d-b26d-a06301f274c8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:12 crc kubenswrapper[4962]: I0220 10:15:12.643848 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51d56dfc-4e59-4c3d-b26d-a06301f274c8-config-data" (OuterVolumeSpecName: "config-data") pod "51d56dfc-4e59-4c3d-b26d-a06301f274c8" (UID: "51d56dfc-4e59-4c3d-b26d-a06301f274c8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:15:12 crc kubenswrapper[4962]: I0220 10:15:12.703744 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51d56dfc-4e59-4c3d-b26d-a06301f274c8-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:12 crc kubenswrapper[4962]: I0220 10:15:12.877110 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-755cb8b5f4-zlzbb" Feb 20 10:15:12 crc kubenswrapper[4962]: I0220 10:15:12.881014 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-755cb8b5f4-zlzbb" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.063445 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"51d56dfc-4e59-4c3d-b26d-a06301f274c8","Type":"ContainerDied","Data":"d28e07e3df71ca3ef7da3f055fa5cde6ea2cae0c5e5a865d321a0f8d0fb07b31"} Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.063516 4962 scope.go:117] "RemoveContainer" containerID="565233d9beb95735adc25c4668daa315369653b7a2cdf6734ddda82b44ff3501" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.063658 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.102609 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.110133 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.123808 4962 scope.go:117] "RemoveContainer" containerID="f526ad02da5a4878737da129e8a712cb1847bfbea7fa32b3816c5e883fd3a61f" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.135144 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 20 10:15:13 crc kubenswrapper[4962]: E0220 10:15:13.136364 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51d56dfc-4e59-4c3d-b26d-a06301f274c8" containerName="cinder-scheduler" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.136446 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="51d56dfc-4e59-4c3d-b26d-a06301f274c8" containerName="cinder-scheduler" Feb 20 10:15:13 crc kubenswrapper[4962]: E0220 10:15:13.136511 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4839dc9e-3bbd-48e3-b839-40929e67ce7a" containerName="neutron-httpd" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.136564 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="4839dc9e-3bbd-48e3-b839-40929e67ce7a" containerName="neutron-httpd" Feb 20 10:15:13 crc kubenswrapper[4962]: E0220 10:15:13.136657 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54637a9a-7f3e-439e-adf0-ba5b33a539d3" containerName="init" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.136717 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="54637a9a-7f3e-439e-adf0-ba5b33a539d3" containerName="init" Feb 20 10:15:13 crc kubenswrapper[4962]: E0220 10:15:13.136774 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51d56dfc-4e59-4c3d-b26d-a06301f274c8" containerName="probe" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.136827 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="51d56dfc-4e59-4c3d-b26d-a06301f274c8" containerName="probe" Feb 20 10:15:13 crc kubenswrapper[4962]: E0220 10:15:13.138140 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4839dc9e-3bbd-48e3-b839-40929e67ce7a" containerName="neutron-api" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.138251 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="4839dc9e-3bbd-48e3-b839-40929e67ce7a" containerName="neutron-api" Feb 20 10:15:13 crc kubenswrapper[4962]: E0220 10:15:13.138341 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54637a9a-7f3e-439e-adf0-ba5b33a539d3" containerName="dnsmasq-dns" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.138400 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="54637a9a-7f3e-439e-adf0-ba5b33a539d3" containerName="dnsmasq-dns" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.138742 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="51d56dfc-4e59-4c3d-b26d-a06301f274c8" containerName="probe" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.138827 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="54637a9a-7f3e-439e-adf0-ba5b33a539d3" containerName="dnsmasq-dns" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.138889 4962 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="4839dc9e-3bbd-48e3-b839-40929e67ce7a" containerName="neutron-api" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.138945 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="4839dc9e-3bbd-48e3-b839-40929e67ce7a" containerName="neutron-httpd" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.139009 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="51d56dfc-4e59-4c3d-b26d-a06301f274c8" containerName="cinder-scheduler" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.140325 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.153511 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.158493 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4839dc9e-3bbd-48e3-b839-40929e67ce7a" path="/var/lib/kubelet/pods/4839dc9e-3bbd-48e3-b839-40929e67ce7a/volumes" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.159228 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51d56dfc-4e59-4c3d-b26d-a06301f274c8" path="/var/lib/kubelet/pods/51d56dfc-4e59-4c3d-b26d-a06301f274c8/volumes" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.160000 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.284859 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-687f4cff74-gmh4w"] Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.287503 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-687f4cff74-gmh4w" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.297889 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-687f4cff74-gmh4w"] Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.324145 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c90d5126-d89a-42e6-9b7d-bfc53475bc56-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"c90d5126-d89a-42e6-9b7d-bfc53475bc56\") " pod="openstack/cinder-scheduler-0" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.324186 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c90d5126-d89a-42e6-9b7d-bfc53475bc56-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"c90d5126-d89a-42e6-9b7d-bfc53475bc56\") " pod="openstack/cinder-scheduler-0" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.324254 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5494k\" (UniqueName: \"kubernetes.io/projected/c90d5126-d89a-42e6-9b7d-bfc53475bc56-kube-api-access-5494k\") pod \"cinder-scheduler-0\" (UID: \"c90d5126-d89a-42e6-9b7d-bfc53475bc56\") " pod="openstack/cinder-scheduler-0" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.324333 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c90d5126-d89a-42e6-9b7d-bfc53475bc56-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"c90d5126-d89a-42e6-9b7d-bfc53475bc56\") " 
pod="openstack/cinder-scheduler-0" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.324352 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c90d5126-d89a-42e6-9b7d-bfc53475bc56-scripts\") pod \"cinder-scheduler-0\" (UID: \"c90d5126-d89a-42e6-9b7d-bfc53475bc56\") " pod="openstack/cinder-scheduler-0" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.324384 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c90d5126-d89a-42e6-9b7d-bfc53475bc56-config-data\") pod \"cinder-scheduler-0\" (UID: \"c90d5126-d89a-42e6-9b7d-bfc53475bc56\") " pod="openstack/cinder-scheduler-0" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.426322 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9xjl\" (UniqueName: \"kubernetes.io/projected/4a879cb3-19b4-4767-8640-993cc47dc7ed-kube-api-access-s9xjl\") pod \"placement-687f4cff74-gmh4w\" (UID: \"4a879cb3-19b4-4767-8640-993cc47dc7ed\") " pod="openstack/placement-687f4cff74-gmh4w" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.426400 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a879cb3-19b4-4767-8640-993cc47dc7ed-logs\") pod \"placement-687f4cff74-gmh4w\" (UID: \"4a879cb3-19b4-4767-8640-993cc47dc7ed\") " pod="openstack/placement-687f4cff74-gmh4w" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.427250 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a879cb3-19b4-4767-8640-993cc47dc7ed-combined-ca-bundle\") pod \"placement-687f4cff74-gmh4w\" (UID: \"4a879cb3-19b4-4767-8640-993cc47dc7ed\") " pod="openstack/placement-687f4cff74-gmh4w" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.427353 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c90d5126-d89a-42e6-9b7d-bfc53475bc56-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"c90d5126-d89a-42e6-9b7d-bfc53475bc56\") " pod="openstack/cinder-scheduler-0" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.427397 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c90d5126-d89a-42e6-9b7d-bfc53475bc56-scripts\") pod \"cinder-scheduler-0\" (UID: \"c90d5126-d89a-42e6-9b7d-bfc53475bc56\") " pod="openstack/cinder-scheduler-0" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.427507 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c90d5126-d89a-42e6-9b7d-bfc53475bc56-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"c90d5126-d89a-42e6-9b7d-bfc53475bc56\") " pod="openstack/cinder-scheduler-0" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.427548 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c90d5126-d89a-42e6-9b7d-bfc53475bc56-config-data\") pod \"cinder-scheduler-0\" (UID: \"c90d5126-d89a-42e6-9b7d-bfc53475bc56\") " pod="openstack/cinder-scheduler-0" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.427663 4962 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a879cb3-19b4-4767-8640-993cc47dc7ed-scripts\") pod \"placement-687f4cff74-gmh4w\" (UID: \"4a879cb3-19b4-4767-8640-993cc47dc7ed\") " pod="openstack/placement-687f4cff74-gmh4w" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.427786 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c90d5126-d89a-42e6-9b7d-bfc53475bc56-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"c90d5126-d89a-42e6-9b7d-bfc53475bc56\") " pod="openstack/cinder-scheduler-0" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.427817 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c90d5126-d89a-42e6-9b7d-bfc53475bc56-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"c90d5126-d89a-42e6-9b7d-bfc53475bc56\") " pod="openstack/cinder-scheduler-0" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.427852 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a879cb3-19b4-4767-8640-993cc47dc7ed-public-tls-certs\") pod \"placement-687f4cff74-gmh4w\" (UID: \"4a879cb3-19b4-4767-8640-993cc47dc7ed\") " pod="openstack/placement-687f4cff74-gmh4w" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.427917 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5494k\" (UniqueName: \"kubernetes.io/projected/c90d5126-d89a-42e6-9b7d-bfc53475bc56-kube-api-access-5494k\") pod \"cinder-scheduler-0\" (UID: \"c90d5126-d89a-42e6-9b7d-bfc53475bc56\") " pod="openstack/cinder-scheduler-0" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.427955 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a879cb3-19b4-4767-8640-993cc47dc7ed-config-data\") pod \"placement-687f4cff74-gmh4w\" (UID: \"4a879cb3-19b4-4767-8640-993cc47dc7ed\") " pod="openstack/placement-687f4cff74-gmh4w" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.427988 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a879cb3-19b4-4767-8640-993cc47dc7ed-internal-tls-certs\") pod \"placement-687f4cff74-gmh4w\" (UID: \"4a879cb3-19b4-4767-8640-993cc47dc7ed\") " pod="openstack/placement-687f4cff74-gmh4w" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.435973 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c90d5126-d89a-42e6-9b7d-bfc53475bc56-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"c90d5126-d89a-42e6-9b7d-bfc53475bc56\") " pod="openstack/cinder-scheduler-0" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.436075 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c90d5126-d89a-42e6-9b7d-bfc53475bc56-config-data\") pod \"cinder-scheduler-0\" (UID: \"c90d5126-d89a-42e6-9b7d-bfc53475bc56\") " pod="openstack/cinder-scheduler-0" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.438130 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/c90d5126-d89a-42e6-9b7d-bfc53475bc56-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"c90d5126-d89a-42e6-9b7d-bfc53475bc56\") " pod="openstack/cinder-scheduler-0" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.439165 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c90d5126-d89a-42e6-9b7d-bfc53475bc56-scripts\") pod \"cinder-scheduler-0\" (UID: \"c90d5126-d89a-42e6-9b7d-bfc53475bc56\") " pod="openstack/cinder-scheduler-0" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.443884 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5494k\" (UniqueName: \"kubernetes.io/projected/c90d5126-d89a-42e6-9b7d-bfc53475bc56-kube-api-access-5494k\") pod \"cinder-scheduler-0\" (UID: \"c90d5126-d89a-42e6-9b7d-bfc53475bc56\") " pod="openstack/cinder-scheduler-0" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.474971 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.530798 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a879cb3-19b4-4767-8640-993cc47dc7ed-public-tls-certs\") pod \"placement-687f4cff74-gmh4w\" (UID: \"4a879cb3-19b4-4767-8640-993cc47dc7ed\") " pod="openstack/placement-687f4cff74-gmh4w" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.530932 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a879cb3-19b4-4767-8640-993cc47dc7ed-config-data\") pod \"placement-687f4cff74-gmh4w\" (UID: \"4a879cb3-19b4-4767-8640-993cc47dc7ed\") " pod="openstack/placement-687f4cff74-gmh4w" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.530972 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a879cb3-19b4-4767-8640-993cc47dc7ed-internal-tls-certs\") pod \"placement-687f4cff74-gmh4w\" (UID: \"4a879cb3-19b4-4767-8640-993cc47dc7ed\") " pod="openstack/placement-687f4cff74-gmh4w" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.531078 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9xjl\" (UniqueName: \"kubernetes.io/projected/4a879cb3-19b4-4767-8640-993cc47dc7ed-kube-api-access-s9xjl\") pod \"placement-687f4cff74-gmh4w\" (UID: \"4a879cb3-19b4-4767-8640-993cc47dc7ed\") " pod="openstack/placement-687f4cff74-gmh4w" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.531116 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a879cb3-19b4-4767-8640-993cc47dc7ed-logs\") pod \"placement-687f4cff74-gmh4w\" (UID: \"4a879cb3-19b4-4767-8640-993cc47dc7ed\") " pod="openstack/placement-687f4cff74-gmh4w" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.531156 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a879cb3-19b4-4767-8640-993cc47dc7ed-combined-ca-bundle\") pod \"placement-687f4cff74-gmh4w\" (UID: \"4a879cb3-19b4-4767-8640-993cc47dc7ed\") " pod="openstack/placement-687f4cff74-gmh4w" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.531313 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/4a879cb3-19b4-4767-8640-993cc47dc7ed-scripts\") pod \"placement-687f4cff74-gmh4w\" (UID: \"4a879cb3-19b4-4767-8640-993cc47dc7ed\") " pod="openstack/placement-687f4cff74-gmh4w" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.533614 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a879cb3-19b4-4767-8640-993cc47dc7ed-logs\") pod \"placement-687f4cff74-gmh4w\" (UID: \"4a879cb3-19b4-4767-8640-993cc47dc7ed\") " pod="openstack/placement-687f4cff74-gmh4w" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.538162 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a879cb3-19b4-4767-8640-993cc47dc7ed-scripts\") pod \"placement-687f4cff74-gmh4w\" (UID: \"4a879cb3-19b4-4767-8640-993cc47dc7ed\") " pod="openstack/placement-687f4cff74-gmh4w" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.539075 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a879cb3-19b4-4767-8640-993cc47dc7ed-combined-ca-bundle\") pod \"placement-687f4cff74-gmh4w\" (UID: \"4a879cb3-19b4-4767-8640-993cc47dc7ed\") " pod="openstack/placement-687f4cff74-gmh4w" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.542940 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a879cb3-19b4-4767-8640-993cc47dc7ed-public-tls-certs\") pod \"placement-687f4cff74-gmh4w\" (UID: \"4a879cb3-19b4-4767-8640-993cc47dc7ed\") " pod="openstack/placement-687f4cff74-gmh4w" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.550389 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a879cb3-19b4-4767-8640-993cc47dc7ed-config-data\") pod \"placement-687f4cff74-gmh4w\" (UID: \"4a879cb3-19b4-4767-8640-993cc47dc7ed\") " pod="openstack/placement-687f4cff74-gmh4w" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.554752 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9xjl\" (UniqueName: \"kubernetes.io/projected/4a879cb3-19b4-4767-8640-993cc47dc7ed-kube-api-access-s9xjl\") pod \"placement-687f4cff74-gmh4w\" (UID: \"4a879cb3-19b4-4767-8640-993cc47dc7ed\") " pod="openstack/placement-687f4cff74-gmh4w" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.556554 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a879cb3-19b4-4767-8640-993cc47dc7ed-internal-tls-certs\") pod \"placement-687f4cff74-gmh4w\" (UID: \"4a879cb3-19b4-4767-8640-993cc47dc7ed\") " pod="openstack/placement-687f4cff74-gmh4w" Feb 20 10:15:13 crc kubenswrapper[4962]: I0220 10:15:13.646311 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-687f4cff74-gmh4w" Feb 20 10:15:14 crc kubenswrapper[4962]: I0220 10:15:14.056111 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-687f4cff74-gmh4w"] Feb 20 10:15:14 crc kubenswrapper[4962]: I0220 10:15:14.100051 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-687f4cff74-gmh4w" event={"ID":"4a879cb3-19b4-4767-8640-993cc47dc7ed","Type":"ContainerStarted","Data":"25d2258a03970a75594e5384f741d4a8aaad9e37d3b0b7c512e80fa795dc3283"} Feb 20 10:15:14 crc kubenswrapper[4962]: I0220 10:15:14.138615 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 20 10:15:15 crc kubenswrapper[4962]: I0220 10:15:15.127481 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-687f4cff74-gmh4w" event={"ID":"4a879cb3-19b4-4767-8640-993cc47dc7ed","Type":"ContainerStarted","Data":"eba325f8f1300c477bc396da76d9efd0fdd96072accf19c1140570ee31c7b548"} Feb 20 10:15:15 crc kubenswrapper[4962]: I0220 10:15:15.127885 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-687f4cff74-gmh4w" event={"ID":"4a879cb3-19b4-4767-8640-993cc47dc7ed","Type":"ContainerStarted","Data":"4f72b0b6a24968d9eab4cdfe73c03770a2ac626aa75c9f5e5a526fe72f5eea53"} Feb 20 10:15:15 crc kubenswrapper[4962]: I0220 10:15:15.127934 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-687f4cff74-gmh4w" Feb 20 10:15:15 crc kubenswrapper[4962]: I0220 10:15:15.127976 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-687f4cff74-gmh4w" Feb 20 10:15:15 crc kubenswrapper[4962]: I0220 10:15:15.133984 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c90d5126-d89a-42e6-9b7d-bfc53475bc56","Type":"ContainerStarted","Data":"f20b981aacdf6de658de3f762f39158362f94f8752f0a75fc0ae9dfa445ad0b1"} Feb 20 10:15:15 crc kubenswrapper[4962]: I0220 10:15:15.134046 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c90d5126-d89a-42e6-9b7d-bfc53475bc56","Type":"ContainerStarted","Data":"b094901d04b6844ae7ff61500f6dbd375cab8bf6c8a00346003f62e1a980cada"} Feb 20 10:15:15 crc kubenswrapper[4962]: I0220 10:15:15.156727 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-687f4cff74-gmh4w" podStartSLOduration=2.156705944 podStartE2EDuration="2.156705944s" podCreationTimestamp="2026-02-20 10:15:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:15:15.152146762 +0000 UTC m=+1206.734618608" watchObservedRunningTime="2026-02-20 10:15:15.156705944 +0000 UTC m=+1206.739177790" Feb 20 10:15:15 crc kubenswrapper[4962]: I0220 10:15:15.453304 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-84464996cb-fhnvz" Feb 20 10:15:15 crc kubenswrapper[4962]: I0220 10:15:15.481079 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-84464996cb-fhnvz" Feb 20 10:15:15 crc kubenswrapper[4962]: I0220 10:15:15.588929 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-547b9d9588-5gkt7"] Feb 20 10:15:15 crc kubenswrapper[4962]: I0220 10:15:15.589606 4962 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/barbican-api-547b9d9588-5gkt7" podUID="2320213f-c3b3-4074-95f9-ad86446193b3" containerName="barbican-api-log" containerID="cri-o://55486fc199e4a3a4fb67874630324c07abf5ab0280be238e62e377607c20060a" gracePeriod=30 Feb 20 10:15:15 crc kubenswrapper[4962]: I0220 10:15:15.589829 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-547b9d9588-5gkt7" podUID="2320213f-c3b3-4074-95f9-ad86446193b3" containerName="barbican-api" containerID="cri-o://d4bd6e492dd1bc7584580c3e1bc6a4f7a66a1d7156602f3795865e0b336514ec" gracePeriod=30 Feb 20 10:15:16 crc kubenswrapper[4962]: I0220 10:15:16.159260 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c90d5126-d89a-42e6-9b7d-bfc53475bc56","Type":"ContainerStarted","Data":"c0eb68155798173ab5bc0e3d87fda35f3734305779104c33299016d17b9b3def"} Feb 20 10:15:16 crc kubenswrapper[4962]: I0220 10:15:16.186649 4962 generic.go:334] "Generic (PLEG): container finished" podID="2320213f-c3b3-4074-95f9-ad86446193b3" containerID="55486fc199e4a3a4fb67874630324c07abf5ab0280be238e62e377607c20060a" exitCode=143 Feb 20 10:15:16 crc kubenswrapper[4962]: I0220 10:15:16.187945 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-547b9d9588-5gkt7" event={"ID":"2320213f-c3b3-4074-95f9-ad86446193b3","Type":"ContainerDied","Data":"55486fc199e4a3a4fb67874630324c07abf5ab0280be238e62e377607c20060a"} Feb 20 10:15:16 crc kubenswrapper[4962]: I0220 10:15:16.191515 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.1914975 podStartE2EDuration="3.1914975s" podCreationTimestamp="2026-02-20 10:15:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:15:16.186414212 +0000 UTC m=+1207.768886058" watchObservedRunningTime="2026-02-20 10:15:16.1914975 +0000 UTC m=+1207.773969346" Feb 20 10:15:16 crc kubenswrapper[4962]: I0220 10:15:16.308974 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Feb 20 10:15:18 crc kubenswrapper[4962]: I0220 10:15:18.452954 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-6b4c54c5d9-pqd8r" Feb 20 10:15:18 crc kubenswrapper[4962]: I0220 10:15:18.475327 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 20 10:15:18 crc kubenswrapper[4962]: I0220 10:15:18.805233 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-547b9d9588-5gkt7" podUID="2320213f-c3b3-4074-95f9-ad86446193b3" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.161:9311/healthcheck\": read tcp 10.217.0.2:44316->10.217.0.161:9311: read: connection reset by peer" Feb 20 10:15:18 crc kubenswrapper[4962]: I0220 10:15:18.805216 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-547b9d9588-5gkt7" podUID="2320213f-c3b3-4074-95f9-ad86446193b3" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.161:9311/healthcheck\": read tcp 10.217.0.2:44312->10.217.0.161:9311: read: connection reset by peer" Feb 20 10:15:19 crc kubenswrapper[4962]: I0220 10:15:19.224929 4962 generic.go:334] "Generic (PLEG): container finished" podID="2320213f-c3b3-4074-95f9-ad86446193b3" 
containerID="d4bd6e492dd1bc7584580c3e1bc6a4f7a66a1d7156602f3795865e0b336514ec" exitCode=0 Feb 20 10:15:19 crc kubenswrapper[4962]: I0220 10:15:19.224997 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-547b9d9588-5gkt7" event={"ID":"2320213f-c3b3-4074-95f9-ad86446193b3","Type":"ContainerDied","Data":"d4bd6e492dd1bc7584580c3e1bc6a4f7a66a1d7156602f3795865e0b336514ec"} Feb 20 10:15:19 crc kubenswrapper[4962]: I0220 10:15:19.321524 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-547b9d9588-5gkt7" Feb 20 10:15:19 crc kubenswrapper[4962]: I0220 10:15:19.515757 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2320213f-c3b3-4074-95f9-ad86446193b3-config-data\") pod \"2320213f-c3b3-4074-95f9-ad86446193b3\" (UID: \"2320213f-c3b3-4074-95f9-ad86446193b3\") " Feb 20 10:15:19 crc kubenswrapper[4962]: I0220 10:15:19.515980 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2320213f-c3b3-4074-95f9-ad86446193b3-logs\") pod \"2320213f-c3b3-4074-95f9-ad86446193b3\" (UID: \"2320213f-c3b3-4074-95f9-ad86446193b3\") " Feb 20 10:15:19 crc kubenswrapper[4962]: I0220 10:15:19.516045 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2320213f-c3b3-4074-95f9-ad86446193b3-combined-ca-bundle\") pod \"2320213f-c3b3-4074-95f9-ad86446193b3\" (UID: \"2320213f-c3b3-4074-95f9-ad86446193b3\") " Feb 20 10:15:19 crc kubenswrapper[4962]: I0220 10:15:19.516162 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2320213f-c3b3-4074-95f9-ad86446193b3-config-data-custom\") pod \"2320213f-c3b3-4074-95f9-ad86446193b3\" (UID: \"2320213f-c3b3-4074-95f9-ad86446193b3\") " Feb 20 10:15:19 crc kubenswrapper[4962]: I0220 10:15:19.516329 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pvtmd\" (UniqueName: \"kubernetes.io/projected/2320213f-c3b3-4074-95f9-ad86446193b3-kube-api-access-pvtmd\") pod \"2320213f-c3b3-4074-95f9-ad86446193b3\" (UID: \"2320213f-c3b3-4074-95f9-ad86446193b3\") " Feb 20 10:15:19 crc kubenswrapper[4962]: I0220 10:15:19.516924 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2320213f-c3b3-4074-95f9-ad86446193b3-logs" (OuterVolumeSpecName: "logs") pod "2320213f-c3b3-4074-95f9-ad86446193b3" (UID: "2320213f-c3b3-4074-95f9-ad86446193b3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:15:19 crc kubenswrapper[4962]: I0220 10:15:19.537103 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2320213f-c3b3-4074-95f9-ad86446193b3-kube-api-access-pvtmd" (OuterVolumeSpecName: "kube-api-access-pvtmd") pod "2320213f-c3b3-4074-95f9-ad86446193b3" (UID: "2320213f-c3b3-4074-95f9-ad86446193b3"). InnerVolumeSpecName "kube-api-access-pvtmd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:15:19 crc kubenswrapper[4962]: I0220 10:15:19.537284 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2320213f-c3b3-4074-95f9-ad86446193b3-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "2320213f-c3b3-4074-95f9-ad86446193b3" (UID: "2320213f-c3b3-4074-95f9-ad86446193b3"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:15:19 crc kubenswrapper[4962]: I0220 10:15:19.561714 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2320213f-c3b3-4074-95f9-ad86446193b3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2320213f-c3b3-4074-95f9-ad86446193b3" (UID: "2320213f-c3b3-4074-95f9-ad86446193b3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:15:19 crc kubenswrapper[4962]: I0220 10:15:19.576032 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2320213f-c3b3-4074-95f9-ad86446193b3-config-data" (OuterVolumeSpecName: "config-data") pod "2320213f-c3b3-4074-95f9-ad86446193b3" (UID: "2320213f-c3b3-4074-95f9-ad86446193b3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:15:19 crc kubenswrapper[4962]: I0220 10:15:19.618709 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2320213f-c3b3-4074-95f9-ad86446193b3-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:19 crc kubenswrapper[4962]: I0220 10:15:19.618747 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2320213f-c3b3-4074-95f9-ad86446193b3-logs\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:19 crc kubenswrapper[4962]: I0220 10:15:19.618762 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2320213f-c3b3-4074-95f9-ad86446193b3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:19 crc kubenswrapper[4962]: I0220 10:15:19.618778 4962 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2320213f-c3b3-4074-95f9-ad86446193b3-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:19 crc kubenswrapper[4962]: I0220 10:15:19.618793 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pvtmd\" (UniqueName: \"kubernetes.io/projected/2320213f-c3b3-4074-95f9-ad86446193b3-kube-api-access-pvtmd\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:19 crc kubenswrapper[4962]: I0220 10:15:19.995754 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Feb 20 10:15:19 crc kubenswrapper[4962]: E0220 10:15:19.996910 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2320213f-c3b3-4074-95f9-ad86446193b3" containerName="barbican-api-log" Feb 20 10:15:19 crc kubenswrapper[4962]: I0220 10:15:19.997105 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2320213f-c3b3-4074-95f9-ad86446193b3" containerName="barbican-api-log" Feb 20 10:15:19 crc kubenswrapper[4962]: E0220 10:15:19.997334 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2320213f-c3b3-4074-95f9-ad86446193b3" containerName="barbican-api" Feb 20 10:15:19 crc kubenswrapper[4962]: I0220 10:15:19.997518 4962 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="2320213f-c3b3-4074-95f9-ad86446193b3" containerName="barbican-api" Feb 20 10:15:19 crc kubenswrapper[4962]: I0220 10:15:19.998068 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="2320213f-c3b3-4074-95f9-ad86446193b3" containerName="barbican-api" Feb 20 10:15:19 crc kubenswrapper[4962]: I0220 10:15:19.998264 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="2320213f-c3b3-4074-95f9-ad86446193b3" containerName="barbican-api-log" Feb 20 10:15:20 crc kubenswrapper[4962]: I0220 10:15:20.000044 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 20 10:15:20 crc kubenswrapper[4962]: I0220 10:15:20.005067 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Feb 20 10:15:20 crc kubenswrapper[4962]: I0220 10:15:20.005085 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-s8k4c" Feb 20 10:15:20 crc kubenswrapper[4962]: I0220 10:15:20.005491 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Feb 20 10:15:20 crc kubenswrapper[4962]: I0220 10:15:20.037268 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 20 10:15:20 crc kubenswrapper[4962]: I0220 10:15:20.128006 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/755ca463-8c62-402c-8a88-a066fb38b521-combined-ca-bundle\") pod \"openstackclient\" (UID: \"755ca463-8c62-402c-8a88-a066fb38b521\") " pod="openstack/openstackclient" Feb 20 10:15:20 crc kubenswrapper[4962]: I0220 10:15:20.128414 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/755ca463-8c62-402c-8a88-a066fb38b521-openstack-config\") pod \"openstackclient\" (UID: \"755ca463-8c62-402c-8a88-a066fb38b521\") " pod="openstack/openstackclient" Feb 20 10:15:20 crc kubenswrapper[4962]: I0220 10:15:20.128576 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/755ca463-8c62-402c-8a88-a066fb38b521-openstack-config-secret\") pod \"openstackclient\" (UID: \"755ca463-8c62-402c-8a88-a066fb38b521\") " pod="openstack/openstackclient" Feb 20 10:15:20 crc kubenswrapper[4962]: I0220 10:15:20.130336 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdngd\" (UniqueName: \"kubernetes.io/projected/755ca463-8c62-402c-8a88-a066fb38b521-kube-api-access-qdngd\") pod \"openstackclient\" (UID: \"755ca463-8c62-402c-8a88-a066fb38b521\") " pod="openstack/openstackclient" Feb 20 10:15:20 crc kubenswrapper[4962]: I0220 10:15:20.234741 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/755ca463-8c62-402c-8a88-a066fb38b521-openstack-config\") pod \"openstackclient\" (UID: \"755ca463-8c62-402c-8a88-a066fb38b521\") " pod="openstack/openstackclient" Feb 20 10:15:20 crc kubenswrapper[4962]: I0220 10:15:20.235379 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/755ca463-8c62-402c-8a88-a066fb38b521-openstack-config\") pod 
\"openstackclient\" (UID: \"755ca463-8c62-402c-8a88-a066fb38b521\") " pod="openstack/openstackclient" Feb 20 10:15:20 crc kubenswrapper[4962]: I0220 10:15:20.236697 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/755ca463-8c62-402c-8a88-a066fb38b521-openstack-config-secret\") pod \"openstackclient\" (UID: \"755ca463-8c62-402c-8a88-a066fb38b521\") " pod="openstack/openstackclient" Feb 20 10:15:20 crc kubenswrapper[4962]: I0220 10:15:20.236927 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdngd\" (UniqueName: \"kubernetes.io/projected/755ca463-8c62-402c-8a88-a066fb38b521-kube-api-access-qdngd\") pod \"openstackclient\" (UID: \"755ca463-8c62-402c-8a88-a066fb38b521\") " pod="openstack/openstackclient" Feb 20 10:15:20 crc kubenswrapper[4962]: I0220 10:15:20.239642 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/755ca463-8c62-402c-8a88-a066fb38b521-combined-ca-bundle\") pod \"openstackclient\" (UID: \"755ca463-8c62-402c-8a88-a066fb38b521\") " pod="openstack/openstackclient" Feb 20 10:15:20 crc kubenswrapper[4962]: I0220 10:15:20.246912 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/755ca463-8c62-402c-8a88-a066fb38b521-openstack-config-secret\") pod \"openstackclient\" (UID: \"755ca463-8c62-402c-8a88-a066fb38b521\") " pod="openstack/openstackclient" Feb 20 10:15:20 crc kubenswrapper[4962]: I0220 10:15:20.249698 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-547b9d9588-5gkt7" event={"ID":"2320213f-c3b3-4074-95f9-ad86446193b3","Type":"ContainerDied","Data":"ce6724e81ded72c0c4f8c258e7503371ea527420f9524e9628d9189a535cf4b5"} Feb 20 10:15:20 crc kubenswrapper[4962]: I0220 10:15:20.249753 4962 scope.go:117] "RemoveContainer" containerID="d4bd6e492dd1bc7584580c3e1bc6a4f7a66a1d7156602f3795865e0b336514ec" Feb 20 10:15:20 crc kubenswrapper[4962]: I0220 10:15:20.249962 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-547b9d9588-5gkt7" Feb 20 10:15:20 crc kubenswrapper[4962]: I0220 10:15:20.257704 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/755ca463-8c62-402c-8a88-a066fb38b521-combined-ca-bundle\") pod \"openstackclient\" (UID: \"755ca463-8c62-402c-8a88-a066fb38b521\") " pod="openstack/openstackclient" Feb 20 10:15:20 crc kubenswrapper[4962]: I0220 10:15:20.271324 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdngd\" (UniqueName: \"kubernetes.io/projected/755ca463-8c62-402c-8a88-a066fb38b521-kube-api-access-qdngd\") pod \"openstackclient\" (UID: \"755ca463-8c62-402c-8a88-a066fb38b521\") " pod="openstack/openstackclient" Feb 20 10:15:20 crc kubenswrapper[4962]: I0220 10:15:20.329780 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 20 10:15:20 crc kubenswrapper[4962]: I0220 10:15:20.468246 4962 scope.go:117] "RemoveContainer" containerID="55486fc199e4a3a4fb67874630324c07abf5ab0280be238e62e377607c20060a" Feb 20 10:15:20 crc kubenswrapper[4962]: I0220 10:15:20.475039 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-547b9d9588-5gkt7"] Feb 20 10:15:20 crc kubenswrapper[4962]: I0220 10:15:20.501411 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-547b9d9588-5gkt7"] Feb 20 10:15:20 crc kubenswrapper[4962]: I0220 10:15:20.884128 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 20 10:15:20 crc kubenswrapper[4962]: W0220 10:15:20.894231 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod755ca463_8c62_402c_8a88_a066fb38b521.slice/crio-f9306bfc7bc5868c5a90d617919c8c8dae64403806b069b14336f9858d5a16eb WatchSource:0}: Error finding container f9306bfc7bc5868c5a90d617919c8c8dae64403806b069b14336f9858d5a16eb: Status 404 returned error can't find the container with id f9306bfc7bc5868c5a90d617919c8c8dae64403806b069b14336f9858d5a16eb Feb 20 10:15:21 crc kubenswrapper[4962]: I0220 10:15:21.169476 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2320213f-c3b3-4074-95f9-ad86446193b3" path="/var/lib/kubelet/pods/2320213f-c3b3-4074-95f9-ad86446193b3/volumes" Feb 20 10:15:21 crc kubenswrapper[4962]: I0220 10:15:21.261988 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"755ca463-8c62-402c-8a88-a066fb38b521","Type":"ContainerStarted","Data":"f9306bfc7bc5868c5a90d617919c8c8dae64403806b069b14336f9858d5a16eb"} Feb 20 10:15:23 crc kubenswrapper[4962]: I0220 10:15:23.203813 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-5b685f5b9-4db6w"] Feb 20 10:15:23 crc kubenswrapper[4962]: I0220 10:15:23.205838 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-5b685f5b9-4db6w" Feb 20 10:15:23 crc kubenswrapper[4962]: I0220 10:15:23.208855 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Feb 20 10:15:23 crc kubenswrapper[4962]: I0220 10:15:23.208864 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Feb 20 10:15:23 crc kubenswrapper[4962]: I0220 10:15:23.211384 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Feb 20 10:15:23 crc kubenswrapper[4962]: I0220 10:15:23.222071 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5b685f5b9-4db6w"] Feb 20 10:15:23 crc kubenswrapper[4962]: I0220 10:15:23.230495 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsclb\" (UniqueName: \"kubernetes.io/projected/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-kube-api-access-xsclb\") pod \"swift-proxy-5b685f5b9-4db6w\" (UID: \"559addbd-1bc6-4146-9a27-ce3e1d3d08fd\") " pod="openstack/swift-proxy-5b685f5b9-4db6w" Feb 20 10:15:23 crc kubenswrapper[4962]: I0220 10:15:23.230974 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-combined-ca-bundle\") pod \"swift-proxy-5b685f5b9-4db6w\" (UID: \"559addbd-1bc6-4146-9a27-ce3e1d3d08fd\") " pod="openstack/swift-proxy-5b685f5b9-4db6w" Feb 20 10:15:23 crc kubenswrapper[4962]: I0220 10:15:23.231187 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-internal-tls-certs\") pod \"swift-proxy-5b685f5b9-4db6w\" (UID: \"559addbd-1bc6-4146-9a27-ce3e1d3d08fd\") " pod="openstack/swift-proxy-5b685f5b9-4db6w" Feb 20 10:15:23 crc kubenswrapper[4962]: I0220 10:15:23.231314 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-config-data\") pod \"swift-proxy-5b685f5b9-4db6w\" (UID: \"559addbd-1bc6-4146-9a27-ce3e1d3d08fd\") " pod="openstack/swift-proxy-5b685f5b9-4db6w" Feb 20 10:15:23 crc kubenswrapper[4962]: I0220 10:15:23.231457 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-public-tls-certs\") pod \"swift-proxy-5b685f5b9-4db6w\" (UID: \"559addbd-1bc6-4146-9a27-ce3e1d3d08fd\") " pod="openstack/swift-proxy-5b685f5b9-4db6w" Feb 20 10:15:23 crc kubenswrapper[4962]: I0220 10:15:23.231622 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-etc-swift\") pod \"swift-proxy-5b685f5b9-4db6w\" (UID: \"559addbd-1bc6-4146-9a27-ce3e1d3d08fd\") " pod="openstack/swift-proxy-5b685f5b9-4db6w" Feb 20 10:15:23 crc kubenswrapper[4962]: I0220 10:15:23.231780 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-log-httpd\") pod \"swift-proxy-5b685f5b9-4db6w\" (UID: \"559addbd-1bc6-4146-9a27-ce3e1d3d08fd\") " 
pod="openstack/swift-proxy-5b685f5b9-4db6w" Feb 20 10:15:23 crc kubenswrapper[4962]: I0220 10:15:23.231862 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-run-httpd\") pod \"swift-proxy-5b685f5b9-4db6w\" (UID: \"559addbd-1bc6-4146-9a27-ce3e1d3d08fd\") " pod="openstack/swift-proxy-5b685f5b9-4db6w" Feb 20 10:15:23 crc kubenswrapper[4962]: I0220 10:15:23.334158 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-public-tls-certs\") pod \"swift-proxy-5b685f5b9-4db6w\" (UID: \"559addbd-1bc6-4146-9a27-ce3e1d3d08fd\") " pod="openstack/swift-proxy-5b685f5b9-4db6w" Feb 20 10:15:23 crc kubenswrapper[4962]: I0220 10:15:23.334271 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-etc-swift\") pod \"swift-proxy-5b685f5b9-4db6w\" (UID: \"559addbd-1bc6-4146-9a27-ce3e1d3d08fd\") " pod="openstack/swift-proxy-5b685f5b9-4db6w" Feb 20 10:15:23 crc kubenswrapper[4962]: I0220 10:15:23.335435 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-log-httpd\") pod \"swift-proxy-5b685f5b9-4db6w\" (UID: \"559addbd-1bc6-4146-9a27-ce3e1d3d08fd\") " pod="openstack/swift-proxy-5b685f5b9-4db6w" Feb 20 10:15:23 crc kubenswrapper[4962]: I0220 10:15:23.335462 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-run-httpd\") pod \"swift-proxy-5b685f5b9-4db6w\" (UID: \"559addbd-1bc6-4146-9a27-ce3e1d3d08fd\") " pod="openstack/swift-proxy-5b685f5b9-4db6w" Feb 20 10:15:23 crc kubenswrapper[4962]: I0220 10:15:23.335489 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xsclb\" (UniqueName: \"kubernetes.io/projected/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-kube-api-access-xsclb\") pod \"swift-proxy-5b685f5b9-4db6w\" (UID: \"559addbd-1bc6-4146-9a27-ce3e1d3d08fd\") " pod="openstack/swift-proxy-5b685f5b9-4db6w" Feb 20 10:15:23 crc kubenswrapper[4962]: I0220 10:15:23.335519 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-combined-ca-bundle\") pod \"swift-proxy-5b685f5b9-4db6w\" (UID: \"559addbd-1bc6-4146-9a27-ce3e1d3d08fd\") " pod="openstack/swift-proxy-5b685f5b9-4db6w" Feb 20 10:15:23 crc kubenswrapper[4962]: I0220 10:15:23.335546 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-internal-tls-certs\") pod \"swift-proxy-5b685f5b9-4db6w\" (UID: \"559addbd-1bc6-4146-9a27-ce3e1d3d08fd\") " pod="openstack/swift-proxy-5b685f5b9-4db6w" Feb 20 10:15:23 crc kubenswrapper[4962]: I0220 10:15:23.335588 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-config-data\") pod \"swift-proxy-5b685f5b9-4db6w\" (UID: \"559addbd-1bc6-4146-9a27-ce3e1d3d08fd\") " pod="openstack/swift-proxy-5b685f5b9-4db6w" Feb 20 10:15:23 crc 
kubenswrapper[4962]: I0220 10:15:23.336559 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-log-httpd\") pod \"swift-proxy-5b685f5b9-4db6w\" (UID: \"559addbd-1bc6-4146-9a27-ce3e1d3d08fd\") " pod="openstack/swift-proxy-5b685f5b9-4db6w" Feb 20 10:15:23 crc kubenswrapper[4962]: I0220 10:15:23.336936 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-run-httpd\") pod \"swift-proxy-5b685f5b9-4db6w\" (UID: \"559addbd-1bc6-4146-9a27-ce3e1d3d08fd\") " pod="openstack/swift-proxy-5b685f5b9-4db6w" Feb 20 10:15:23 crc kubenswrapper[4962]: I0220 10:15:23.341423 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-internal-tls-certs\") pod \"swift-proxy-5b685f5b9-4db6w\" (UID: \"559addbd-1bc6-4146-9a27-ce3e1d3d08fd\") " pod="openstack/swift-proxy-5b685f5b9-4db6w" Feb 20 10:15:23 crc kubenswrapper[4962]: I0220 10:15:23.341748 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-config-data\") pod \"swift-proxy-5b685f5b9-4db6w\" (UID: \"559addbd-1bc6-4146-9a27-ce3e1d3d08fd\") " pod="openstack/swift-proxy-5b685f5b9-4db6w" Feb 20 10:15:23 crc kubenswrapper[4962]: I0220 10:15:23.341952 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-public-tls-certs\") pod \"swift-proxy-5b685f5b9-4db6w\" (UID: \"559addbd-1bc6-4146-9a27-ce3e1d3d08fd\") " pod="openstack/swift-proxy-5b685f5b9-4db6w" Feb 20 10:15:23 crc kubenswrapper[4962]: I0220 10:15:23.343118 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-combined-ca-bundle\") pod \"swift-proxy-5b685f5b9-4db6w\" (UID: \"559addbd-1bc6-4146-9a27-ce3e1d3d08fd\") " pod="openstack/swift-proxy-5b685f5b9-4db6w" Feb 20 10:15:23 crc kubenswrapper[4962]: I0220 10:15:23.347289 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-etc-swift\") pod \"swift-proxy-5b685f5b9-4db6w\" (UID: \"559addbd-1bc6-4146-9a27-ce3e1d3d08fd\") " pod="openstack/swift-proxy-5b685f5b9-4db6w" Feb 20 10:15:23 crc kubenswrapper[4962]: I0220 10:15:23.352563 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xsclb\" (UniqueName: \"kubernetes.io/projected/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-kube-api-access-xsclb\") pod \"swift-proxy-5b685f5b9-4db6w\" (UID: \"559addbd-1bc6-4146-9a27-ce3e1d3d08fd\") " pod="openstack/swift-proxy-5b685f5b9-4db6w" Feb 20 10:15:23 crc kubenswrapper[4962]: I0220 10:15:23.550367 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-5b685f5b9-4db6w" Feb 20 10:15:23 crc kubenswrapper[4962]: I0220 10:15:23.768898 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 20 10:15:24 crc kubenswrapper[4962]: I0220 10:15:24.004473 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5b685f5b9-4db6w"] Feb 20 10:15:24 crc kubenswrapper[4962]: W0220 10:15:24.010579 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod559addbd_1bc6_4146_9a27_ce3e1d3d08fd.slice/crio-cc47509aa1ca6c26cc469e518128ac8e3dbaf917ad6f17beac89df46710d9f73 WatchSource:0}: Error finding container cc47509aa1ca6c26cc469e518128ac8e3dbaf917ad6f17beac89df46710d9f73: Status 404 returned error can't find the container with id cc47509aa1ca6c26cc469e518128ac8e3dbaf917ad6f17beac89df46710d9f73 Feb 20 10:15:24 crc kubenswrapper[4962]: I0220 10:15:24.095886 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 20 10:15:24 crc kubenswrapper[4962]: I0220 10:15:24.096655 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="18d02cdb-5de5-457e-9f17-1cc3ba51ca55" containerName="ceilometer-central-agent" containerID="cri-o://eadd6481e402308012bf3cb666163e04ad34b5f97c3780834cf912ecffa0bc84" gracePeriod=30 Feb 20 10:15:24 crc kubenswrapper[4962]: I0220 10:15:24.096794 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="18d02cdb-5de5-457e-9f17-1cc3ba51ca55" containerName="sg-core" containerID="cri-o://3dfd1ca1a6fa866b05ee6afe9233f010951bb6a3a4b6711a05320da1e6e882be" gracePeriod=30 Feb 20 10:15:24 crc kubenswrapper[4962]: I0220 10:15:24.096856 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="18d02cdb-5de5-457e-9f17-1cc3ba51ca55" containerName="ceilometer-notification-agent" containerID="cri-o://884bc1c8181b4c262029886834ab81b7b4d9ec6c2cb240fa256637239957413f" gracePeriod=30 Feb 20 10:15:24 crc kubenswrapper[4962]: I0220 10:15:24.097327 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="18d02cdb-5de5-457e-9f17-1cc3ba51ca55" containerName="proxy-httpd" containerID="cri-o://eb0e72879a25aa9037cdc6292f11db919678280642193cd3d3aba5fbe6ccd606" gracePeriod=30 Feb 20 10:15:24 crc kubenswrapper[4962]: I0220 10:15:24.109583 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="18d02cdb-5de5-457e-9f17-1cc3ba51ca55" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.163:3000/\": EOF" Feb 20 10:15:24 crc kubenswrapper[4962]: I0220 10:15:24.310104 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5b685f5b9-4db6w" event={"ID":"559addbd-1bc6-4146-9a27-ce3e1d3d08fd","Type":"ContainerStarted","Data":"68a406d2a6eadc4116c120af687c887ef22a20b066ec54d2d6991bd97aaef0e9"} Feb 20 10:15:24 crc kubenswrapper[4962]: I0220 10:15:24.310164 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5b685f5b9-4db6w" event={"ID":"559addbd-1bc6-4146-9a27-ce3e1d3d08fd","Type":"ContainerStarted","Data":"cc47509aa1ca6c26cc469e518128ac8e3dbaf917ad6f17beac89df46710d9f73"} Feb 20 10:15:24 crc kubenswrapper[4962]: I0220 10:15:24.318319 4962 generic.go:334] "Generic (PLEG): container finished" 
podID="18d02cdb-5de5-457e-9f17-1cc3ba51ca55" containerID="3dfd1ca1a6fa866b05ee6afe9233f010951bb6a3a4b6711a05320da1e6e882be" exitCode=2 Feb 20 10:15:24 crc kubenswrapper[4962]: I0220 10:15:24.318393 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"18d02cdb-5de5-457e-9f17-1cc3ba51ca55","Type":"ContainerDied","Data":"3dfd1ca1a6fa866b05ee6afe9233f010951bb6a3a4b6711a05320da1e6e882be"} Feb 20 10:15:25 crc kubenswrapper[4962]: I0220 10:15:25.334666 4962 generic.go:334] "Generic (PLEG): container finished" podID="18d02cdb-5de5-457e-9f17-1cc3ba51ca55" containerID="eb0e72879a25aa9037cdc6292f11db919678280642193cd3d3aba5fbe6ccd606" exitCode=0 Feb 20 10:15:25 crc kubenswrapper[4962]: I0220 10:15:25.335045 4962 generic.go:334] "Generic (PLEG): container finished" podID="18d02cdb-5de5-457e-9f17-1cc3ba51ca55" containerID="eadd6481e402308012bf3cb666163e04ad34b5f97c3780834cf912ecffa0bc84" exitCode=0 Feb 20 10:15:25 crc kubenswrapper[4962]: I0220 10:15:25.334728 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"18d02cdb-5de5-457e-9f17-1cc3ba51ca55","Type":"ContainerDied","Data":"eb0e72879a25aa9037cdc6292f11db919678280642193cd3d3aba5fbe6ccd606"} Feb 20 10:15:25 crc kubenswrapper[4962]: I0220 10:15:25.335108 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"18d02cdb-5de5-457e-9f17-1cc3ba51ca55","Type":"ContainerDied","Data":"eadd6481e402308012bf3cb666163e04ad34b5f97c3780834cf912ecffa0bc84"} Feb 20 10:15:25 crc kubenswrapper[4962]: I0220 10:15:25.338222 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5b685f5b9-4db6w" event={"ID":"559addbd-1bc6-4146-9a27-ce3e1d3d08fd","Type":"ContainerStarted","Data":"42f878706ae1a7e2114a67d56b43328ffc07645b6f77f8f9d20b6c4a2aec6632"} Feb 20 10:15:25 crc kubenswrapper[4962]: I0220 10:15:25.338493 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5b685f5b9-4db6w" Feb 20 10:15:25 crc kubenswrapper[4962]: I0220 10:15:25.338512 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5b685f5b9-4db6w" Feb 20 10:15:25 crc kubenswrapper[4962]: I0220 10:15:25.367681 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-5b685f5b9-4db6w" podStartSLOduration=2.367445644 podStartE2EDuration="2.367445644s" podCreationTimestamp="2026-02-20 10:15:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:15:25.363074467 +0000 UTC m=+1216.945546323" watchObservedRunningTime="2026-02-20 10:15:25.367445644 +0000 UTC m=+1216.949917480" Feb 20 10:15:25 crc kubenswrapper[4962]: I0220 10:15:25.402463 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 20 10:15:25 crc kubenswrapper[4962]: I0220 10:15:25.404650 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="5d49a0b0-18e1-4701-9a94-5ff22700ffdf" containerName="glance-log" containerID="cri-o://eee28e4c70ffa00bd2365deec6d15fc3972d99a3f4797b06e78feaf11cf564f9" gracePeriod=30 Feb 20 10:15:25 crc kubenswrapper[4962]: I0220 10:15:25.404684 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="5d49a0b0-18e1-4701-9a94-5ff22700ffdf" containerName="glance-httpd" 
containerID="cri-o://d20fc2b7dd54adff0d815d504246ad4e77027f8694190be99bf82bc96b1f4c9f" gracePeriod=30 Feb 20 10:15:26 crc kubenswrapper[4962]: I0220 10:15:26.366405 4962 generic.go:334] "Generic (PLEG): container finished" podID="5d49a0b0-18e1-4701-9a94-5ff22700ffdf" containerID="eee28e4c70ffa00bd2365deec6d15fc3972d99a3f4797b06e78feaf11cf564f9" exitCode=143 Feb 20 10:15:26 crc kubenswrapper[4962]: I0220 10:15:26.366691 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5d49a0b0-18e1-4701-9a94-5ff22700ffdf","Type":"ContainerDied","Data":"eee28e4c70ffa00bd2365deec6d15fc3972d99a3f4797b06e78feaf11cf564f9"} Feb 20 10:15:27 crc kubenswrapper[4962]: I0220 10:15:27.395379 4962 generic.go:334] "Generic (PLEG): container finished" podID="18d02cdb-5de5-457e-9f17-1cc3ba51ca55" containerID="884bc1c8181b4c262029886834ab81b7b4d9ec6c2cb240fa256637239957413f" exitCode=0 Feb 20 10:15:27 crc kubenswrapper[4962]: I0220 10:15:27.396670 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"18d02cdb-5de5-457e-9f17-1cc3ba51ca55","Type":"ContainerDied","Data":"884bc1c8181b4c262029886834ab81b7b4d9ec6c2cb240fa256637239957413f"} Feb 20 10:15:27 crc kubenswrapper[4962]: I0220 10:15:27.410540 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 10:15:27 crc kubenswrapper[4962]: I0220 10:15:27.410901 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="eb8bed08-cb47-42cb-a192-2545a14e4c4b" containerName="glance-log" containerID="cri-o://888c0df412b752895b61127294d75f746f54944c18df1a5600dd20d1b268288d" gracePeriod=30 Feb 20 10:15:27 crc kubenswrapper[4962]: I0220 10:15:27.411098 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="eb8bed08-cb47-42cb-a192-2545a14e4c4b" containerName="glance-httpd" containerID="cri-o://ed6b084d34e4657f5b3865a48d0866534c9fe6dd73d3021c88e80799cfa08dc0" gracePeriod=30 Feb 20 10:15:28 crc kubenswrapper[4962]: I0220 10:15:28.409700 4962 generic.go:334] "Generic (PLEG): container finished" podID="eb8bed08-cb47-42cb-a192-2545a14e4c4b" containerID="888c0df412b752895b61127294d75f746f54944c18df1a5600dd20d1b268288d" exitCode=143 Feb 20 10:15:28 crc kubenswrapper[4962]: I0220 10:15:28.410542 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"eb8bed08-cb47-42cb-a192-2545a14e4c4b","Type":"ContainerDied","Data":"888c0df412b752895b61127294d75f746f54944c18df1a5600dd20d1b268288d"} Feb 20 10:15:29 crc kubenswrapper[4962]: I0220 10:15:29.423644 4962 generic.go:334] "Generic (PLEG): container finished" podID="5d49a0b0-18e1-4701-9a94-5ff22700ffdf" containerID="d20fc2b7dd54adff0d815d504246ad4e77027f8694190be99bf82bc96b1f4c9f" exitCode=0 Feb 20 10:15:29 crc kubenswrapper[4962]: I0220 10:15:29.423810 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5d49a0b0-18e1-4701-9a94-5ff22700ffdf","Type":"ContainerDied","Data":"d20fc2b7dd54adff0d815d504246ad4e77027f8694190be99bf82bc96b1f4c9f"} Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.474552 4962 generic.go:334] "Generic (PLEG): container finished" podID="eb8bed08-cb47-42cb-a192-2545a14e4c4b" containerID="ed6b084d34e4657f5b3865a48d0866534c9fe6dd73d3021c88e80799cfa08dc0" exitCode=0 Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 
10:15:31.474881 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"eb8bed08-cb47-42cb-a192-2545a14e4c4b","Type":"ContainerDied","Data":"ed6b084d34e4657f5b3865a48d0866534c9fe6dd73d3021c88e80799cfa08dc0"} Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.584454 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.744560 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg88s\" (UniqueName: \"kubernetes.io/projected/18d02cdb-5de5-457e-9f17-1cc3ba51ca55-kube-api-access-mg88s\") pod \"18d02cdb-5de5-457e-9f17-1cc3ba51ca55\" (UID: \"18d02cdb-5de5-457e-9f17-1cc3ba51ca55\") " Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.749829 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18d02cdb-5de5-457e-9f17-1cc3ba51ca55-config-data\") pod \"18d02cdb-5de5-457e-9f17-1cc3ba51ca55\" (UID: \"18d02cdb-5de5-457e-9f17-1cc3ba51ca55\") " Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.749873 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18d02cdb-5de5-457e-9f17-1cc3ba51ca55-log-httpd\") pod \"18d02cdb-5de5-457e-9f17-1cc3ba51ca55\" (UID: \"18d02cdb-5de5-457e-9f17-1cc3ba51ca55\") " Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.749929 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/18d02cdb-5de5-457e-9f17-1cc3ba51ca55-sg-core-conf-yaml\") pod \"18d02cdb-5de5-457e-9f17-1cc3ba51ca55\" (UID: \"18d02cdb-5de5-457e-9f17-1cc3ba51ca55\") " Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.750117 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18d02cdb-5de5-457e-9f17-1cc3ba51ca55-run-httpd\") pod \"18d02cdb-5de5-457e-9f17-1cc3ba51ca55\" (UID: \"18d02cdb-5de5-457e-9f17-1cc3ba51ca55\") " Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.750201 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18d02cdb-5de5-457e-9f17-1cc3ba51ca55-combined-ca-bundle\") pod \"18d02cdb-5de5-457e-9f17-1cc3ba51ca55\" (UID: \"18d02cdb-5de5-457e-9f17-1cc3ba51ca55\") " Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.750237 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18d02cdb-5de5-457e-9f17-1cc3ba51ca55-scripts\") pod \"18d02cdb-5de5-457e-9f17-1cc3ba51ca55\" (UID: \"18d02cdb-5de5-457e-9f17-1cc3ba51ca55\") " Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.751016 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18d02cdb-5de5-457e-9f17-1cc3ba51ca55-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "18d02cdb-5de5-457e-9f17-1cc3ba51ca55" (UID: "18d02cdb-5de5-457e-9f17-1cc3ba51ca55"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.751267 4962 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18d02cdb-5de5-457e-9f17-1cc3ba51ca55-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.751312 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18d02cdb-5de5-457e-9f17-1cc3ba51ca55-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "18d02cdb-5de5-457e-9f17-1cc3ba51ca55" (UID: "18d02cdb-5de5-457e-9f17-1cc3ba51ca55"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.753187 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18d02cdb-5de5-457e-9f17-1cc3ba51ca55-kube-api-access-mg88s" (OuterVolumeSpecName: "kube-api-access-mg88s") pod "18d02cdb-5de5-457e-9f17-1cc3ba51ca55" (UID: "18d02cdb-5de5-457e-9f17-1cc3ba51ca55"). InnerVolumeSpecName "kube-api-access-mg88s". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.755653 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18d02cdb-5de5-457e-9f17-1cc3ba51ca55-scripts" (OuterVolumeSpecName: "scripts") pod "18d02cdb-5de5-457e-9f17-1cc3ba51ca55" (UID: "18d02cdb-5de5-457e-9f17-1cc3ba51ca55"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.777650 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.792292 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18d02cdb-5de5-457e-9f17-1cc3ba51ca55-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "18d02cdb-5de5-457e-9f17-1cc3ba51ca55" (UID: "18d02cdb-5de5-457e-9f17-1cc3ba51ca55"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.850737 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.853750 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-smt92\" (UniqueName: \"kubernetes.io/projected/5d49a0b0-18e1-4701-9a94-5ff22700ffdf-kube-api-access-smt92\") pod \"5d49a0b0-18e1-4701-9a94-5ff22700ffdf\" (UID: \"5d49a0b0-18e1-4701-9a94-5ff22700ffdf\") " Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.853910 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"5d49a0b0-18e1-4701-9a94-5ff22700ffdf\" (UID: \"5d49a0b0-18e1-4701-9a94-5ff22700ffdf\") " Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.854680 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18d02cdb-5de5-457e-9f17-1cc3ba51ca55-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.854696 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg88s\" (UniqueName: \"kubernetes.io/projected/18d02cdb-5de5-457e-9f17-1cc3ba51ca55-kube-api-access-mg88s\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.854707 4962 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18d02cdb-5de5-457e-9f17-1cc3ba51ca55-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.854718 4962 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/18d02cdb-5de5-457e-9f17-1cc3ba51ca55-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.859563 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18d02cdb-5de5-457e-9f17-1cc3ba51ca55-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "18d02cdb-5de5-457e-9f17-1cc3ba51ca55" (UID: "18d02cdb-5de5-457e-9f17-1cc3ba51ca55"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.860504 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "5d49a0b0-18e1-4701-9a94-5ff22700ffdf" (UID: "5d49a0b0-18e1-4701-9a94-5ff22700ffdf"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.860748 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d49a0b0-18e1-4701-9a94-5ff22700ffdf-kube-api-access-smt92" (OuterVolumeSpecName: "kube-api-access-smt92") pod "5d49a0b0-18e1-4701-9a94-5ff22700ffdf" (UID: "5d49a0b0-18e1-4701-9a94-5ff22700ffdf"). InnerVolumeSpecName "kube-api-access-smt92". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.950438 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18d02cdb-5de5-457e-9f17-1cc3ba51ca55-config-data" (OuterVolumeSpecName: "config-data") pod "18d02cdb-5de5-457e-9f17-1cc3ba51ca55" (UID: "18d02cdb-5de5-457e-9f17-1cc3ba51ca55"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.955676 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb8bed08-cb47-42cb-a192-2545a14e4c4b-scripts\") pod \"eb8bed08-cb47-42cb-a192-2545a14e4c4b\" (UID: \"eb8bed08-cb47-42cb-a192-2545a14e4c4b\") " Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.955835 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d49a0b0-18e1-4701-9a94-5ff22700ffdf-logs\") pod \"5d49a0b0-18e1-4701-9a94-5ff22700ffdf\" (UID: \"5d49a0b0-18e1-4701-9a94-5ff22700ffdf\") " Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.955932 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb8bed08-cb47-42cb-a192-2545a14e4c4b-config-data\") pod \"eb8bed08-cb47-42cb-a192-2545a14e4c4b\" (UID: \"eb8bed08-cb47-42cb-a192-2545a14e4c4b\") " Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.955997 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb8bed08-cb47-42cb-a192-2545a14e4c4b-combined-ca-bundle\") pod \"eb8bed08-cb47-42cb-a192-2545a14e4c4b\" (UID: \"eb8bed08-cb47-42cb-a192-2545a14e4c4b\") " Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.956077 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d49a0b0-18e1-4701-9a94-5ff22700ffdf-internal-tls-certs\") pod \"5d49a0b0-18e1-4701-9a94-5ff22700ffdf\" (UID: \"5d49a0b0-18e1-4701-9a94-5ff22700ffdf\") " Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.956113 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb8bed08-cb47-42cb-a192-2545a14e4c4b-logs\") pod \"eb8bed08-cb47-42cb-a192-2545a14e4c4b\" (UID: \"eb8bed08-cb47-42cb-a192-2545a14e4c4b\") " Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.956196 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5d49a0b0-18e1-4701-9a94-5ff22700ffdf-httpd-run\") pod \"5d49a0b0-18e1-4701-9a94-5ff22700ffdf\" (UID: \"5d49a0b0-18e1-4701-9a94-5ff22700ffdf\") " Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.956230 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb8bed08-cb47-42cb-a192-2545a14e4c4b-public-tls-certs\") pod \"eb8bed08-cb47-42cb-a192-2545a14e4c4b\" (UID: \"eb8bed08-cb47-42cb-a192-2545a14e4c4b\") " Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.956279 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d49a0b0-18e1-4701-9a94-5ff22700ffdf-config-data\") pod \"5d49a0b0-18e1-4701-9a94-5ff22700ffdf\" (UID: \"5d49a0b0-18e1-4701-9a94-5ff22700ffdf\") " Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.956309 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d49a0b0-18e1-4701-9a94-5ff22700ffdf-combined-ca-bundle\") pod \"5d49a0b0-18e1-4701-9a94-5ff22700ffdf\" (UID: \"5d49a0b0-18e1-4701-9a94-5ff22700ffdf\") " Feb 20 10:15:31 crc 
kubenswrapper[4962]: I0220 10:15:31.956330 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d49a0b0-18e1-4701-9a94-5ff22700ffdf-scripts\") pod \"5d49a0b0-18e1-4701-9a94-5ff22700ffdf\" (UID: \"5d49a0b0-18e1-4701-9a94-5ff22700ffdf\") " Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.956364 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eb8bed08-cb47-42cb-a192-2545a14e4c4b-httpd-run\") pod \"eb8bed08-cb47-42cb-a192-2545a14e4c4b\" (UID: \"eb8bed08-cb47-42cb-a192-2545a14e4c4b\") " Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.956737 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d49a0b0-18e1-4701-9a94-5ff22700ffdf-logs" (OuterVolumeSpecName: "logs") pod "5d49a0b0-18e1-4701-9a94-5ff22700ffdf" (UID: "5d49a0b0-18e1-4701-9a94-5ff22700ffdf"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.956878 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d49a0b0-18e1-4701-9a94-5ff22700ffdf-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "5d49a0b0-18e1-4701-9a94-5ff22700ffdf" (UID: "5d49a0b0-18e1-4701-9a94-5ff22700ffdf"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.956981 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb8bed08-cb47-42cb-a192-2545a14e4c4b-logs" (OuterVolumeSpecName: "logs") pod "eb8bed08-cb47-42cb-a192-2545a14e4c4b" (UID: "eb8bed08-cb47-42cb-a192-2545a14e4c4b"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.957580 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18d02cdb-5de5-457e-9f17-1cc3ba51ca55-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.957615 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d49a0b0-18e1-4701-9a94-5ff22700ffdf-logs\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.957625 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18d02cdb-5de5-457e-9f17-1cc3ba51ca55-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.957636 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-smt92\" (UniqueName: \"kubernetes.io/projected/5d49a0b0-18e1-4701-9a94-5ff22700ffdf-kube-api-access-smt92\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.957649 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb8bed08-cb47-42cb-a192-2545a14e4c4b-logs\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.957671 4962 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.957680 4962 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5d49a0b0-18e1-4701-9a94-5ff22700ffdf-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.963682 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d49a0b0-18e1-4701-9a94-5ff22700ffdf-scripts" (OuterVolumeSpecName: "scripts") pod "5d49a0b0-18e1-4701-9a94-5ff22700ffdf" (UID: "5d49a0b0-18e1-4701-9a94-5ff22700ffdf"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.964434 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb8bed08-cb47-42cb-a192-2545a14e4c4b-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "eb8bed08-cb47-42cb-a192-2545a14e4c4b" (UID: "eb8bed08-cb47-42cb-a192-2545a14e4c4b"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.972734 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb8bed08-cb47-42cb-a192-2545a14e4c4b-scripts" (OuterVolumeSpecName: "scripts") pod "eb8bed08-cb47-42cb-a192-2545a14e4c4b" (UID: "eb8bed08-cb47-42cb-a192-2545a14e4c4b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:15:31 crc kubenswrapper[4962]: I0220 10:15:31.990136 4962 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.001172 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb8bed08-cb47-42cb-a192-2545a14e4c4b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eb8bed08-cb47-42cb-a192-2545a14e4c4b" (UID: "eb8bed08-cb47-42cb-a192-2545a14e4c4b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.032579 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d49a0b0-18e1-4701-9a94-5ff22700ffdf-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "5d49a0b0-18e1-4701-9a94-5ff22700ffdf" (UID: "5d49a0b0-18e1-4701-9a94-5ff22700ffdf"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.036043 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d49a0b0-18e1-4701-9a94-5ff22700ffdf-config-data" (OuterVolumeSpecName: "config-data") pod "5d49a0b0-18e1-4701-9a94-5ff22700ffdf" (UID: "5d49a0b0-18e1-4701-9a94-5ff22700ffdf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.036289 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d49a0b0-18e1-4701-9a94-5ff22700ffdf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5d49a0b0-18e1-4701-9a94-5ff22700ffdf" (UID: "5d49a0b0-18e1-4701-9a94-5ff22700ffdf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.047050 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb8bed08-cb47-42cb-a192-2545a14e4c4b-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "eb8bed08-cb47-42cb-a192-2545a14e4c4b" (UID: "eb8bed08-cb47-42cb-a192-2545a14e4c4b"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.050422 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb8bed08-cb47-42cb-a192-2545a14e4c4b-config-data" (OuterVolumeSpecName: "config-data") pod "eb8bed08-cb47-42cb-a192-2545a14e4c4b" (UID: "eb8bed08-cb47-42cb-a192-2545a14e4c4b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.059578 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xqshj\" (UniqueName: \"kubernetes.io/projected/eb8bed08-cb47-42cb-a192-2545a14e4c4b-kube-api-access-xqshj\") pod \"eb8bed08-cb47-42cb-a192-2545a14e4c4b\" (UID: \"eb8bed08-cb47-42cb-a192-2545a14e4c4b\") " Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.059735 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"eb8bed08-cb47-42cb-a192-2545a14e4c4b\" (UID: \"eb8bed08-cb47-42cb-a192-2545a14e4c4b\") " Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.060478 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb8bed08-cb47-42cb-a192-2545a14e4c4b-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.060506 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb8bed08-cb47-42cb-a192-2545a14e4c4b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.060519 4962 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d49a0b0-18e1-4701-9a94-5ff22700ffdf-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.060532 4962 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.060543 4962 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb8bed08-cb47-42cb-a192-2545a14e4c4b-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.060552 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d49a0b0-18e1-4701-9a94-5ff22700ffdf-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.060566 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d49a0b0-18e1-4701-9a94-5ff22700ffdf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.060578 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d49a0b0-18e1-4701-9a94-5ff22700ffdf-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.060604 4962 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eb8bed08-cb47-42cb-a192-2545a14e4c4b-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.060615 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb8bed08-cb47-42cb-a192-2545a14e4c4b-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.063235 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/eb8bed08-cb47-42cb-a192-2545a14e4c4b-kube-api-access-xqshj" (OuterVolumeSpecName: "kube-api-access-xqshj") pod "eb8bed08-cb47-42cb-a192-2545a14e4c4b" (UID: "eb8bed08-cb47-42cb-a192-2545a14e4c4b"). InnerVolumeSpecName "kube-api-access-xqshj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.063537 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "eb8bed08-cb47-42cb-a192-2545a14e4c4b" (UID: "eb8bed08-cb47-42cb-a192-2545a14e4c4b"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.164927 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xqshj\" (UniqueName: \"kubernetes.io/projected/eb8bed08-cb47-42cb-a192-2545a14e4c4b-kube-api-access-xqshj\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.164993 4962 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.186530 4962 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.267852 4962 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.487964 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"755ca463-8c62-402c-8a88-a066fb38b521","Type":"ContainerStarted","Data":"58314faa8bcfe5f5f7afbcc99e392370d5f2737c5567814db10eda41512d6621"} Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.491304 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5d49a0b0-18e1-4701-9a94-5ff22700ffdf","Type":"ContainerDied","Data":"80442f73d0bb0b1518e80cca1be32921f22e35f7b8a9442cfe4b3a67ae521feb"} Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.491356 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.491383 4962 scope.go:117] "RemoveContainer" containerID="d20fc2b7dd54adff0d815d504246ad4e77027f8694190be99bf82bc96b1f4c9f" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.496280 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"eb8bed08-cb47-42cb-a192-2545a14e4c4b","Type":"ContainerDied","Data":"9040b55e8e99f198b86e6b7541c702ac840e9ec2debbf8648bac866cfdc48248"} Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.496356 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.500674 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"18d02cdb-5de5-457e-9f17-1cc3ba51ca55","Type":"ContainerDied","Data":"26bc9af9660938fb18d75442b06d47f88d7b3c3c743cee58b4e39774f040ef5e"} Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.500883 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.510904 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=3.004548925 podStartE2EDuration="13.510880366s" podCreationTimestamp="2026-02-20 10:15:19 +0000 UTC" firstStartedPulling="2026-02-20 10:15:20.898174433 +0000 UTC m=+1212.480646279" lastFinishedPulling="2026-02-20 10:15:31.404505874 +0000 UTC m=+1222.986977720" observedRunningTime="2026-02-20 10:15:32.50811539 +0000 UTC m=+1224.090587236" watchObservedRunningTime="2026-02-20 10:15:32.510880366 +0000 UTC m=+1224.093352232" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.533433 4962 scope.go:117] "RemoveContainer" containerID="eee28e4c70ffa00bd2365deec6d15fc3972d99a3f4797b06e78feaf11cf564f9" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.566988 4962 scope.go:117] "RemoveContainer" containerID="ed6b084d34e4657f5b3865a48d0866534c9fe6dd73d3021c88e80799cfa08dc0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.576070 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.593497 4962 scope.go:117] "RemoveContainer" containerID="888c0df412b752895b61127294d75f746f54944c18df1a5600dd20d1b268288d" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.595906 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.611353 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.623898 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.631892 4962 scope.go:117] "RemoveContainer" containerID="eb0e72879a25aa9037cdc6292f11db919678280642193cd3d3aba5fbe6ccd606" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.634776 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.655793 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.658952 4962 scope.go:117] "RemoveContainer" containerID="3dfd1ca1a6fa866b05ee6afe9233f010951bb6a3a4b6711a05320da1e6e882be" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.666153 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 20 10:15:32 crc kubenswrapper[4962]: E0220 10:15:32.666708 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18d02cdb-5de5-457e-9f17-1cc3ba51ca55" containerName="sg-core" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.666725 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="18d02cdb-5de5-457e-9f17-1cc3ba51ca55" containerName="sg-core" Feb 20 
10:15:32 crc kubenswrapper[4962]: E0220 10:15:32.666737 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb8bed08-cb47-42cb-a192-2545a14e4c4b" containerName="glance-log" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.666744 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb8bed08-cb47-42cb-a192-2545a14e4c4b" containerName="glance-log" Feb 20 10:15:32 crc kubenswrapper[4962]: E0220 10:15:32.666756 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb8bed08-cb47-42cb-a192-2545a14e4c4b" containerName="glance-httpd" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.666764 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb8bed08-cb47-42cb-a192-2545a14e4c4b" containerName="glance-httpd" Feb 20 10:15:32 crc kubenswrapper[4962]: E0220 10:15:32.666784 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18d02cdb-5de5-457e-9f17-1cc3ba51ca55" containerName="ceilometer-central-agent" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.666790 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="18d02cdb-5de5-457e-9f17-1cc3ba51ca55" containerName="ceilometer-central-agent" Feb 20 10:15:32 crc kubenswrapper[4962]: E0220 10:15:32.666819 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d49a0b0-18e1-4701-9a94-5ff22700ffdf" containerName="glance-httpd" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.666824 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d49a0b0-18e1-4701-9a94-5ff22700ffdf" containerName="glance-httpd" Feb 20 10:15:32 crc kubenswrapper[4962]: E0220 10:15:32.666831 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18d02cdb-5de5-457e-9f17-1cc3ba51ca55" containerName="ceilometer-notification-agent" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.666839 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="18d02cdb-5de5-457e-9f17-1cc3ba51ca55" containerName="ceilometer-notification-agent" Feb 20 10:15:32 crc kubenswrapper[4962]: E0220 10:15:32.666856 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18d02cdb-5de5-457e-9f17-1cc3ba51ca55" containerName="proxy-httpd" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.666861 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="18d02cdb-5de5-457e-9f17-1cc3ba51ca55" containerName="proxy-httpd" Feb 20 10:15:32 crc kubenswrapper[4962]: E0220 10:15:32.666873 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d49a0b0-18e1-4701-9a94-5ff22700ffdf" containerName="glance-log" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.666881 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d49a0b0-18e1-4701-9a94-5ff22700ffdf" containerName="glance-log" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.667146 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="18d02cdb-5de5-457e-9f17-1cc3ba51ca55" containerName="ceilometer-central-agent" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.667167 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d49a0b0-18e1-4701-9a94-5ff22700ffdf" containerName="glance-httpd" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.667178 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="18d02cdb-5de5-457e-9f17-1cc3ba51ca55" containerName="ceilometer-notification-agent" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.667191 4962 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="18d02cdb-5de5-457e-9f17-1cc3ba51ca55" containerName="sg-core" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.667200 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb8bed08-cb47-42cb-a192-2545a14e4c4b" containerName="glance-log" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.667214 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="18d02cdb-5de5-457e-9f17-1cc3ba51ca55" containerName="proxy-httpd" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.667223 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d49a0b0-18e1-4701-9a94-5ff22700ffdf" containerName="glance-log" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.667230 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb8bed08-cb47-42cb-a192-2545a14e4c4b" containerName="glance-httpd" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.668656 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.669924 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.671126 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.671448 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-lzhn7" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.671766 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.675935 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.683072 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.686178 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.686362 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.691194 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.691716 4962 scope.go:117] "RemoveContainer" containerID="884bc1c8181b4c262029886834ab81b7b4d9ec6c2cb240fa256637239957413f" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.710761 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.731763 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.733537 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.736821 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.737058 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.747654 4962 scope.go:117] "RemoveContainer" containerID="eadd6481e402308012bf3cb666163e04ad34b5f97c3780834cf912ecffa0bc84" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.756807 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.778919 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4f4a409a-4230-42ca-bfcc-f014064cbc6c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4f4a409a-4230-42ca-bfcc-f014064cbc6c\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.778975 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f4a409a-4230-42ca-bfcc-f014064cbc6c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4f4a409a-4230-42ca-bfcc-f014064cbc6c\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.779008 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ff5064c2-d8a3-41f3-8d14-8794be8126e1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ff5064c2-d8a3-41f3-8d14-8794be8126e1\") " pod="openstack/ceilometer-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.779031 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ff5064c2-d8a3-41f3-8d14-8794be8126e1-run-httpd\") pod \"ceilometer-0\" (UID: \"ff5064c2-d8a3-41f3-8d14-8794be8126e1\") " pod="openstack/ceilometer-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.779060 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ff5064c2-d8a3-41f3-8d14-8794be8126e1-log-httpd\") pod \"ceilometer-0\" (UID: \"ff5064c2-d8a3-41f3-8d14-8794be8126e1\") " pod="openstack/ceilometer-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.779101 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff5064c2-d8a3-41f3-8d14-8794be8126e1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ff5064c2-d8a3-41f3-8d14-8794be8126e1\") " pod="openstack/ceilometer-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.779118 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f4a409a-4230-42ca-bfcc-f014064cbc6c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"4f4a409a-4230-42ca-bfcc-f014064cbc6c\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.779147 4962 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrb2g\" (UniqueName: \"kubernetes.io/projected/4f4a409a-4230-42ca-bfcc-f014064cbc6c-kube-api-access-mrb2g\") pod \"glance-default-internal-api-0\" (UID: \"4f4a409a-4230-42ca-bfcc-f014064cbc6c\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.779168 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"4f4a409a-4230-42ca-bfcc-f014064cbc6c\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.779191 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f4a409a-4230-42ca-bfcc-f014064cbc6c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"4f4a409a-4230-42ca-bfcc-f014064cbc6c\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.779224 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff5064c2-d8a3-41f3-8d14-8794be8126e1-scripts\") pod \"ceilometer-0\" (UID: \"ff5064c2-d8a3-41f3-8d14-8794be8126e1\") " pod="openstack/ceilometer-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.779243 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f4a409a-4230-42ca-bfcc-f014064cbc6c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4f4a409a-4230-42ca-bfcc-f014064cbc6c\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.779258 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f4a409a-4230-42ca-bfcc-f014064cbc6c-logs\") pod \"glance-default-internal-api-0\" (UID: \"4f4a409a-4230-42ca-bfcc-f014064cbc6c\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.779275 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtlzc\" (UniqueName: \"kubernetes.io/projected/ff5064c2-d8a3-41f3-8d14-8794be8126e1-kube-api-access-xtlzc\") pod \"ceilometer-0\" (UID: \"ff5064c2-d8a3-41f3-8d14-8794be8126e1\") " pod="openstack/ceilometer-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.779298 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff5064c2-d8a3-41f3-8d14-8794be8126e1-config-data\") pod \"ceilometer-0\" (UID: \"ff5064c2-d8a3-41f3-8d14-8794be8126e1\") " pod="openstack/ceilometer-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.881986 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ff5064c2-d8a3-41f3-8d14-8794be8126e1-log-httpd\") pod \"ceilometer-0\" (UID: \"ff5064c2-d8a3-41f3-8d14-8794be8126e1\") " pod="openstack/ceilometer-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.882078 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba9a9d46-9ba9-428c-8864-a8db8bca2b57-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ba9a9d46-9ba9-428c-8864-a8db8bca2b57\") " pod="openstack/glance-default-external-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.882138 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff5064c2-d8a3-41f3-8d14-8794be8126e1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ff5064c2-d8a3-41f3-8d14-8794be8126e1\") " pod="openstack/ceilometer-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.882160 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f4a409a-4230-42ca-bfcc-f014064cbc6c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"4f4a409a-4230-42ca-bfcc-f014064cbc6c\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.882188 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrb2g\" (UniqueName: \"kubernetes.io/projected/4f4a409a-4230-42ca-bfcc-f014064cbc6c-kube-api-access-mrb2g\") pod \"glance-default-internal-api-0\" (UID: \"4f4a409a-4230-42ca-bfcc-f014064cbc6c\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.882212 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"4f4a409a-4230-42ca-bfcc-f014064cbc6c\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.882804 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"ba9a9d46-9ba9-428c-8864-a8db8bca2b57\") " pod="openstack/glance-default-external-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.882830 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqv6c\" (UniqueName: \"kubernetes.io/projected/ba9a9d46-9ba9-428c-8864-a8db8bca2b57-kube-api-access-vqv6c\") pod \"glance-default-external-api-0\" (UID: \"ba9a9d46-9ba9-428c-8864-a8db8bca2b57\") " pod="openstack/glance-default-external-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.882850 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f4a409a-4230-42ca-bfcc-f014064cbc6c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"4f4a409a-4230-42ca-bfcc-f014064cbc6c\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.882883 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba9a9d46-9ba9-428c-8864-a8db8bca2b57-config-data\") pod \"glance-default-external-api-0\" (UID: \"ba9a9d46-9ba9-428c-8864-a8db8bca2b57\") " pod="openstack/glance-default-external-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.882905 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/ff5064c2-d8a3-41f3-8d14-8794be8126e1-scripts\") pod \"ceilometer-0\" (UID: \"ff5064c2-d8a3-41f3-8d14-8794be8126e1\") " pod="openstack/ceilometer-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.882920 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f4a409a-4230-42ca-bfcc-f014064cbc6c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4f4a409a-4230-42ca-bfcc-f014064cbc6c\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.882937 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f4a409a-4230-42ca-bfcc-f014064cbc6c-logs\") pod \"glance-default-internal-api-0\" (UID: \"4f4a409a-4230-42ca-bfcc-f014064cbc6c\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.882954 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtlzc\" (UniqueName: \"kubernetes.io/projected/ff5064c2-d8a3-41f3-8d14-8794be8126e1-kube-api-access-xtlzc\") pod \"ceilometer-0\" (UID: \"ff5064c2-d8a3-41f3-8d14-8794be8126e1\") " pod="openstack/ceilometer-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.882978 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba9a9d46-9ba9-428c-8864-a8db8bca2b57-scripts\") pod \"glance-default-external-api-0\" (UID: \"ba9a9d46-9ba9-428c-8864-a8db8bca2b57\") " pod="openstack/glance-default-external-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.882996 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff5064c2-d8a3-41f3-8d14-8794be8126e1-config-data\") pod \"ceilometer-0\" (UID: \"ff5064c2-d8a3-41f3-8d14-8794be8126e1\") " pod="openstack/ceilometer-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.883022 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba9a9d46-9ba9-428c-8864-a8db8bca2b57-logs\") pod \"glance-default-external-api-0\" (UID: \"ba9a9d46-9ba9-428c-8864-a8db8bca2b57\") " pod="openstack/glance-default-external-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.883058 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ba9a9d46-9ba9-428c-8864-a8db8bca2b57-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ba9a9d46-9ba9-428c-8864-a8db8bca2b57\") " pod="openstack/glance-default-external-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.883073 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba9a9d46-9ba9-428c-8864-a8db8bca2b57-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ba9a9d46-9ba9-428c-8864-a8db8bca2b57\") " pod="openstack/glance-default-external-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.883101 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4f4a409a-4230-42ca-bfcc-f014064cbc6c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: 
\"4f4a409a-4230-42ca-bfcc-f014064cbc6c\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.883123 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f4a409a-4230-42ca-bfcc-f014064cbc6c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4f4a409a-4230-42ca-bfcc-f014064cbc6c\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.883151 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ff5064c2-d8a3-41f3-8d14-8794be8126e1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ff5064c2-d8a3-41f3-8d14-8794be8126e1\") " pod="openstack/ceilometer-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.883175 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ff5064c2-d8a3-41f3-8d14-8794be8126e1-run-httpd\") pod \"ceilometer-0\" (UID: \"ff5064c2-d8a3-41f3-8d14-8794be8126e1\") " pod="openstack/ceilometer-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.883702 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ff5064c2-d8a3-41f3-8d14-8794be8126e1-run-httpd\") pod \"ceilometer-0\" (UID: \"ff5064c2-d8a3-41f3-8d14-8794be8126e1\") " pod="openstack/ceilometer-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.882762 4962 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"4f4a409a-4230-42ca-bfcc-f014064cbc6c\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-internal-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.884349 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ff5064c2-d8a3-41f3-8d14-8794be8126e1-log-httpd\") pod \"ceilometer-0\" (UID: \"ff5064c2-d8a3-41f3-8d14-8794be8126e1\") " pod="openstack/ceilometer-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.885011 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f4a409a-4230-42ca-bfcc-f014064cbc6c-logs\") pod \"glance-default-internal-api-0\" (UID: \"4f4a409a-4230-42ca-bfcc-f014064cbc6c\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.885259 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4f4a409a-4230-42ca-bfcc-f014064cbc6c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4f4a409a-4230-42ca-bfcc-f014064cbc6c\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.889424 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f4a409a-4230-42ca-bfcc-f014064cbc6c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"4f4a409a-4230-42ca-bfcc-f014064cbc6c\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.889798 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ff5064c2-d8a3-41f3-8d14-8794be8126e1-config-data\") pod \"ceilometer-0\" (UID: \"ff5064c2-d8a3-41f3-8d14-8794be8126e1\") " pod="openstack/ceilometer-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.890008 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f4a409a-4230-42ca-bfcc-f014064cbc6c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4f4a409a-4230-42ca-bfcc-f014064cbc6c\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.890179 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff5064c2-d8a3-41f3-8d14-8794be8126e1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ff5064c2-d8a3-41f3-8d14-8794be8126e1\") " pod="openstack/ceilometer-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.892395 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff5064c2-d8a3-41f3-8d14-8794be8126e1-scripts\") pod \"ceilometer-0\" (UID: \"ff5064c2-d8a3-41f3-8d14-8794be8126e1\") " pod="openstack/ceilometer-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.899942 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f4a409a-4230-42ca-bfcc-f014064cbc6c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"4f4a409a-4230-42ca-bfcc-f014064cbc6c\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.904588 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtlzc\" (UniqueName: \"kubernetes.io/projected/ff5064c2-d8a3-41f3-8d14-8794be8126e1-kube-api-access-xtlzc\") pod \"ceilometer-0\" (UID: \"ff5064c2-d8a3-41f3-8d14-8794be8126e1\") " pod="openstack/ceilometer-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.910094 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ff5064c2-d8a3-41f3-8d14-8794be8126e1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ff5064c2-d8a3-41f3-8d14-8794be8126e1\") " pod="openstack/ceilometer-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.911338 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f4a409a-4230-42ca-bfcc-f014064cbc6c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4f4a409a-4230-42ca-bfcc-f014064cbc6c\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.914996 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrb2g\" (UniqueName: \"kubernetes.io/projected/4f4a409a-4230-42ca-bfcc-f014064cbc6c-kube-api-access-mrb2g\") pod \"glance-default-internal-api-0\" (UID: \"4f4a409a-4230-42ca-bfcc-f014064cbc6c\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.923583 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"4f4a409a-4230-42ca-bfcc-f014064cbc6c\") " pod="openstack/glance-default-internal-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.985658 4962 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba9a9d46-9ba9-428c-8864-a8db8bca2b57-scripts\") pod \"glance-default-external-api-0\" (UID: \"ba9a9d46-9ba9-428c-8864-a8db8bca2b57\") " pod="openstack/glance-default-external-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.985735 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba9a9d46-9ba9-428c-8864-a8db8bca2b57-logs\") pod \"glance-default-external-api-0\" (UID: \"ba9a9d46-9ba9-428c-8864-a8db8bca2b57\") " pod="openstack/glance-default-external-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.985770 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ba9a9d46-9ba9-428c-8864-a8db8bca2b57-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ba9a9d46-9ba9-428c-8864-a8db8bca2b57\") " pod="openstack/glance-default-external-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.985791 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba9a9d46-9ba9-428c-8864-a8db8bca2b57-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ba9a9d46-9ba9-428c-8864-a8db8bca2b57\") " pod="openstack/glance-default-external-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.985857 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba9a9d46-9ba9-428c-8864-a8db8bca2b57-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ba9a9d46-9ba9-428c-8864-a8db8bca2b57\") " pod="openstack/glance-default-external-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.985913 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"ba9a9d46-9ba9-428c-8864-a8db8bca2b57\") " pod="openstack/glance-default-external-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.985934 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqv6c\" (UniqueName: \"kubernetes.io/projected/ba9a9d46-9ba9-428c-8864-a8db8bca2b57-kube-api-access-vqv6c\") pod \"glance-default-external-api-0\" (UID: \"ba9a9d46-9ba9-428c-8864-a8db8bca2b57\") " pod="openstack/glance-default-external-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.985965 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba9a9d46-9ba9-428c-8864-a8db8bca2b57-config-data\") pod \"glance-default-external-api-0\" (UID: \"ba9a9d46-9ba9-428c-8864-a8db8bca2b57\") " pod="openstack/glance-default-external-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.987568 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ba9a9d46-9ba9-428c-8864-a8db8bca2b57-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ba9a9d46-9ba9-428c-8864-a8db8bca2b57\") " pod="openstack/glance-default-external-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.988151 4962 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"ba9a9d46-9ba9-428c-8864-a8db8bca2b57\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-external-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.990434 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba9a9d46-9ba9-428c-8864-a8db8bca2b57-logs\") pod \"glance-default-external-api-0\" (UID: \"ba9a9d46-9ba9-428c-8864-a8db8bca2b57\") " pod="openstack/glance-default-external-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.990896 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba9a9d46-9ba9-428c-8864-a8db8bca2b57-scripts\") pod \"glance-default-external-api-0\" (UID: \"ba9a9d46-9ba9-428c-8864-a8db8bca2b57\") " pod="openstack/glance-default-external-api-0" Feb 20 10:15:32 crc kubenswrapper[4962]: I0220 10:15:32.994276 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba9a9d46-9ba9-428c-8864-a8db8bca2b57-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ba9a9d46-9ba9-428c-8864-a8db8bca2b57\") " pod="openstack/glance-default-external-api-0" Feb 20 10:15:33 crc kubenswrapper[4962]: I0220 10:15:33.001502 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 20 10:15:33 crc kubenswrapper[4962]: I0220 10:15:33.001661 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba9a9d46-9ba9-428c-8864-a8db8bca2b57-config-data\") pod \"glance-default-external-api-0\" (UID: \"ba9a9d46-9ba9-428c-8864-a8db8bca2b57\") " pod="openstack/glance-default-external-api-0" Feb 20 10:15:33 crc kubenswrapper[4962]: I0220 10:15:33.002228 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba9a9d46-9ba9-428c-8864-a8db8bca2b57-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ba9a9d46-9ba9-428c-8864-a8db8bca2b57\") " pod="openstack/glance-default-external-api-0" Feb 20 10:15:33 crc kubenswrapper[4962]: I0220 10:15:33.009221 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 20 10:15:33 crc kubenswrapper[4962]: I0220 10:15:33.011249 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqv6c\" (UniqueName: \"kubernetes.io/projected/ba9a9d46-9ba9-428c-8864-a8db8bca2b57-kube-api-access-vqv6c\") pod \"glance-default-external-api-0\" (UID: \"ba9a9d46-9ba9-428c-8864-a8db8bca2b57\") " pod="openstack/glance-default-external-api-0" Feb 20 10:15:33 crc kubenswrapper[4962]: I0220 10:15:33.029785 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"ba9a9d46-9ba9-428c-8864-a8db8bca2b57\") " pod="openstack/glance-default-external-api-0" Feb 20 10:15:33 crc kubenswrapper[4962]: I0220 10:15:33.056110 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 20 10:15:33 crc kubenswrapper[4962]: I0220 10:15:33.162123 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18d02cdb-5de5-457e-9f17-1cc3ba51ca55" path="/var/lib/kubelet/pods/18d02cdb-5de5-457e-9f17-1cc3ba51ca55/volumes" Feb 20 10:15:33 crc kubenswrapper[4962]: I0220 10:15:33.163463 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d49a0b0-18e1-4701-9a94-5ff22700ffdf" path="/var/lib/kubelet/pods/5d49a0b0-18e1-4701-9a94-5ff22700ffdf/volumes" Feb 20 10:15:33 crc kubenswrapper[4962]: I0220 10:15:33.165756 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb8bed08-cb47-42cb-a192-2545a14e4c4b" path="/var/lib/kubelet/pods/eb8bed08-cb47-42cb-a192-2545a14e4c4b/volumes" Feb 20 10:15:33 crc kubenswrapper[4962]: I0220 10:15:33.562651 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5b685f5b9-4db6w" Feb 20 10:15:33 crc kubenswrapper[4962]: I0220 10:15:33.564816 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5b685f5b9-4db6w" Feb 20 10:15:33 crc kubenswrapper[4962]: I0220 10:15:33.696112 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 20 10:15:33 crc kubenswrapper[4962]: I0220 10:15:33.716581 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 20 10:15:33 crc kubenswrapper[4962]: I0220 10:15:33.843771 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 10:15:34 crc kubenswrapper[4962]: I0220 10:15:34.529282 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ba9a9d46-9ba9-428c-8864-a8db8bca2b57","Type":"ContainerStarted","Data":"a7c3f6bf061e2f58df1199abfaabc0fa7edc0079e61af3f51614ef7b77cc0b31"} Feb 20 10:15:34 crc kubenswrapper[4962]: I0220 10:15:34.550606 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ff5064c2-d8a3-41f3-8d14-8794be8126e1","Type":"ContainerStarted","Data":"1e1656ae6b0557accdedb4cfac1e40afd4babce4c2833536fa7a9a7b69032e75"} Feb 20 10:15:34 crc kubenswrapper[4962]: I0220 10:15:34.550674 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ff5064c2-d8a3-41f3-8d14-8794be8126e1","Type":"ContainerStarted","Data":"56a6ee57a73de30149034b6e88679c3be01baa1f232bfba996e0533713b1689a"} Feb 20 10:15:34 crc kubenswrapper[4962]: I0220 10:15:34.555576 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4f4a409a-4230-42ca-bfcc-f014064cbc6c","Type":"ContainerStarted","Data":"c5f1f67dc9e07d9eeb3bb7bd374b8b7f7c3676f58bea6766635a8f614df5e26b"} Feb 20 10:15:34 crc kubenswrapper[4962]: I0220 10:15:34.555650 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4f4a409a-4230-42ca-bfcc-f014064cbc6c","Type":"ContainerStarted","Data":"256cfc6edb7fdfbe31dd4d739c6bcf21323de33dda20f71407beaea0eb6fd7bc"} Feb 20 10:15:34 crc kubenswrapper[4962]: I0220 10:15:34.668302 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 20 10:15:35 crc kubenswrapper[4962]: I0220 10:15:35.567205 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"4f4a409a-4230-42ca-bfcc-f014064cbc6c","Type":"ContainerStarted","Data":"f92044b60ad417db828d85a5c41a02658d594ecad6f7c6c0f3f8b1bce358c93f"} Feb 20 10:15:35 crc kubenswrapper[4962]: I0220 10:15:35.569676 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ba9a9d46-9ba9-428c-8864-a8db8bca2b57","Type":"ContainerStarted","Data":"fd4f315997ddf00a356a9ec5e5c2864b8fa25408200a3b8ba03172b2cebc87ed"} Feb 20 10:15:35 crc kubenswrapper[4962]: I0220 10:15:35.569732 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ba9a9d46-9ba9-428c-8864-a8db8bca2b57","Type":"ContainerStarted","Data":"2063db6c0681c99c5af22bd280759565fe6f153460080e6e822a7af9e9e7ff12"} Feb 20 10:15:35 crc kubenswrapper[4962]: I0220 10:15:35.571931 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ff5064c2-d8a3-41f3-8d14-8794be8126e1","Type":"ContainerStarted","Data":"a7b748abf637bc1f2780d0b7363daee9f788e8ac71989a607a47ef823931d7e9"} Feb 20 10:15:35 crc kubenswrapper[4962]: I0220 10:15:35.597373 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.5973470499999998 podStartE2EDuration="3.59734705s" podCreationTimestamp="2026-02-20 10:15:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:15:35.595147221 +0000 UTC m=+1227.177619067" watchObservedRunningTime="2026-02-20 10:15:35.59734705 +0000 UTC m=+1227.179818896" Feb 20 10:15:35 crc kubenswrapper[4962]: I0220 10:15:35.623276 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.623249755 podStartE2EDuration="3.623249755s" podCreationTimestamp="2026-02-20 10:15:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:15:35.620934682 +0000 UTC m=+1227.203406528" watchObservedRunningTime="2026-02-20 10:15:35.623249755 +0000 UTC m=+1227.205721601" Feb 20 10:15:36 crc kubenswrapper[4962]: I0220 10:15:36.161173 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5dfd6b5f7f-dkfsl" Feb 20 10:15:36 crc kubenswrapper[4962]: I0220 10:15:36.281666 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-ffdf447d4-qtmvr"] Feb 20 10:15:36 crc kubenswrapper[4962]: I0220 10:15:36.281972 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-ffdf447d4-qtmvr" podUID="0d2f2cbd-e871-48b8-acf1-b84c9c2abb58" containerName="neutron-api" containerID="cri-o://859f954e19a5beb8c7855b8aa76e3e17aaf6038a1b74a860dcfada7a5fc974ac" gracePeriod=30 Feb 20 10:15:36 crc kubenswrapper[4962]: I0220 10:15:36.282065 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-ffdf447d4-qtmvr" podUID="0d2f2cbd-e871-48b8-acf1-b84c9c2abb58" containerName="neutron-httpd" containerID="cri-o://58eaf7b7530713a5b8cc056bcb353fb3ce93d8f1f16e50ed82bd930298b3a574" gracePeriod=30 Feb 20 10:15:36 crc kubenswrapper[4962]: I0220 10:15:36.583231 4962 generic.go:334] "Generic (PLEG): container finished" podID="0d2f2cbd-e871-48b8-acf1-b84c9c2abb58" containerID="58eaf7b7530713a5b8cc056bcb353fb3ce93d8f1f16e50ed82bd930298b3a574" exitCode=0 Feb 20 10:15:36 crc 
kubenswrapper[4962]: I0220 10:15:36.583296 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-ffdf447d4-qtmvr" event={"ID":"0d2f2cbd-e871-48b8-acf1-b84c9c2abb58","Type":"ContainerDied","Data":"58eaf7b7530713a5b8cc056bcb353fb3ce93d8f1f16e50ed82bd930298b3a574"} Feb 20 10:15:36 crc kubenswrapper[4962]: I0220 10:15:36.585551 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ff5064c2-d8a3-41f3-8d14-8794be8126e1","Type":"ContainerStarted","Data":"13bad3767d50b4ae75c546d230a3426b48d37aaeebacd98a7984142e2dd61e57"} Feb 20 10:15:40 crc kubenswrapper[4962]: I0220 10:15:40.633511 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ff5064c2-d8a3-41f3-8d14-8794be8126e1","Type":"ContainerStarted","Data":"fea4e13e2936d66c08408a152d91847a310438d6f9a039655822220e95cd7d70"} Feb 20 10:15:40 crc kubenswrapper[4962]: I0220 10:15:40.634312 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ff5064c2-d8a3-41f3-8d14-8794be8126e1" containerName="sg-core" containerID="cri-o://13bad3767d50b4ae75c546d230a3426b48d37aaeebacd98a7984142e2dd61e57" gracePeriod=30 Feb 20 10:15:40 crc kubenswrapper[4962]: I0220 10:15:40.634436 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 20 10:15:40 crc kubenswrapper[4962]: I0220 10:15:40.634117 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ff5064c2-d8a3-41f3-8d14-8794be8126e1" containerName="ceilometer-central-agent" containerID="cri-o://1e1656ae6b0557accdedb4cfac1e40afd4babce4c2833536fa7a9a7b69032e75" gracePeriod=30 Feb 20 10:15:40 crc kubenswrapper[4962]: I0220 10:15:40.634254 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ff5064c2-d8a3-41f3-8d14-8794be8126e1" containerName="proxy-httpd" containerID="cri-o://fea4e13e2936d66c08408a152d91847a310438d6f9a039655822220e95cd7d70" gracePeriod=30 Feb 20 10:15:40 crc kubenswrapper[4962]: I0220 10:15:40.634347 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ff5064c2-d8a3-41f3-8d14-8794be8126e1" containerName="ceilometer-notification-agent" containerID="cri-o://a7b748abf637bc1f2780d0b7363daee9f788e8ac71989a607a47ef823931d7e9" gracePeriod=30 Feb 20 10:15:40 crc kubenswrapper[4962]: I0220 10:15:40.698691 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.94729775 podStartE2EDuration="8.698658828s" podCreationTimestamp="2026-02-20 10:15:32 +0000 UTC" firstStartedPulling="2026-02-20 10:15:33.724132722 +0000 UTC m=+1225.306604568" lastFinishedPulling="2026-02-20 10:15:39.4754938 +0000 UTC m=+1231.057965646" observedRunningTime="2026-02-20 10:15:40.682701716 +0000 UTC m=+1232.265173592" watchObservedRunningTime="2026-02-20 10:15:40.698658828 +0000 UTC m=+1232.281130684" Feb 20 10:15:41 crc kubenswrapper[4962]: I0220 10:15:41.494981 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-ffdf447d4-qtmvr" Feb 20 10:15:41 crc kubenswrapper[4962]: I0220 10:15:41.627715 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d2f2cbd-e871-48b8-acf1-b84c9c2abb58-ovndb-tls-certs\") pod \"0d2f2cbd-e871-48b8-acf1-b84c9c2abb58\" (UID: \"0d2f2cbd-e871-48b8-acf1-b84c9c2abb58\") " Feb 20 10:15:41 crc kubenswrapper[4962]: I0220 10:15:41.627935 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d2f2cbd-e871-48b8-acf1-b84c9c2abb58-combined-ca-bundle\") pod \"0d2f2cbd-e871-48b8-acf1-b84c9c2abb58\" (UID: \"0d2f2cbd-e871-48b8-acf1-b84c9c2abb58\") " Feb 20 10:15:41 crc kubenswrapper[4962]: I0220 10:15:41.628130 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0d2f2cbd-e871-48b8-acf1-b84c9c2abb58-config\") pod \"0d2f2cbd-e871-48b8-acf1-b84c9c2abb58\" (UID: \"0d2f2cbd-e871-48b8-acf1-b84c9c2abb58\") " Feb 20 10:15:41 crc kubenswrapper[4962]: I0220 10:15:41.628254 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q54gm\" (UniqueName: \"kubernetes.io/projected/0d2f2cbd-e871-48b8-acf1-b84c9c2abb58-kube-api-access-q54gm\") pod \"0d2f2cbd-e871-48b8-acf1-b84c9c2abb58\" (UID: \"0d2f2cbd-e871-48b8-acf1-b84c9c2abb58\") " Feb 20 10:15:41 crc kubenswrapper[4962]: I0220 10:15:41.628296 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0d2f2cbd-e871-48b8-acf1-b84c9c2abb58-httpd-config\") pod \"0d2f2cbd-e871-48b8-acf1-b84c9c2abb58\" (UID: \"0d2f2cbd-e871-48b8-acf1-b84c9c2abb58\") " Feb 20 10:15:41 crc kubenswrapper[4962]: I0220 10:15:41.638731 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d2f2cbd-e871-48b8-acf1-b84c9c2abb58-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "0d2f2cbd-e871-48b8-acf1-b84c9c2abb58" (UID: "0d2f2cbd-e871-48b8-acf1-b84c9c2abb58"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:15:41 crc kubenswrapper[4962]: I0220 10:15:41.641776 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d2f2cbd-e871-48b8-acf1-b84c9c2abb58-kube-api-access-q54gm" (OuterVolumeSpecName: "kube-api-access-q54gm") pod "0d2f2cbd-e871-48b8-acf1-b84c9c2abb58" (UID: "0d2f2cbd-e871-48b8-acf1-b84c9c2abb58"). InnerVolumeSpecName "kube-api-access-q54gm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:15:41 crc kubenswrapper[4962]: I0220 10:15:41.651229 4962 generic.go:334] "Generic (PLEG): container finished" podID="ff5064c2-d8a3-41f3-8d14-8794be8126e1" containerID="fea4e13e2936d66c08408a152d91847a310438d6f9a039655822220e95cd7d70" exitCode=0 Feb 20 10:15:41 crc kubenswrapper[4962]: I0220 10:15:41.651274 4962 generic.go:334] "Generic (PLEG): container finished" podID="ff5064c2-d8a3-41f3-8d14-8794be8126e1" containerID="13bad3767d50b4ae75c546d230a3426b48d37aaeebacd98a7984142e2dd61e57" exitCode=2 Feb 20 10:15:41 crc kubenswrapper[4962]: I0220 10:15:41.651291 4962 generic.go:334] "Generic (PLEG): container finished" podID="ff5064c2-d8a3-41f3-8d14-8794be8126e1" containerID="a7b748abf637bc1f2780d0b7363daee9f788e8ac71989a607a47ef823931d7e9" exitCode=0 Feb 20 10:15:41 crc kubenswrapper[4962]: I0220 10:15:41.651317 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ff5064c2-d8a3-41f3-8d14-8794be8126e1","Type":"ContainerDied","Data":"fea4e13e2936d66c08408a152d91847a310438d6f9a039655822220e95cd7d70"} Feb 20 10:15:41 crc kubenswrapper[4962]: I0220 10:15:41.651428 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ff5064c2-d8a3-41f3-8d14-8794be8126e1","Type":"ContainerDied","Data":"13bad3767d50b4ae75c546d230a3426b48d37aaeebacd98a7984142e2dd61e57"} Feb 20 10:15:41 crc kubenswrapper[4962]: I0220 10:15:41.651466 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ff5064c2-d8a3-41f3-8d14-8794be8126e1","Type":"ContainerDied","Data":"a7b748abf637bc1f2780d0b7363daee9f788e8ac71989a607a47ef823931d7e9"} Feb 20 10:15:41 crc kubenswrapper[4962]: I0220 10:15:41.654110 4962 generic.go:334] "Generic (PLEG): container finished" podID="0d2f2cbd-e871-48b8-acf1-b84c9c2abb58" containerID="859f954e19a5beb8c7855b8aa76e3e17aaf6038a1b74a860dcfada7a5fc974ac" exitCode=0 Feb 20 10:15:41 crc kubenswrapper[4962]: I0220 10:15:41.654163 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-ffdf447d4-qtmvr" event={"ID":"0d2f2cbd-e871-48b8-acf1-b84c9c2abb58","Type":"ContainerDied","Data":"859f954e19a5beb8c7855b8aa76e3e17aaf6038a1b74a860dcfada7a5fc974ac"} Feb 20 10:15:41 crc kubenswrapper[4962]: I0220 10:15:41.654198 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-ffdf447d4-qtmvr" event={"ID":"0d2f2cbd-e871-48b8-acf1-b84c9c2abb58","Type":"ContainerDied","Data":"81c44508e7e551a0ec9263f4a7d0314158cbc47cdfa61ceff1466d2aef98334e"} Feb 20 10:15:41 crc kubenswrapper[4962]: I0220 10:15:41.654165 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-ffdf447d4-qtmvr" Feb 20 10:15:41 crc kubenswrapper[4962]: I0220 10:15:41.654268 4962 scope.go:117] "RemoveContainer" containerID="58eaf7b7530713a5b8cc056bcb353fb3ce93d8f1f16e50ed82bd930298b3a574" Feb 20 10:15:41 crc kubenswrapper[4962]: I0220 10:15:41.689782 4962 scope.go:117] "RemoveContainer" containerID="859f954e19a5beb8c7855b8aa76e3e17aaf6038a1b74a860dcfada7a5fc974ac" Feb 20 10:15:41 crc kubenswrapper[4962]: I0220 10:15:41.701011 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d2f2cbd-e871-48b8-acf1-b84c9c2abb58-config" (OuterVolumeSpecName: "config") pod "0d2f2cbd-e871-48b8-acf1-b84c9c2abb58" (UID: "0d2f2cbd-e871-48b8-acf1-b84c9c2abb58"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:15:41 crc kubenswrapper[4962]: I0220 10:15:41.702583 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d2f2cbd-e871-48b8-acf1-b84c9c2abb58-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0d2f2cbd-e871-48b8-acf1-b84c9c2abb58" (UID: "0d2f2cbd-e871-48b8-acf1-b84c9c2abb58"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:15:41 crc kubenswrapper[4962]: I0220 10:15:41.727682 4962 scope.go:117] "RemoveContainer" containerID="58eaf7b7530713a5b8cc056bcb353fb3ce93d8f1f16e50ed82bd930298b3a574" Feb 20 10:15:41 crc kubenswrapper[4962]: E0220 10:15:41.728584 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58eaf7b7530713a5b8cc056bcb353fb3ce93d8f1f16e50ed82bd930298b3a574\": container with ID starting with 58eaf7b7530713a5b8cc056bcb353fb3ce93d8f1f16e50ed82bd930298b3a574 not found: ID does not exist" containerID="58eaf7b7530713a5b8cc056bcb353fb3ce93d8f1f16e50ed82bd930298b3a574" Feb 20 10:15:41 crc kubenswrapper[4962]: I0220 10:15:41.728802 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58eaf7b7530713a5b8cc056bcb353fb3ce93d8f1f16e50ed82bd930298b3a574"} err="failed to get container status \"58eaf7b7530713a5b8cc056bcb353fb3ce93d8f1f16e50ed82bd930298b3a574\": rpc error: code = NotFound desc = could not find container \"58eaf7b7530713a5b8cc056bcb353fb3ce93d8f1f16e50ed82bd930298b3a574\": container with ID starting with 58eaf7b7530713a5b8cc056bcb353fb3ce93d8f1f16e50ed82bd930298b3a574 not found: ID does not exist" Feb 20 10:15:41 crc kubenswrapper[4962]: I0220 10:15:41.728926 4962 scope.go:117] "RemoveContainer" containerID="859f954e19a5beb8c7855b8aa76e3e17aaf6038a1b74a860dcfada7a5fc974ac" Feb 20 10:15:41 crc kubenswrapper[4962]: E0220 10:15:41.731082 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"859f954e19a5beb8c7855b8aa76e3e17aaf6038a1b74a860dcfada7a5fc974ac\": container with ID starting with 859f954e19a5beb8c7855b8aa76e3e17aaf6038a1b74a860dcfada7a5fc974ac not found: ID does not exist" containerID="859f954e19a5beb8c7855b8aa76e3e17aaf6038a1b74a860dcfada7a5fc974ac" Feb 20 10:15:41 crc kubenswrapper[4962]: I0220 10:15:41.731147 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"859f954e19a5beb8c7855b8aa76e3e17aaf6038a1b74a860dcfada7a5fc974ac"} err="failed to get container status \"859f954e19a5beb8c7855b8aa76e3e17aaf6038a1b74a860dcfada7a5fc974ac\": rpc error: code = NotFound desc = could not find container \"859f954e19a5beb8c7855b8aa76e3e17aaf6038a1b74a860dcfada7a5fc974ac\": container with ID starting with 859f954e19a5beb8c7855b8aa76e3e17aaf6038a1b74a860dcfada7a5fc974ac not found: ID does not exist" Feb 20 10:15:41 crc kubenswrapper[4962]: I0220 10:15:41.731718 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d2f2cbd-e871-48b8-acf1-b84c9c2abb58-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:41 crc kubenswrapper[4962]: I0220 10:15:41.731761 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/0d2f2cbd-e871-48b8-acf1-b84c9c2abb58-config\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:41 crc kubenswrapper[4962]: I0220 
10:15:41.731776 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q54gm\" (UniqueName: \"kubernetes.io/projected/0d2f2cbd-e871-48b8-acf1-b84c9c2abb58-kube-api-access-q54gm\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:41 crc kubenswrapper[4962]: I0220 10:15:41.731791 4962 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0d2f2cbd-e871-48b8-acf1-b84c9c2abb58-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:41 crc kubenswrapper[4962]: I0220 10:15:41.756658 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d2f2cbd-e871-48b8-acf1-b84c9c2abb58-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "0d2f2cbd-e871-48b8-acf1-b84c9c2abb58" (UID: "0d2f2cbd-e871-48b8-acf1-b84c9c2abb58"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:15:41 crc kubenswrapper[4962]: I0220 10:15:41.833486 4962 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d2f2cbd-e871-48b8-acf1-b84c9c2abb58-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.013841 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-ffdf447d4-qtmvr"] Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.027164 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-ffdf447d4-qtmvr"] Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.446410 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.451720 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ff5064c2-d8a3-41f3-8d14-8794be8126e1-log-httpd\") pod \"ff5064c2-d8a3-41f3-8d14-8794be8126e1\" (UID: \"ff5064c2-d8a3-41f3-8d14-8794be8126e1\") " Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.451902 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ff5064c2-d8a3-41f3-8d14-8794be8126e1-sg-core-conf-yaml\") pod \"ff5064c2-d8a3-41f3-8d14-8794be8126e1\" (UID: \"ff5064c2-d8a3-41f3-8d14-8794be8126e1\") " Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.451984 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff5064c2-d8a3-41f3-8d14-8794be8126e1-config-data\") pod \"ff5064c2-d8a3-41f3-8d14-8794be8126e1\" (UID: \"ff5064c2-d8a3-41f3-8d14-8794be8126e1\") " Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.452025 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff5064c2-d8a3-41f3-8d14-8794be8126e1-combined-ca-bundle\") pod \"ff5064c2-d8a3-41f3-8d14-8794be8126e1\" (UID: \"ff5064c2-d8a3-41f3-8d14-8794be8126e1\") " Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.452082 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff5064c2-d8a3-41f3-8d14-8794be8126e1-scripts\") pod \"ff5064c2-d8a3-41f3-8d14-8794be8126e1\" (UID: \"ff5064c2-d8a3-41f3-8d14-8794be8126e1\") " Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.452124 4962 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-xtlzc\" (UniqueName: \"kubernetes.io/projected/ff5064c2-d8a3-41f3-8d14-8794be8126e1-kube-api-access-xtlzc\") pod \"ff5064c2-d8a3-41f3-8d14-8794be8126e1\" (UID: \"ff5064c2-d8a3-41f3-8d14-8794be8126e1\") " Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.452212 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ff5064c2-d8a3-41f3-8d14-8794be8126e1-run-httpd\") pod \"ff5064c2-d8a3-41f3-8d14-8794be8126e1\" (UID: \"ff5064c2-d8a3-41f3-8d14-8794be8126e1\") " Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.452775 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff5064c2-d8a3-41f3-8d14-8794be8126e1-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ff5064c2-d8a3-41f3-8d14-8794be8126e1" (UID: "ff5064c2-d8a3-41f3-8d14-8794be8126e1"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.452948 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff5064c2-d8a3-41f3-8d14-8794be8126e1-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ff5064c2-d8a3-41f3-8d14-8794be8126e1" (UID: "ff5064c2-d8a3-41f3-8d14-8794be8126e1"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.458747 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff5064c2-d8a3-41f3-8d14-8794be8126e1-scripts" (OuterVolumeSpecName: "scripts") pod "ff5064c2-d8a3-41f3-8d14-8794be8126e1" (UID: "ff5064c2-d8a3-41f3-8d14-8794be8126e1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.461339 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff5064c2-d8a3-41f3-8d14-8794be8126e1-kube-api-access-xtlzc" (OuterVolumeSpecName: "kube-api-access-xtlzc") pod "ff5064c2-d8a3-41f3-8d14-8794be8126e1" (UID: "ff5064c2-d8a3-41f3-8d14-8794be8126e1"). InnerVolumeSpecName "kube-api-access-xtlzc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.530874 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff5064c2-d8a3-41f3-8d14-8794be8126e1-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ff5064c2-d8a3-41f3-8d14-8794be8126e1" (UID: "ff5064c2-d8a3-41f3-8d14-8794be8126e1"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.554699 4962 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ff5064c2-d8a3-41f3-8d14-8794be8126e1-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.554733 4962 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ff5064c2-d8a3-41f3-8d14-8794be8126e1-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.554746 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff5064c2-d8a3-41f3-8d14-8794be8126e1-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.554757 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xtlzc\" (UniqueName: \"kubernetes.io/projected/ff5064c2-d8a3-41f3-8d14-8794be8126e1-kube-api-access-xtlzc\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.554767 4962 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ff5064c2-d8a3-41f3-8d14-8794be8126e1-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.584530 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff5064c2-d8a3-41f3-8d14-8794be8126e1-config-data" (OuterVolumeSpecName: "config-data") pod "ff5064c2-d8a3-41f3-8d14-8794be8126e1" (UID: "ff5064c2-d8a3-41f3-8d14-8794be8126e1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.590536 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff5064c2-d8a3-41f3-8d14-8794be8126e1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ff5064c2-d8a3-41f3-8d14-8794be8126e1" (UID: "ff5064c2-d8a3-41f3-8d14-8794be8126e1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.656074 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff5064c2-d8a3-41f3-8d14-8794be8126e1-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.656114 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff5064c2-d8a3-41f3-8d14-8794be8126e1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.673204 4962 generic.go:334] "Generic (PLEG): container finished" podID="ff5064c2-d8a3-41f3-8d14-8794be8126e1" containerID="1e1656ae6b0557accdedb4cfac1e40afd4babce4c2833536fa7a9a7b69032e75" exitCode=0 Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.673259 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ff5064c2-d8a3-41f3-8d14-8794be8126e1","Type":"ContainerDied","Data":"1e1656ae6b0557accdedb4cfac1e40afd4babce4c2833536fa7a9a7b69032e75"} Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.673334 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ff5064c2-d8a3-41f3-8d14-8794be8126e1","Type":"ContainerDied","Data":"56a6ee57a73de30149034b6e88679c3be01baa1f232bfba996e0533713b1689a"} Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.673358 4962 scope.go:117] "RemoveContainer" containerID="fea4e13e2936d66c08408a152d91847a310438d6f9a039655822220e95cd7d70" Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.673423 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.693426 4962 scope.go:117] "RemoveContainer" containerID="13bad3767d50b4ae75c546d230a3426b48d37aaeebacd98a7984142e2dd61e57" Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.720259 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.727050 4962 scope.go:117] "RemoveContainer" containerID="a7b748abf637bc1f2780d0b7363daee9f788e8ac71989a607a47ef823931d7e9" Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.734660 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.753018 4962 scope.go:117] "RemoveContainer" containerID="1e1656ae6b0557accdedb4cfac1e40afd4babce4c2833536fa7a9a7b69032e75" Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.763761 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 20 10:15:42 crc kubenswrapper[4962]: E0220 10:15:42.764413 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff5064c2-d8a3-41f3-8d14-8794be8126e1" containerName="proxy-httpd" Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.764443 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff5064c2-d8a3-41f3-8d14-8794be8126e1" containerName="proxy-httpd" Feb 20 10:15:42 crc kubenswrapper[4962]: E0220 10:15:42.764463 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff5064c2-d8a3-41f3-8d14-8794be8126e1" containerName="sg-core" Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.764474 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff5064c2-d8a3-41f3-8d14-8794be8126e1" containerName="sg-core" Feb 20 10:15:42 crc 
kubenswrapper[4962]: E0220 10:15:42.764504 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d2f2cbd-e871-48b8-acf1-b84c9c2abb58" containerName="neutron-api" Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.764515 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d2f2cbd-e871-48b8-acf1-b84c9c2abb58" containerName="neutron-api" Feb 20 10:15:42 crc kubenswrapper[4962]: E0220 10:15:42.764526 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d2f2cbd-e871-48b8-acf1-b84c9c2abb58" containerName="neutron-httpd" Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.764538 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d2f2cbd-e871-48b8-acf1-b84c9c2abb58" containerName="neutron-httpd" Feb 20 10:15:42 crc kubenswrapper[4962]: E0220 10:15:42.764554 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff5064c2-d8a3-41f3-8d14-8794be8126e1" containerName="ceilometer-central-agent" Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.764564 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff5064c2-d8a3-41f3-8d14-8794be8126e1" containerName="ceilometer-central-agent" Feb 20 10:15:42 crc kubenswrapper[4962]: E0220 10:15:42.764582 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff5064c2-d8a3-41f3-8d14-8794be8126e1" containerName="ceilometer-notification-agent" Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.764607 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff5064c2-d8a3-41f3-8d14-8794be8126e1" containerName="ceilometer-notification-agent" Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.765912 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff5064c2-d8a3-41f3-8d14-8794be8126e1" containerName="ceilometer-central-agent" Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.765966 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff5064c2-d8a3-41f3-8d14-8794be8126e1" containerName="proxy-httpd" Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.765978 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff5064c2-d8a3-41f3-8d14-8794be8126e1" containerName="sg-core" Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.765993 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff5064c2-d8a3-41f3-8d14-8794be8126e1" containerName="ceilometer-notification-agent" Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.766009 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d2f2cbd-e871-48b8-acf1-b84c9c2abb58" containerName="neutron-httpd" Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.766025 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d2f2cbd-e871-48b8-acf1-b84c9c2abb58" containerName="neutron-api" Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.768459 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.771384 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.771473 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.778916 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.830467 4962 scope.go:117] "RemoveContainer" containerID="fea4e13e2936d66c08408a152d91847a310438d6f9a039655822220e95cd7d70" Feb 20 10:15:42 crc kubenswrapper[4962]: E0220 10:15:42.831230 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fea4e13e2936d66c08408a152d91847a310438d6f9a039655822220e95cd7d70\": container with ID starting with fea4e13e2936d66c08408a152d91847a310438d6f9a039655822220e95cd7d70 not found: ID does not exist" containerID="fea4e13e2936d66c08408a152d91847a310438d6f9a039655822220e95cd7d70" Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.831283 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fea4e13e2936d66c08408a152d91847a310438d6f9a039655822220e95cd7d70"} err="failed to get container status \"fea4e13e2936d66c08408a152d91847a310438d6f9a039655822220e95cd7d70\": rpc error: code = NotFound desc = could not find container \"fea4e13e2936d66c08408a152d91847a310438d6f9a039655822220e95cd7d70\": container with ID starting with fea4e13e2936d66c08408a152d91847a310438d6f9a039655822220e95cd7d70 not found: ID does not exist" Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.831321 4962 scope.go:117] "RemoveContainer" containerID="13bad3767d50b4ae75c546d230a3426b48d37aaeebacd98a7984142e2dd61e57" Feb 20 10:15:42 crc kubenswrapper[4962]: E0220 10:15:42.832200 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13bad3767d50b4ae75c546d230a3426b48d37aaeebacd98a7984142e2dd61e57\": container with ID starting with 13bad3767d50b4ae75c546d230a3426b48d37aaeebacd98a7984142e2dd61e57 not found: ID does not exist" containerID="13bad3767d50b4ae75c546d230a3426b48d37aaeebacd98a7984142e2dd61e57" Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.832224 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13bad3767d50b4ae75c546d230a3426b48d37aaeebacd98a7984142e2dd61e57"} err="failed to get container status \"13bad3767d50b4ae75c546d230a3426b48d37aaeebacd98a7984142e2dd61e57\": rpc error: code = NotFound desc = could not find container \"13bad3767d50b4ae75c546d230a3426b48d37aaeebacd98a7984142e2dd61e57\": container with ID starting with 13bad3767d50b4ae75c546d230a3426b48d37aaeebacd98a7984142e2dd61e57 not found: ID does not exist" Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.832240 4962 scope.go:117] "RemoveContainer" containerID="a7b748abf637bc1f2780d0b7363daee9f788e8ac71989a607a47ef823931d7e9" Feb 20 10:15:42 crc kubenswrapper[4962]: E0220 10:15:42.832967 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7b748abf637bc1f2780d0b7363daee9f788e8ac71989a607a47ef823931d7e9\": container with ID starting with a7b748abf637bc1f2780d0b7363daee9f788e8ac71989a607a47ef823931d7e9 not found: ID 
does not exist" containerID="a7b748abf637bc1f2780d0b7363daee9f788e8ac71989a607a47ef823931d7e9" Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.833007 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7b748abf637bc1f2780d0b7363daee9f788e8ac71989a607a47ef823931d7e9"} err="failed to get container status \"a7b748abf637bc1f2780d0b7363daee9f788e8ac71989a607a47ef823931d7e9\": rpc error: code = NotFound desc = could not find container \"a7b748abf637bc1f2780d0b7363daee9f788e8ac71989a607a47ef823931d7e9\": container with ID starting with a7b748abf637bc1f2780d0b7363daee9f788e8ac71989a607a47ef823931d7e9 not found: ID does not exist" Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.833037 4962 scope.go:117] "RemoveContainer" containerID="1e1656ae6b0557accdedb4cfac1e40afd4babce4c2833536fa7a9a7b69032e75" Feb 20 10:15:42 crc kubenswrapper[4962]: E0220 10:15:42.833366 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e1656ae6b0557accdedb4cfac1e40afd4babce4c2833536fa7a9a7b69032e75\": container with ID starting with 1e1656ae6b0557accdedb4cfac1e40afd4babce4c2833536fa7a9a7b69032e75 not found: ID does not exist" containerID="1e1656ae6b0557accdedb4cfac1e40afd4babce4c2833536fa7a9a7b69032e75" Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.833417 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e1656ae6b0557accdedb4cfac1e40afd4babce4c2833536fa7a9a7b69032e75"} err="failed to get container status \"1e1656ae6b0557accdedb4cfac1e40afd4babce4c2833536fa7a9a7b69032e75\": rpc error: code = NotFound desc = could not find container \"1e1656ae6b0557accdedb4cfac1e40afd4babce4c2833536fa7a9a7b69032e75\": container with ID starting with 1e1656ae6b0557accdedb4cfac1e40afd4babce4c2833536fa7a9a7b69032e75 not found: ID does not exist" Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.861174 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2263355d-2fa1-4b5a-bfc2-9f362df5739d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2263355d-2fa1-4b5a-bfc2-9f362df5739d\") " pod="openstack/ceilometer-0" Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.861298 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2263355d-2fa1-4b5a-bfc2-9f362df5739d-run-httpd\") pod \"ceilometer-0\" (UID: \"2263355d-2fa1-4b5a-bfc2-9f362df5739d\") " pod="openstack/ceilometer-0" Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.861345 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7px4x\" (UniqueName: \"kubernetes.io/projected/2263355d-2fa1-4b5a-bfc2-9f362df5739d-kube-api-access-7px4x\") pod \"ceilometer-0\" (UID: \"2263355d-2fa1-4b5a-bfc2-9f362df5739d\") " pod="openstack/ceilometer-0" Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.861424 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2263355d-2fa1-4b5a-bfc2-9f362df5739d-scripts\") pod \"ceilometer-0\" (UID: \"2263355d-2fa1-4b5a-bfc2-9f362df5739d\") " pod="openstack/ceilometer-0" Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.861480 4962 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2263355d-2fa1-4b5a-bfc2-9f362df5739d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2263355d-2fa1-4b5a-bfc2-9f362df5739d\") " pod="openstack/ceilometer-0" Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.861606 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2263355d-2fa1-4b5a-bfc2-9f362df5739d-config-data\") pod \"ceilometer-0\" (UID: \"2263355d-2fa1-4b5a-bfc2-9f362df5739d\") " pod="openstack/ceilometer-0" Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.861631 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2263355d-2fa1-4b5a-bfc2-9f362df5739d-log-httpd\") pod \"ceilometer-0\" (UID: \"2263355d-2fa1-4b5a-bfc2-9f362df5739d\") " pod="openstack/ceilometer-0" Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.963529 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2263355d-2fa1-4b5a-bfc2-9f362df5739d-config-data\") pod \"ceilometer-0\" (UID: \"2263355d-2fa1-4b5a-bfc2-9f362df5739d\") " pod="openstack/ceilometer-0" Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.964001 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2263355d-2fa1-4b5a-bfc2-9f362df5739d-log-httpd\") pod \"ceilometer-0\" (UID: \"2263355d-2fa1-4b5a-bfc2-9f362df5739d\") " pod="openstack/ceilometer-0" Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.964528 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2263355d-2fa1-4b5a-bfc2-9f362df5739d-log-httpd\") pod \"ceilometer-0\" (UID: \"2263355d-2fa1-4b5a-bfc2-9f362df5739d\") " pod="openstack/ceilometer-0" Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.964635 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2263355d-2fa1-4b5a-bfc2-9f362df5739d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2263355d-2fa1-4b5a-bfc2-9f362df5739d\") " pod="openstack/ceilometer-0" Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.965185 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2263355d-2fa1-4b5a-bfc2-9f362df5739d-run-httpd\") pod \"ceilometer-0\" (UID: \"2263355d-2fa1-4b5a-bfc2-9f362df5739d\") " pod="openstack/ceilometer-0" Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.965232 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7px4x\" (UniqueName: \"kubernetes.io/projected/2263355d-2fa1-4b5a-bfc2-9f362df5739d-kube-api-access-7px4x\") pod \"ceilometer-0\" (UID: \"2263355d-2fa1-4b5a-bfc2-9f362df5739d\") " pod="openstack/ceilometer-0" Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.965316 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2263355d-2fa1-4b5a-bfc2-9f362df5739d-scripts\") pod \"ceilometer-0\" (UID: \"2263355d-2fa1-4b5a-bfc2-9f362df5739d\") " pod="openstack/ceilometer-0" Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.965350 4962 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2263355d-2fa1-4b5a-bfc2-9f362df5739d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2263355d-2fa1-4b5a-bfc2-9f362df5739d\") " pod="openstack/ceilometer-0" Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.965680 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2263355d-2fa1-4b5a-bfc2-9f362df5739d-run-httpd\") pod \"ceilometer-0\" (UID: \"2263355d-2fa1-4b5a-bfc2-9f362df5739d\") " pod="openstack/ceilometer-0" Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.969111 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2263355d-2fa1-4b5a-bfc2-9f362df5739d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2263355d-2fa1-4b5a-bfc2-9f362df5739d\") " pod="openstack/ceilometer-0" Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.970070 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2263355d-2fa1-4b5a-bfc2-9f362df5739d-scripts\") pod \"ceilometer-0\" (UID: \"2263355d-2fa1-4b5a-bfc2-9f362df5739d\") " pod="openstack/ceilometer-0" Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.970974 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2263355d-2fa1-4b5a-bfc2-9f362df5739d-config-data\") pod \"ceilometer-0\" (UID: \"2263355d-2fa1-4b5a-bfc2-9f362df5739d\") " pod="openstack/ceilometer-0" Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.983778 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2263355d-2fa1-4b5a-bfc2-9f362df5739d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2263355d-2fa1-4b5a-bfc2-9f362df5739d\") " pod="openstack/ceilometer-0" Feb 20 10:15:42 crc kubenswrapper[4962]: I0220 10:15:42.988867 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7px4x\" (UniqueName: \"kubernetes.io/projected/2263355d-2fa1-4b5a-bfc2-9f362df5739d-kube-api-access-7px4x\") pod \"ceilometer-0\" (UID: \"2263355d-2fa1-4b5a-bfc2-9f362df5739d\") " pod="openstack/ceilometer-0" Feb 20 10:15:43 crc kubenswrapper[4962]: I0220 10:15:43.002615 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 20 10:15:43 crc kubenswrapper[4962]: I0220 10:15:43.002702 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 20 10:15:43 crc kubenswrapper[4962]: I0220 10:15:43.056952 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 20 10:15:43 crc kubenswrapper[4962]: I0220 10:15:43.057419 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 20 10:15:43 crc kubenswrapper[4962]: I0220 10:15:43.057476 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 20 10:15:43 crc kubenswrapper[4962]: I0220 10:15:43.068881 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 20 10:15:43 crc kubenswrapper[4962]: I0220 10:15:43.103783 4962 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 20 10:15:43 crc kubenswrapper[4962]: I0220 10:15:43.119332 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 20 10:15:43 crc kubenswrapper[4962]: I0220 10:15:43.124100 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 20 10:15:43 crc kubenswrapper[4962]: I0220 10:15:43.183629 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d2f2cbd-e871-48b8-acf1-b84c9c2abb58" path="/var/lib/kubelet/pods/0d2f2cbd-e871-48b8-acf1-b84c9c2abb58/volumes" Feb 20 10:15:43 crc kubenswrapper[4962]: I0220 10:15:43.184286 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff5064c2-d8a3-41f3-8d14-8794be8126e1" path="/var/lib/kubelet/pods/ff5064c2-d8a3-41f3-8d14-8794be8126e1/volumes" Feb 20 10:15:43 crc kubenswrapper[4962]: I0220 10:15:43.623057 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 20 10:15:43 crc kubenswrapper[4962]: I0220 10:15:43.688288 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2263355d-2fa1-4b5a-bfc2-9f362df5739d","Type":"ContainerStarted","Data":"a3529eb7b3b629ef3c3b91bbe1d433d262412a55d8bda6f23a58ec282b63369e"} Feb 20 10:15:43 crc kubenswrapper[4962]: I0220 10:15:43.688829 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 20 10:15:43 crc kubenswrapper[4962]: I0220 10:15:43.688873 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 20 10:15:43 crc kubenswrapper[4962]: I0220 10:15:43.688884 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 20 10:15:43 crc kubenswrapper[4962]: I0220 10:15:43.688894 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 20 10:15:44 crc kubenswrapper[4962]: I0220 10:15:44.698496 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2263355d-2fa1-4b5a-bfc2-9f362df5739d","Type":"ContainerStarted","Data":"28df8a32fe5a1bd334afa755bb83b0ac292979f42bd8a6975cdf978af2b8b6b7"} Feb 20 10:15:44 crc kubenswrapper[4962]: I0220 10:15:44.951087 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-687f4cff74-gmh4w" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.000975 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-687f4cff74-gmh4w" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.077938 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-755cb8b5f4-zlzbb"] Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.078582 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-755cb8b5f4-zlzbb" podUID="b1b02597-c246-43dc-bd85-bebc40c70abf" containerName="placement-log" containerID="cri-o://c4560e14774e3c9741c91f46ea630363e7cc5935a06c720a5d083bca786e716f" gracePeriod=30 Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.078787 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-755cb8b5f4-zlzbb" podUID="b1b02597-c246-43dc-bd85-bebc40c70abf" 
containerName="placement-api" containerID="cri-o://bfe2a2311075991b6e26f61913d5319a6a3da98a5127862535ec8779ac2e9fce" gracePeriod=30 Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.293721 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-9xxwl"] Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.321146 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-9xxwl" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.379777 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-9xxwl"] Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.383018 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-e96c-account-create-update-zd8bf"] Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.384178 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-e96c-account-create-update-zd8bf" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.389078 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.434260 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjtsg\" (UniqueName: \"kubernetes.io/projected/7da93993-8b14-45f6-8d0b-8366becc762e-kube-api-access-vjtsg\") pod \"nova-api-db-create-9xxwl\" (UID: \"7da93993-8b14-45f6-8d0b-8366becc762e\") " pod="openstack/nova-api-db-create-9xxwl" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.434353 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7da93993-8b14-45f6-8d0b-8366becc762e-operator-scripts\") pod \"nova-api-db-create-9xxwl\" (UID: \"7da93993-8b14-45f6-8d0b-8366becc762e\") " pod="openstack/nova-api-db-create-9xxwl" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.442663 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-e96c-account-create-update-zd8bf"] Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.470857 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-tbn8g"] Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.472833 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-tbn8g" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.514875 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-tbn8g"] Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.538109 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7da93993-8b14-45f6-8d0b-8366becc762e-operator-scripts\") pod \"nova-api-db-create-9xxwl\" (UID: \"7da93993-8b14-45f6-8d0b-8366becc762e\") " pod="openstack/nova-api-db-create-9xxwl" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.538182 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcjqp\" (UniqueName: \"kubernetes.io/projected/1f853840-0af1-40ee-b11b-a0a62f9f4ebf-kube-api-access-tcjqp\") pod \"nova-cell0-db-create-tbn8g\" (UID: \"1f853840-0af1-40ee-b11b-a0a62f9f4ebf\") " pod="openstack/nova-cell0-db-create-tbn8g" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.538210 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1f853840-0af1-40ee-b11b-a0a62f9f4ebf-operator-scripts\") pod \"nova-cell0-db-create-tbn8g\" (UID: \"1f853840-0af1-40ee-b11b-a0a62f9f4ebf\") " pod="openstack/nova-cell0-db-create-tbn8g" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.538312 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjtsg\" (UniqueName: \"kubernetes.io/projected/7da93993-8b14-45f6-8d0b-8366becc762e-kube-api-access-vjtsg\") pod \"nova-api-db-create-9xxwl\" (UID: \"7da93993-8b14-45f6-8d0b-8366becc762e\") " pod="openstack/nova-api-db-create-9xxwl" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.538354 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6njl\" (UniqueName: \"kubernetes.io/projected/84f50d98-6178-44d4-8ac4-43a8df4e3339-kube-api-access-n6njl\") pod \"nova-api-e96c-account-create-update-zd8bf\" (UID: \"84f50d98-6178-44d4-8ac4-43a8df4e3339\") " pod="openstack/nova-api-e96c-account-create-update-zd8bf" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.538408 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84f50d98-6178-44d4-8ac4-43a8df4e3339-operator-scripts\") pod \"nova-api-e96c-account-create-update-zd8bf\" (UID: \"84f50d98-6178-44d4-8ac4-43a8df4e3339\") " pod="openstack/nova-api-e96c-account-create-update-zd8bf" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.542135 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7da93993-8b14-45f6-8d0b-8366becc762e-operator-scripts\") pod \"nova-api-db-create-9xxwl\" (UID: \"7da93993-8b14-45f6-8d0b-8366becc762e\") " pod="openstack/nova-api-db-create-9xxwl" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.564181 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjtsg\" (UniqueName: \"kubernetes.io/projected/7da93993-8b14-45f6-8d0b-8366becc762e-kube-api-access-vjtsg\") pod \"nova-api-db-create-9xxwl\" (UID: \"7da93993-8b14-45f6-8d0b-8366becc762e\") " pod="openstack/nova-api-db-create-9xxwl" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.589855 4962 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-xnwmz"] Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.591578 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-xnwmz" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.615534 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-7729-account-create-update-dttxs"] Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.619675 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-7729-account-create-update-dttxs" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.629073 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-xnwmz"] Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.632886 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.645370 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6njl\" (UniqueName: \"kubernetes.io/projected/84f50d98-6178-44d4-8ac4-43a8df4e3339-kube-api-access-n6njl\") pod \"nova-api-e96c-account-create-update-zd8bf\" (UID: \"84f50d98-6178-44d4-8ac4-43a8df4e3339\") " pod="openstack/nova-api-e96c-account-create-update-zd8bf" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.645444 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84f50d98-6178-44d4-8ac4-43a8df4e3339-operator-scripts\") pod \"nova-api-e96c-account-create-update-zd8bf\" (UID: \"84f50d98-6178-44d4-8ac4-43a8df4e3339\") " pod="openstack/nova-api-e96c-account-create-update-zd8bf" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.645513 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcjqp\" (UniqueName: \"kubernetes.io/projected/1f853840-0af1-40ee-b11b-a0a62f9f4ebf-kube-api-access-tcjqp\") pod \"nova-cell0-db-create-tbn8g\" (UID: \"1f853840-0af1-40ee-b11b-a0a62f9f4ebf\") " pod="openstack/nova-cell0-db-create-tbn8g" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.645535 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1f853840-0af1-40ee-b11b-a0a62f9f4ebf-operator-scripts\") pod \"nova-cell0-db-create-tbn8g\" (UID: \"1f853840-0af1-40ee-b11b-a0a62f9f4ebf\") " pod="openstack/nova-cell0-db-create-tbn8g" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.646475 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84f50d98-6178-44d4-8ac4-43a8df4e3339-operator-scripts\") pod \"nova-api-e96c-account-create-update-zd8bf\" (UID: \"84f50d98-6178-44d4-8ac4-43a8df4e3339\") " pod="openstack/nova-api-e96c-account-create-update-zd8bf" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.646745 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1f853840-0af1-40ee-b11b-a0a62f9f4ebf-operator-scripts\") pod \"nova-cell0-db-create-tbn8g\" (UID: \"1f853840-0af1-40ee-b11b-a0a62f9f4ebf\") " pod="openstack/nova-cell0-db-create-tbn8g" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.656657 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/nova-cell0-7729-account-create-update-dttxs"] Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.667223 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6njl\" (UniqueName: \"kubernetes.io/projected/84f50d98-6178-44d4-8ac4-43a8df4e3339-kube-api-access-n6njl\") pod \"nova-api-e96c-account-create-update-zd8bf\" (UID: \"84f50d98-6178-44d4-8ac4-43a8df4e3339\") " pod="openstack/nova-api-e96c-account-create-update-zd8bf" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.668415 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcjqp\" (UniqueName: \"kubernetes.io/projected/1f853840-0af1-40ee-b11b-a0a62f9f4ebf-kube-api-access-tcjqp\") pod \"nova-cell0-db-create-tbn8g\" (UID: \"1f853840-0af1-40ee-b11b-a0a62f9f4ebf\") " pod="openstack/nova-cell0-db-create-tbn8g" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.717263 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-9xxwl" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.738024 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-e96c-account-create-update-zd8bf" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.747374 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gs8c5\" (UniqueName: \"kubernetes.io/projected/79394db3-1fa2-4b8f-927a-1cf8085f1df4-kube-api-access-gs8c5\") pod \"nova-cell1-db-create-xnwmz\" (UID: \"79394db3-1fa2-4b8f-927a-1cf8085f1df4\") " pod="openstack/nova-cell1-db-create-xnwmz" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.747437 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/85565888-6622-4dfc-9198-8e9c5b05cc75-operator-scripts\") pod \"nova-cell0-7729-account-create-update-dttxs\" (UID: \"85565888-6622-4dfc-9198-8e9c5b05cc75\") " pod="openstack/nova-cell0-7729-account-create-update-dttxs" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.747474 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/79394db3-1fa2-4b8f-927a-1cf8085f1df4-operator-scripts\") pod \"nova-cell1-db-create-xnwmz\" (UID: \"79394db3-1fa2-4b8f-927a-1cf8085f1df4\") " pod="openstack/nova-cell1-db-create-xnwmz" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.747525 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75wzt\" (UniqueName: \"kubernetes.io/projected/85565888-6622-4dfc-9198-8e9c5b05cc75-kube-api-access-75wzt\") pod \"nova-cell0-7729-account-create-update-dttxs\" (UID: \"85565888-6622-4dfc-9198-8e9c5b05cc75\") " pod="openstack/nova-cell0-7729-account-create-update-dttxs" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.751307 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-a33d-account-create-update-6q8g4"] Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.752717 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-a33d-account-create-update-6q8g4" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.763624 4962 generic.go:334] "Generic (PLEG): container finished" podID="b1b02597-c246-43dc-bd85-bebc40c70abf" containerID="c4560e14774e3c9741c91f46ea630363e7cc5935a06c720a5d083bca786e716f" exitCode=143 Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.763747 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-755cb8b5f4-zlzbb" event={"ID":"b1b02597-c246-43dc-bd85-bebc40c70abf","Type":"ContainerDied","Data":"c4560e14774e3c9741c91f46ea630363e7cc5935a06c720a5d083bca786e716f"} Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.765460 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.774221 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-a33d-account-create-update-6q8g4"] Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.782207 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2263355d-2fa1-4b5a-bfc2-9f362df5739d","Type":"ContainerStarted","Data":"b8af66136e35fc73b3d51b7e67a7d05bebe4ecd8a5ad20c914388c6152b5d470"} Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.849860 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/032f830f-9636-4783-a048-00f9b7b22a3a-operator-scripts\") pod \"nova-cell1-a33d-account-create-update-6q8g4\" (UID: \"032f830f-9636-4783-a048-00f9b7b22a3a\") " pod="openstack/nova-cell1-a33d-account-create-update-6q8g4" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.849947 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75wzt\" (UniqueName: \"kubernetes.io/projected/85565888-6622-4dfc-9198-8e9c5b05cc75-kube-api-access-75wzt\") pod \"nova-cell0-7729-account-create-update-dttxs\" (UID: \"85565888-6622-4dfc-9198-8e9c5b05cc75\") " pod="openstack/nova-cell0-7729-account-create-update-dttxs" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.850143 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gs8c5\" (UniqueName: \"kubernetes.io/projected/79394db3-1fa2-4b8f-927a-1cf8085f1df4-kube-api-access-gs8c5\") pod \"nova-cell1-db-create-xnwmz\" (UID: \"79394db3-1fa2-4b8f-927a-1cf8085f1df4\") " pod="openstack/nova-cell1-db-create-xnwmz" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.850183 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkfp6\" (UniqueName: \"kubernetes.io/projected/032f830f-9636-4783-a048-00f9b7b22a3a-kube-api-access-qkfp6\") pod \"nova-cell1-a33d-account-create-update-6q8g4\" (UID: \"032f830f-9636-4783-a048-00f9b7b22a3a\") " pod="openstack/nova-cell1-a33d-account-create-update-6q8g4" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.850214 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/85565888-6622-4dfc-9198-8e9c5b05cc75-operator-scripts\") pod \"nova-cell0-7729-account-create-update-dttxs\" (UID: \"85565888-6622-4dfc-9198-8e9c5b05cc75\") " pod="openstack/nova-cell0-7729-account-create-update-dttxs" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.850248 4962 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/79394db3-1fa2-4b8f-927a-1cf8085f1df4-operator-scripts\") pod \"nova-cell1-db-create-xnwmz\" (UID: \"79394db3-1fa2-4b8f-927a-1cf8085f1df4\") " pod="openstack/nova-cell1-db-create-xnwmz" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.851581 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/79394db3-1fa2-4b8f-927a-1cf8085f1df4-operator-scripts\") pod \"nova-cell1-db-create-xnwmz\" (UID: \"79394db3-1fa2-4b8f-927a-1cf8085f1df4\") " pod="openstack/nova-cell1-db-create-xnwmz" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.853361 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-tbn8g" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.856409 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/85565888-6622-4dfc-9198-8e9c5b05cc75-operator-scripts\") pod \"nova-cell0-7729-account-create-update-dttxs\" (UID: \"85565888-6622-4dfc-9198-8e9c5b05cc75\") " pod="openstack/nova-cell0-7729-account-create-update-dttxs" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.871168 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75wzt\" (UniqueName: \"kubernetes.io/projected/85565888-6622-4dfc-9198-8e9c5b05cc75-kube-api-access-75wzt\") pod \"nova-cell0-7729-account-create-update-dttxs\" (UID: \"85565888-6622-4dfc-9198-8e9c5b05cc75\") " pod="openstack/nova-cell0-7729-account-create-update-dttxs" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.878107 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gs8c5\" (UniqueName: \"kubernetes.io/projected/79394db3-1fa2-4b8f-927a-1cf8085f1df4-kube-api-access-gs8c5\") pod \"nova-cell1-db-create-xnwmz\" (UID: \"79394db3-1fa2-4b8f-927a-1cf8085f1df4\") " pod="openstack/nova-cell1-db-create-xnwmz" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.953075 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkfp6\" (UniqueName: \"kubernetes.io/projected/032f830f-9636-4783-a048-00f9b7b22a3a-kube-api-access-qkfp6\") pod \"nova-cell1-a33d-account-create-update-6q8g4\" (UID: \"032f830f-9636-4783-a048-00f9b7b22a3a\") " pod="openstack/nova-cell1-a33d-account-create-update-6q8g4" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.953172 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/032f830f-9636-4783-a048-00f9b7b22a3a-operator-scripts\") pod \"nova-cell1-a33d-account-create-update-6q8g4\" (UID: \"032f830f-9636-4783-a048-00f9b7b22a3a\") " pod="openstack/nova-cell1-a33d-account-create-update-6q8g4" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.957417 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/032f830f-9636-4783-a048-00f9b7b22a3a-operator-scripts\") pod \"nova-cell1-a33d-account-create-update-6q8g4\" (UID: \"032f830f-9636-4783-a048-00f9b7b22a3a\") " pod="openstack/nova-cell1-a33d-account-create-update-6q8g4" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.976264 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-xnwmz" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.990483 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkfp6\" (UniqueName: \"kubernetes.io/projected/032f830f-9636-4783-a048-00f9b7b22a3a-kube-api-access-qkfp6\") pod \"nova-cell1-a33d-account-create-update-6q8g4\" (UID: \"032f830f-9636-4783-a048-00f9b7b22a3a\") " pod="openstack/nova-cell1-a33d-account-create-update-6q8g4" Feb 20 10:15:45 crc kubenswrapper[4962]: I0220 10:15:45.997726 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-7729-account-create-update-dttxs" Feb 20 10:15:46 crc kubenswrapper[4962]: I0220 10:15:46.285056 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-a33d-account-create-update-6q8g4" Feb 20 10:15:46 crc kubenswrapper[4962]: I0220 10:15:46.350223 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-9xxwl"] Feb 20 10:15:46 crc kubenswrapper[4962]: I0220 10:15:46.535955 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-e96c-account-create-update-zd8bf"] Feb 20 10:15:46 crc kubenswrapper[4962]: I0220 10:15:46.605522 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-tbn8g"] Feb 20 10:15:46 crc kubenswrapper[4962]: W0220 10:15:46.615925 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1f853840_0af1_40ee_b11b_a0a62f9f4ebf.slice/crio-a2f37bfcc6cd8021b2ad498dd933145508c79a393d0cad860e7277204ec6bc9f WatchSource:0}: Error finding container a2f37bfcc6cd8021b2ad498dd933145508c79a393d0cad860e7277204ec6bc9f: Status 404 returned error can't find the container with id a2f37bfcc6cd8021b2ad498dd933145508c79a393d0cad860e7277204ec6bc9f Feb 20 10:15:46 crc kubenswrapper[4962]: I0220 10:15:46.781694 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-7729-account-create-update-dttxs"] Feb 20 10:15:46 crc kubenswrapper[4962]: I0220 10:15:46.798751 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-e96c-account-create-update-zd8bf" event={"ID":"84f50d98-6178-44d4-8ac4-43a8df4e3339","Type":"ContainerStarted","Data":"cca63b6df77c4f05f324336e287d723bbb4b8475a0e294b3803987c9653e4132"} Feb 20 10:15:46 crc kubenswrapper[4962]: I0220 10:15:46.801046 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-tbn8g" event={"ID":"1f853840-0af1-40ee-b11b-a0a62f9f4ebf","Type":"ContainerStarted","Data":"a2f37bfcc6cd8021b2ad498dd933145508c79a393d0cad860e7277204ec6bc9f"} Feb 20 10:15:46 crc kubenswrapper[4962]: I0220 10:15:46.802925 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-9xxwl" event={"ID":"7da93993-8b14-45f6-8d0b-8366becc762e","Type":"ContainerStarted","Data":"d5939f243f85e996e6d1902bb72680f0a5c1df9ab42c709cd744434161fb2db0"} Feb 20 10:15:46 crc kubenswrapper[4962]: I0220 10:15:46.802953 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-9xxwl" event={"ID":"7da93993-8b14-45f6-8d0b-8366becc762e","Type":"ContainerStarted","Data":"bc0814885963d26026e551a006a093acd6a52246d2b26709ab56c1be71ccdd20"} Feb 20 10:15:46 crc kubenswrapper[4962]: I0220 10:15:46.809620 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"2263355d-2fa1-4b5a-bfc2-9f362df5739d","Type":"ContainerStarted","Data":"bc2106ad3cc4af20a5e2c1213babb01b766f8accacbdaf4870b68d6cbc722d49"} Feb 20 10:15:46 crc kubenswrapper[4962]: I0220 10:15:46.879268 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-9xxwl" podStartSLOduration=1.879237405 podStartE2EDuration="1.879237405s" podCreationTimestamp="2026-02-20 10:15:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:15:46.832222558 +0000 UTC m=+1238.414694424" watchObservedRunningTime="2026-02-20 10:15:46.879237405 +0000 UTC m=+1238.461709251" Feb 20 10:15:46 crc kubenswrapper[4962]: I0220 10:15:46.889154 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-xnwmz"] Feb 20 10:15:46 crc kubenswrapper[4962]: I0220 10:15:46.912020 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 20 10:15:46 crc kubenswrapper[4962]: I0220 10:15:46.912176 4962 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 20 10:15:47 crc kubenswrapper[4962]: I0220 10:15:47.070639 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-a33d-account-create-update-6q8g4"] Feb 20 10:15:47 crc kubenswrapper[4962]: I0220 10:15:47.117381 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 20 10:15:47 crc kubenswrapper[4962]: I0220 10:15:47.117534 4962 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 20 10:15:47 crc kubenswrapper[4962]: I0220 10:15:47.129281 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 20 10:15:47 crc kubenswrapper[4962]: I0220 10:15:47.139464 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 20 10:15:47 crc kubenswrapper[4962]: I0220 10:15:47.822213 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-a33d-account-create-update-6q8g4" event={"ID":"032f830f-9636-4783-a048-00f9b7b22a3a","Type":"ContainerStarted","Data":"7230277cc1eb3909a3d3342c6f5ba88bcf14bbf39fe46da73616efba87702b09"} Feb 20 10:15:47 crc kubenswrapper[4962]: I0220 10:15:47.822483 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-a33d-account-create-update-6q8g4" event={"ID":"032f830f-9636-4783-a048-00f9b7b22a3a","Type":"ContainerStarted","Data":"e4727e33fbaf4f2fd6f0e77e50474250b81bce0793abe10591d440e3d3f794f9"} Feb 20 10:15:47 crc kubenswrapper[4962]: I0220 10:15:47.824834 4962 generic.go:334] "Generic (PLEG): container finished" podID="84f50d98-6178-44d4-8ac4-43a8df4e3339" containerID="e1788ed30c723d96dcb6e0f9484b28a97145a65cc9e3bff73edd5bbbf2ff0b13" exitCode=0 Feb 20 10:15:47 crc kubenswrapper[4962]: I0220 10:15:47.824884 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-e96c-account-create-update-zd8bf" event={"ID":"84f50d98-6178-44d4-8ac4-43a8df4e3339","Type":"ContainerDied","Data":"e1788ed30c723d96dcb6e0f9484b28a97145a65cc9e3bff73edd5bbbf2ff0b13"} Feb 20 10:15:47 crc kubenswrapper[4962]: I0220 10:15:47.826400 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-xnwmz" 
event={"ID":"79394db3-1fa2-4b8f-927a-1cf8085f1df4","Type":"ContainerStarted","Data":"793db344d89e9339466a1f19a2e137b204724f58c385b41b9c74536f0d99e12b"} Feb 20 10:15:47 crc kubenswrapper[4962]: I0220 10:15:47.826427 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-xnwmz" event={"ID":"79394db3-1fa2-4b8f-927a-1cf8085f1df4","Type":"ContainerStarted","Data":"dd7896a5012e73b8ddeadf1951baeb976b785e72dfec1740274bc3a01d4e93d0"} Feb 20 10:15:47 crc kubenswrapper[4962]: I0220 10:15:47.828008 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-tbn8g" event={"ID":"1f853840-0af1-40ee-b11b-a0a62f9f4ebf","Type":"ContainerStarted","Data":"e3155d74dd6282e1fc794d27b2b712bbcb47529b5f2fbb3e8f768bc271110d45"} Feb 20 10:15:47 crc kubenswrapper[4962]: I0220 10:15:47.832402 4962 generic.go:334] "Generic (PLEG): container finished" podID="7da93993-8b14-45f6-8d0b-8366becc762e" containerID="d5939f243f85e996e6d1902bb72680f0a5c1df9ab42c709cd744434161fb2db0" exitCode=0 Feb 20 10:15:47 crc kubenswrapper[4962]: I0220 10:15:47.832455 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-9xxwl" event={"ID":"7da93993-8b14-45f6-8d0b-8366becc762e","Type":"ContainerDied","Data":"d5939f243f85e996e6d1902bb72680f0a5c1df9ab42c709cd744434161fb2db0"} Feb 20 10:15:47 crc kubenswrapper[4962]: I0220 10:15:47.836031 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2263355d-2fa1-4b5a-bfc2-9f362df5739d","Type":"ContainerStarted","Data":"0c7306cb64431bbfbfccbec9d4784b736bd29c8703a60d357986ec36fd19a276"} Feb 20 10:15:47 crc kubenswrapper[4962]: I0220 10:15:47.836888 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 20 10:15:47 crc kubenswrapper[4962]: I0220 10:15:47.842582 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-7729-account-create-update-dttxs" event={"ID":"85565888-6622-4dfc-9198-8e9c5b05cc75","Type":"ContainerStarted","Data":"e14d4499aad39130d8942043e6328de4fbc415b007670a0434fde9be884215b2"} Feb 20 10:15:47 crc kubenswrapper[4962]: I0220 10:15:47.842638 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-7729-account-create-update-dttxs" event={"ID":"85565888-6622-4dfc-9198-8e9c5b05cc75","Type":"ContainerStarted","Data":"d4010cdeffdef09dabf121775c79c0d5d454ac777c49cc0bb7a0999ba9e9cc4c"} Feb 20 10:15:47 crc kubenswrapper[4962]: I0220 10:15:47.846370 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-a33d-account-create-update-6q8g4" podStartSLOduration=2.846355096 podStartE2EDuration="2.846355096s" podCreationTimestamp="2026-02-20 10:15:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:15:47.843566681 +0000 UTC m=+1239.426038527" watchObservedRunningTime="2026-02-20 10:15:47.846355096 +0000 UTC m=+1239.428826932" Feb 20 10:15:47 crc kubenswrapper[4962]: I0220 10:15:47.882730 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-7729-account-create-update-dttxs" podStartSLOduration=2.882709116 podStartE2EDuration="2.882709116s" podCreationTimestamp="2026-02-20 10:15:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:15:47.859552454 +0000 UTC m=+1239.442024300" 
watchObservedRunningTime="2026-02-20 10:15:47.882709116 +0000 UTC m=+1239.465180962" Feb 20 10:15:47 crc kubenswrapper[4962]: I0220 10:15:47.926574 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-tbn8g" podStartSLOduration=2.926548517 podStartE2EDuration="2.926548517s" podCreationTimestamp="2026-02-20 10:15:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:15:47.915990952 +0000 UTC m=+1239.498462798" watchObservedRunningTime="2026-02-20 10:15:47.926548517 +0000 UTC m=+1239.509020363" Feb 20 10:15:47 crc kubenswrapper[4962]: I0220 10:15:47.941480 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.533466568 podStartE2EDuration="5.941451435s" podCreationTimestamp="2026-02-20 10:15:42 +0000 UTC" firstStartedPulling="2026-02-20 10:15:43.633561896 +0000 UTC m=+1235.216033742" lastFinishedPulling="2026-02-20 10:15:47.041546773 +0000 UTC m=+1238.624018609" observedRunningTime="2026-02-20 10:15:47.936892605 +0000 UTC m=+1239.519364461" watchObservedRunningTime="2026-02-20 10:15:47.941451435 +0000 UTC m=+1239.523923301" Feb 20 10:15:47 crc kubenswrapper[4962]: I0220 10:15:47.960285 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-xnwmz" podStartSLOduration=2.960260815 podStartE2EDuration="2.960260815s" podCreationTimestamp="2026-02-20 10:15:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:15:47.955064295 +0000 UTC m=+1239.537536141" watchObservedRunningTime="2026-02-20 10:15:47.960260815 +0000 UTC m=+1239.542732661" Feb 20 10:15:48 crc kubenswrapper[4962]: I0220 10:15:48.091933 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 20 10:15:48 crc kubenswrapper[4962]: I0220 10:15:48.850585 4962 generic.go:334] "Generic (PLEG): container finished" podID="85565888-6622-4dfc-9198-8e9c5b05cc75" containerID="e14d4499aad39130d8942043e6328de4fbc415b007670a0434fde9be884215b2" exitCode=0 Feb 20 10:15:48 crc kubenswrapper[4962]: I0220 10:15:48.850790 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-7729-account-create-update-dttxs" event={"ID":"85565888-6622-4dfc-9198-8e9c5b05cc75","Type":"ContainerDied","Data":"e14d4499aad39130d8942043e6328de4fbc415b007670a0434fde9be884215b2"} Feb 20 10:15:48 crc kubenswrapper[4962]: I0220 10:15:48.857237 4962 generic.go:334] "Generic (PLEG): container finished" podID="b1b02597-c246-43dc-bd85-bebc40c70abf" containerID="bfe2a2311075991b6e26f61913d5319a6a3da98a5127862535ec8779ac2e9fce" exitCode=0 Feb 20 10:15:48 crc kubenswrapper[4962]: I0220 10:15:48.857320 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-755cb8b5f4-zlzbb" event={"ID":"b1b02597-c246-43dc-bd85-bebc40c70abf","Type":"ContainerDied","Data":"bfe2a2311075991b6e26f61913d5319a6a3da98a5127862535ec8779ac2e9fce"} Feb 20 10:15:48 crc kubenswrapper[4962]: I0220 10:15:48.857367 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-755cb8b5f4-zlzbb" event={"ID":"b1b02597-c246-43dc-bd85-bebc40c70abf","Type":"ContainerDied","Data":"278c9072e567ac676f1ff447db5bfcb24f5eba477a61436baf57c6f5bf95aba9"} Feb 20 10:15:48 crc kubenswrapper[4962]: I0220 10:15:48.857382 4962 pod_container_deletor.go:80] "Container not 
found in pod's containers" containerID="278c9072e567ac676f1ff447db5bfcb24f5eba477a61436baf57c6f5bf95aba9" Feb 20 10:15:48 crc kubenswrapper[4962]: I0220 10:15:48.858572 4962 generic.go:334] "Generic (PLEG): container finished" podID="032f830f-9636-4783-a048-00f9b7b22a3a" containerID="7230277cc1eb3909a3d3342c6f5ba88bcf14bbf39fe46da73616efba87702b09" exitCode=0 Feb 20 10:15:48 crc kubenswrapper[4962]: I0220 10:15:48.858636 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-a33d-account-create-update-6q8g4" event={"ID":"032f830f-9636-4783-a048-00f9b7b22a3a","Type":"ContainerDied","Data":"7230277cc1eb3909a3d3342c6f5ba88bcf14bbf39fe46da73616efba87702b09"} Feb 20 10:15:48 crc kubenswrapper[4962]: I0220 10:15:48.859734 4962 generic.go:334] "Generic (PLEG): container finished" podID="79394db3-1fa2-4b8f-927a-1cf8085f1df4" containerID="793db344d89e9339466a1f19a2e137b204724f58c385b41b9c74536f0d99e12b" exitCode=0 Feb 20 10:15:48 crc kubenswrapper[4962]: I0220 10:15:48.859784 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-xnwmz" event={"ID":"79394db3-1fa2-4b8f-927a-1cf8085f1df4","Type":"ContainerDied","Data":"793db344d89e9339466a1f19a2e137b204724f58c385b41b9c74536f0d99e12b"} Feb 20 10:15:48 crc kubenswrapper[4962]: I0220 10:15:48.860837 4962 generic.go:334] "Generic (PLEG): container finished" podID="1f853840-0af1-40ee-b11b-a0a62f9f4ebf" containerID="e3155d74dd6282e1fc794d27b2b712bbcb47529b5f2fbb3e8f768bc271110d45" exitCode=0 Feb 20 10:15:48 crc kubenswrapper[4962]: I0220 10:15:48.861033 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-tbn8g" event={"ID":"1f853840-0af1-40ee-b11b-a0a62f9f4ebf","Type":"ContainerDied","Data":"e3155d74dd6282e1fc794d27b2b712bbcb47529b5f2fbb3e8f768bc271110d45"} Feb 20 10:15:48 crc kubenswrapper[4962]: I0220 10:15:48.904213 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-755cb8b5f4-zlzbb" Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 10:15:49.051380 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b1b02597-c246-43dc-bd85-bebc40c70abf-logs\") pod \"b1b02597-c246-43dc-bd85-bebc40c70abf\" (UID: \"b1b02597-c246-43dc-bd85-bebc40c70abf\") " Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 10:15:49.051434 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1b02597-c246-43dc-bd85-bebc40c70abf-combined-ca-bundle\") pod \"b1b02597-c246-43dc-bd85-bebc40c70abf\" (UID: \"b1b02597-c246-43dc-bd85-bebc40c70abf\") " Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 10:15:49.051655 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1b02597-c246-43dc-bd85-bebc40c70abf-internal-tls-certs\") pod \"b1b02597-c246-43dc-bd85-bebc40c70abf\" (UID: \"b1b02597-c246-43dc-bd85-bebc40c70abf\") " Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 10:15:49.051686 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1b02597-c246-43dc-bd85-bebc40c70abf-scripts\") pod \"b1b02597-c246-43dc-bd85-bebc40c70abf\" (UID: \"b1b02597-c246-43dc-bd85-bebc40c70abf\") " Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 10:15:49.051730 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1b02597-c246-43dc-bd85-bebc40c70abf-config-data\") pod \"b1b02597-c246-43dc-bd85-bebc40c70abf\" (UID: \"b1b02597-c246-43dc-bd85-bebc40c70abf\") " Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 10:15:49.051815 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1b02597-c246-43dc-bd85-bebc40c70abf-public-tls-certs\") pod \"b1b02597-c246-43dc-bd85-bebc40c70abf\" (UID: \"b1b02597-c246-43dc-bd85-bebc40c70abf\") " Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 10:15:49.051860 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ch9vq\" (UniqueName: \"kubernetes.io/projected/b1b02597-c246-43dc-bd85-bebc40c70abf-kube-api-access-ch9vq\") pod \"b1b02597-c246-43dc-bd85-bebc40c70abf\" (UID: \"b1b02597-c246-43dc-bd85-bebc40c70abf\") " Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 10:15:49.058061 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1b02597-c246-43dc-bd85-bebc40c70abf-logs" (OuterVolumeSpecName: "logs") pod "b1b02597-c246-43dc-bd85-bebc40c70abf" (UID: "b1b02597-c246-43dc-bd85-bebc40c70abf"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 10:15:49.061285 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1b02597-c246-43dc-bd85-bebc40c70abf-kube-api-access-ch9vq" (OuterVolumeSpecName: "kube-api-access-ch9vq") pod "b1b02597-c246-43dc-bd85-bebc40c70abf" (UID: "b1b02597-c246-43dc-bd85-bebc40c70abf"). InnerVolumeSpecName "kube-api-access-ch9vq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 10:15:49.069737 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1b02597-c246-43dc-bd85-bebc40c70abf-scripts" (OuterVolumeSpecName: "scripts") pod "b1b02597-c246-43dc-bd85-bebc40c70abf" (UID: "b1b02597-c246-43dc-bd85-bebc40c70abf"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 10:15:49.224890 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1b02597-c246-43dc-bd85-bebc40c70abf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b1b02597-c246-43dc-bd85-bebc40c70abf" (UID: "b1b02597-c246-43dc-bd85-bebc40c70abf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 10:15:49.243711 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b1b02597-c246-43dc-bd85-bebc40c70abf-logs\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 10:15:49.244527 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1b02597-c246-43dc-bd85-bebc40c70abf-config-data" (OuterVolumeSpecName: "config-data") pod "b1b02597-c246-43dc-bd85-bebc40c70abf" (UID: "b1b02597-c246-43dc-bd85-bebc40c70abf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 10:15:49.245371 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1b02597-c246-43dc-bd85-bebc40c70abf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 10:15:49.245631 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1b02597-c246-43dc-bd85-bebc40c70abf-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 10:15:49.245648 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ch9vq\" (UniqueName: \"kubernetes.io/projected/b1b02597-c246-43dc-bd85-bebc40c70abf-kube-api-access-ch9vq\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 10:15:49.348068 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1b02597-c246-43dc-bd85-bebc40c70abf-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 10:15:49.355714 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1b02597-c246-43dc-bd85-bebc40c70abf-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "b1b02597-c246-43dc-bd85-bebc40c70abf" (UID: "b1b02597-c246-43dc-bd85-bebc40c70abf"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 10:15:49.355843 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1b02597-c246-43dc-bd85-bebc40c70abf-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "b1b02597-c246-43dc-bd85-bebc40c70abf" (UID: "b1b02597-c246-43dc-bd85-bebc40c70abf"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 10:15:49.386888 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-e96c-account-create-update-zd8bf" Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 10:15:49.413031 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-9xxwl" Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 10:15:49.453522 4962 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1b02597-c246-43dc-bd85-bebc40c70abf-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 10:15:49.453551 4962 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1b02597-c246-43dc-bd85-bebc40c70abf-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 10:15:49.554419 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjtsg\" (UniqueName: \"kubernetes.io/projected/7da93993-8b14-45f6-8d0b-8366becc762e-kube-api-access-vjtsg\") pod \"7da93993-8b14-45f6-8d0b-8366becc762e\" (UID: \"7da93993-8b14-45f6-8d0b-8366becc762e\") " Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 10:15:49.554539 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84f50d98-6178-44d4-8ac4-43a8df4e3339-operator-scripts\") pod \"84f50d98-6178-44d4-8ac4-43a8df4e3339\" (UID: \"84f50d98-6178-44d4-8ac4-43a8df4e3339\") " Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 10:15:49.554698 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n6njl\" (UniqueName: \"kubernetes.io/projected/84f50d98-6178-44d4-8ac4-43a8df4e3339-kube-api-access-n6njl\") pod \"84f50d98-6178-44d4-8ac4-43a8df4e3339\" (UID: \"84f50d98-6178-44d4-8ac4-43a8df4e3339\") " Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 10:15:49.555023 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7da93993-8b14-45f6-8d0b-8366becc762e-operator-scripts\") pod \"7da93993-8b14-45f6-8d0b-8366becc762e\" (UID: \"7da93993-8b14-45f6-8d0b-8366becc762e\") " Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 10:15:49.555764 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84f50d98-6178-44d4-8ac4-43a8df4e3339-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "84f50d98-6178-44d4-8ac4-43a8df4e3339" (UID: "84f50d98-6178-44d4-8ac4-43a8df4e3339"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 10:15:49.557563 4962 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84f50d98-6178-44d4-8ac4-43a8df4e3339-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 10:15:49.558042 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7da93993-8b14-45f6-8d0b-8366becc762e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7da93993-8b14-45f6-8d0b-8366becc762e" (UID: "7da93993-8b14-45f6-8d0b-8366becc762e"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 10:15:49.558659 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7da93993-8b14-45f6-8d0b-8366becc762e-kube-api-access-vjtsg" (OuterVolumeSpecName: "kube-api-access-vjtsg") pod "7da93993-8b14-45f6-8d0b-8366becc762e" (UID: "7da93993-8b14-45f6-8d0b-8366becc762e"). InnerVolumeSpecName "kube-api-access-vjtsg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 10:15:49.561945 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84f50d98-6178-44d4-8ac4-43a8df4e3339-kube-api-access-n6njl" (OuterVolumeSpecName: "kube-api-access-n6njl") pod "84f50d98-6178-44d4-8ac4-43a8df4e3339" (UID: "84f50d98-6178-44d4-8ac4-43a8df4e3339"). InnerVolumeSpecName "kube-api-access-n6njl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 10:15:49.660438 4962 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7da93993-8b14-45f6-8d0b-8366becc762e-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 10:15:49.660492 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vjtsg\" (UniqueName: \"kubernetes.io/projected/7da93993-8b14-45f6-8d0b-8366becc762e-kube-api-access-vjtsg\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 10:15:49.660505 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n6njl\" (UniqueName: \"kubernetes.io/projected/84f50d98-6178-44d4-8ac4-43a8df4e3339-kube-api-access-n6njl\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 10:15:49.885220 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-e96c-account-create-update-zd8bf" event={"ID":"84f50d98-6178-44d4-8ac4-43a8df4e3339","Type":"ContainerDied","Data":"cca63b6df77c4f05f324336e287d723bbb4b8475a0e294b3803987c9653e4132"} Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 10:15:49.885635 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cca63b6df77c4f05f324336e287d723bbb4b8475a0e294b3803987c9653e4132" Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 10:15:49.885386 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-e96c-account-create-update-zd8bf" Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 10:15:49.890635 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-755cb8b5f4-zlzbb" Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 10:15:49.890477 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-9xxwl" event={"ID":"7da93993-8b14-45f6-8d0b-8366becc762e","Type":"ContainerDied","Data":"bc0814885963d26026e551a006a093acd6a52246d2b26709ab56c1be71ccdd20"} Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 10:15:49.890722 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc0814885963d26026e551a006a093acd6a52246d2b26709ab56c1be71ccdd20" Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 10:15:49.890638 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-9xxwl" Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 10:15:49.892168 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2263355d-2fa1-4b5a-bfc2-9f362df5739d" containerName="ceilometer-central-agent" containerID="cri-o://28df8a32fe5a1bd334afa755bb83b0ac292979f42bd8a6975cdf978af2b8b6b7" gracePeriod=30 Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 10:15:49.892337 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2263355d-2fa1-4b5a-bfc2-9f362df5739d" containerName="sg-core" containerID="cri-o://bc2106ad3cc4af20a5e2c1213babb01b766f8accacbdaf4870b68d6cbc722d49" gracePeriod=30 Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 10:15:49.892387 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2263355d-2fa1-4b5a-bfc2-9f362df5739d" containerName="ceilometer-notification-agent" containerID="cri-o://b8af66136e35fc73b3d51b7e67a7d05bebe4ecd8a5ad20c914388c6152b5d470" gracePeriod=30 Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 10:15:49.892438 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2263355d-2fa1-4b5a-bfc2-9f362df5739d" containerName="proxy-httpd" containerID="cri-o://0c7306cb64431bbfbfccbec9d4784b736bd29c8703a60d357986ec36fd19a276" gracePeriod=30 Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 10:15:49.946832 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-755cb8b5f4-zlzbb"] Feb 20 10:15:49 crc kubenswrapper[4962]: I0220 10:15:49.963417 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-755cb8b5f4-zlzbb"] Feb 20 10:15:50 crc kubenswrapper[4962]: I0220 10:15:50.513651 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-xnwmz" Feb 20 10:15:50 crc kubenswrapper[4962]: I0220 10:15:50.588617 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/79394db3-1fa2-4b8f-927a-1cf8085f1df4-operator-scripts\") pod \"79394db3-1fa2-4b8f-927a-1cf8085f1df4\" (UID: \"79394db3-1fa2-4b8f-927a-1cf8085f1df4\") " Feb 20 10:15:50 crc kubenswrapper[4962]: I0220 10:15:50.588927 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gs8c5\" (UniqueName: \"kubernetes.io/projected/79394db3-1fa2-4b8f-927a-1cf8085f1df4-kube-api-access-gs8c5\") pod \"79394db3-1fa2-4b8f-927a-1cf8085f1df4\" (UID: \"79394db3-1fa2-4b8f-927a-1cf8085f1df4\") " Feb 20 10:15:50 crc kubenswrapper[4962]: I0220 10:15:50.590409 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79394db3-1fa2-4b8f-927a-1cf8085f1df4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "79394db3-1fa2-4b8f-927a-1cf8085f1df4" (UID: "79394db3-1fa2-4b8f-927a-1cf8085f1df4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:15:50 crc kubenswrapper[4962]: I0220 10:15:50.598803 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79394db3-1fa2-4b8f-927a-1cf8085f1df4-kube-api-access-gs8c5" (OuterVolumeSpecName: "kube-api-access-gs8c5") pod "79394db3-1fa2-4b8f-927a-1cf8085f1df4" (UID: "79394db3-1fa2-4b8f-927a-1cf8085f1df4"). 
InnerVolumeSpecName "kube-api-access-gs8c5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:15:50 crc kubenswrapper[4962]: I0220 10:15:50.690954 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gs8c5\" (UniqueName: \"kubernetes.io/projected/79394db3-1fa2-4b8f-927a-1cf8085f1df4-kube-api-access-gs8c5\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:50 crc kubenswrapper[4962]: I0220 10:15:50.691000 4962 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/79394db3-1fa2-4b8f-927a-1cf8085f1df4-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:50 crc kubenswrapper[4962]: I0220 10:15:50.708792 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-tbn8g" Feb 20 10:15:50 crc kubenswrapper[4962]: I0220 10:15:50.710879 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-a33d-account-create-update-6q8g4" Feb 20 10:15:50 crc kubenswrapper[4962]: I0220 10:15:50.751353 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-7729-account-create-update-dttxs" Feb 20 10:15:50 crc kubenswrapper[4962]: I0220 10:15:50.792444 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/032f830f-9636-4783-a048-00f9b7b22a3a-operator-scripts\") pod \"032f830f-9636-4783-a048-00f9b7b22a3a\" (UID: \"032f830f-9636-4783-a048-00f9b7b22a3a\") " Feb 20 10:15:50 crc kubenswrapper[4962]: I0220 10:15:50.792534 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tcjqp\" (UniqueName: \"kubernetes.io/projected/1f853840-0af1-40ee-b11b-a0a62f9f4ebf-kube-api-access-tcjqp\") pod \"1f853840-0af1-40ee-b11b-a0a62f9f4ebf\" (UID: \"1f853840-0af1-40ee-b11b-a0a62f9f4ebf\") " Feb 20 10:15:50 crc kubenswrapper[4962]: I0220 10:15:50.792689 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qkfp6\" (UniqueName: \"kubernetes.io/projected/032f830f-9636-4783-a048-00f9b7b22a3a-kube-api-access-qkfp6\") pod \"032f830f-9636-4783-a048-00f9b7b22a3a\" (UID: \"032f830f-9636-4783-a048-00f9b7b22a3a\") " Feb 20 10:15:50 crc kubenswrapper[4962]: I0220 10:15:50.792933 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1f853840-0af1-40ee-b11b-a0a62f9f4ebf-operator-scripts\") pod \"1f853840-0af1-40ee-b11b-a0a62f9f4ebf\" (UID: \"1f853840-0af1-40ee-b11b-a0a62f9f4ebf\") " Feb 20 10:15:50 crc kubenswrapper[4962]: I0220 10:15:50.793904 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f853840-0af1-40ee-b11b-a0a62f9f4ebf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1f853840-0af1-40ee-b11b-a0a62f9f4ebf" (UID: "1f853840-0af1-40ee-b11b-a0a62f9f4ebf"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:15:50 crc kubenswrapper[4962]: I0220 10:15:50.795164 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/032f830f-9636-4783-a048-00f9b7b22a3a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "032f830f-9636-4783-a048-00f9b7b22a3a" (UID: "032f830f-9636-4783-a048-00f9b7b22a3a"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:15:50 crc kubenswrapper[4962]: I0220 10:15:50.799361 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/032f830f-9636-4783-a048-00f9b7b22a3a-kube-api-access-qkfp6" (OuterVolumeSpecName: "kube-api-access-qkfp6") pod "032f830f-9636-4783-a048-00f9b7b22a3a" (UID: "032f830f-9636-4783-a048-00f9b7b22a3a"). InnerVolumeSpecName "kube-api-access-qkfp6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:15:50 crc kubenswrapper[4962]: I0220 10:15:50.810810 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f853840-0af1-40ee-b11b-a0a62f9f4ebf-kube-api-access-tcjqp" (OuterVolumeSpecName: "kube-api-access-tcjqp") pod "1f853840-0af1-40ee-b11b-a0a62f9f4ebf" (UID: "1f853840-0af1-40ee-b11b-a0a62f9f4ebf"). InnerVolumeSpecName "kube-api-access-tcjqp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:15:50 crc kubenswrapper[4962]: I0220 10:15:50.895906 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75wzt\" (UniqueName: \"kubernetes.io/projected/85565888-6622-4dfc-9198-8e9c5b05cc75-kube-api-access-75wzt\") pod \"85565888-6622-4dfc-9198-8e9c5b05cc75\" (UID: \"85565888-6622-4dfc-9198-8e9c5b05cc75\") " Feb 20 10:15:50 crc kubenswrapper[4962]: I0220 10:15:50.896435 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/85565888-6622-4dfc-9198-8e9c5b05cc75-operator-scripts\") pod \"85565888-6622-4dfc-9198-8e9c5b05cc75\" (UID: \"85565888-6622-4dfc-9198-8e9c5b05cc75\") " Feb 20 10:15:50 crc kubenswrapper[4962]: I0220 10:15:50.897050 4962 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1f853840-0af1-40ee-b11b-a0a62f9f4ebf-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:50 crc kubenswrapper[4962]: I0220 10:15:50.897064 4962 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/032f830f-9636-4783-a048-00f9b7b22a3a-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:50 crc kubenswrapper[4962]: I0220 10:15:50.897078 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tcjqp\" (UniqueName: \"kubernetes.io/projected/1f853840-0af1-40ee-b11b-a0a62f9f4ebf-kube-api-access-tcjqp\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:50 crc kubenswrapper[4962]: I0220 10:15:50.897091 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qkfp6\" (UniqueName: \"kubernetes.io/projected/032f830f-9636-4783-a048-00f9b7b22a3a-kube-api-access-qkfp6\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:50 crc kubenswrapper[4962]: I0220 10:15:50.897566 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85565888-6622-4dfc-9198-8e9c5b05cc75-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "85565888-6622-4dfc-9198-8e9c5b05cc75" (UID: "85565888-6622-4dfc-9198-8e9c5b05cc75"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:15:50 crc kubenswrapper[4962]: I0220 10:15:50.903744 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85565888-6622-4dfc-9198-8e9c5b05cc75-kube-api-access-75wzt" (OuterVolumeSpecName: "kube-api-access-75wzt") pod "85565888-6622-4dfc-9198-8e9c5b05cc75" (UID: "85565888-6622-4dfc-9198-8e9c5b05cc75"). InnerVolumeSpecName "kube-api-access-75wzt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:15:50 crc kubenswrapper[4962]: I0220 10:15:50.934888 4962 generic.go:334] "Generic (PLEG): container finished" podID="2263355d-2fa1-4b5a-bfc2-9f362df5739d" containerID="0c7306cb64431bbfbfccbec9d4784b736bd29c8703a60d357986ec36fd19a276" exitCode=0 Feb 20 10:15:50 crc kubenswrapper[4962]: I0220 10:15:50.934925 4962 generic.go:334] "Generic (PLEG): container finished" podID="2263355d-2fa1-4b5a-bfc2-9f362df5739d" containerID="bc2106ad3cc4af20a5e2c1213babb01b766f8accacbdaf4870b68d6cbc722d49" exitCode=2 Feb 20 10:15:50 crc kubenswrapper[4962]: I0220 10:15:50.934932 4962 generic.go:334] "Generic (PLEG): container finished" podID="2263355d-2fa1-4b5a-bfc2-9f362df5739d" containerID="b8af66136e35fc73b3d51b7e67a7d05bebe4ecd8a5ad20c914388c6152b5d470" exitCode=0 Feb 20 10:15:50 crc kubenswrapper[4962]: I0220 10:15:50.934982 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2263355d-2fa1-4b5a-bfc2-9f362df5739d","Type":"ContainerDied","Data":"0c7306cb64431bbfbfccbec9d4784b736bd29c8703a60d357986ec36fd19a276"} Feb 20 10:15:50 crc kubenswrapper[4962]: I0220 10:15:50.935013 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2263355d-2fa1-4b5a-bfc2-9f362df5739d","Type":"ContainerDied","Data":"bc2106ad3cc4af20a5e2c1213babb01b766f8accacbdaf4870b68d6cbc722d49"} Feb 20 10:15:50 crc kubenswrapper[4962]: I0220 10:15:50.935026 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2263355d-2fa1-4b5a-bfc2-9f362df5739d","Type":"ContainerDied","Data":"b8af66136e35fc73b3d51b7e67a7d05bebe4ecd8a5ad20c914388c6152b5d470"} Feb 20 10:15:50 crc kubenswrapper[4962]: I0220 10:15:50.939267 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-7729-account-create-update-dttxs" event={"ID":"85565888-6622-4dfc-9198-8e9c5b05cc75","Type":"ContainerDied","Data":"d4010cdeffdef09dabf121775c79c0d5d454ac777c49cc0bb7a0999ba9e9cc4c"} Feb 20 10:15:50 crc kubenswrapper[4962]: I0220 10:15:50.939295 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d4010cdeffdef09dabf121775c79c0d5d454ac777c49cc0bb7a0999ba9e9cc4c" Feb 20 10:15:50 crc kubenswrapper[4962]: I0220 10:15:50.939352 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-7729-account-create-update-dttxs" Feb 20 10:15:50 crc kubenswrapper[4962]: I0220 10:15:50.950313 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-a33d-account-create-update-6q8g4" event={"ID":"032f830f-9636-4783-a048-00f9b7b22a3a","Type":"ContainerDied","Data":"e4727e33fbaf4f2fd6f0e77e50474250b81bce0793abe10591d440e3d3f794f9"} Feb 20 10:15:50 crc kubenswrapper[4962]: I0220 10:15:50.950360 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e4727e33fbaf4f2fd6f0e77e50474250b81bce0793abe10591d440e3d3f794f9" Feb 20 10:15:50 crc kubenswrapper[4962]: I0220 10:15:50.950448 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-a33d-account-create-update-6q8g4" Feb 20 10:15:50 crc kubenswrapper[4962]: I0220 10:15:50.956850 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-xnwmz" Feb 20 10:15:50 crc kubenswrapper[4962]: I0220 10:15:50.956780 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-xnwmz" event={"ID":"79394db3-1fa2-4b8f-927a-1cf8085f1df4","Type":"ContainerDied","Data":"dd7896a5012e73b8ddeadf1951baeb976b785e72dfec1740274bc3a01d4e93d0"} Feb 20 10:15:50 crc kubenswrapper[4962]: I0220 10:15:50.957552 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd7896a5012e73b8ddeadf1951baeb976b785e72dfec1740274bc3a01d4e93d0" Feb 20 10:15:50 crc kubenswrapper[4962]: I0220 10:15:50.962473 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-tbn8g" event={"ID":"1f853840-0af1-40ee-b11b-a0a62f9f4ebf","Type":"ContainerDied","Data":"a2f37bfcc6cd8021b2ad498dd933145508c79a393d0cad860e7277204ec6bc9f"} Feb 20 10:15:50 crc kubenswrapper[4962]: I0220 10:15:50.962520 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a2f37bfcc6cd8021b2ad498dd933145508c79a393d0cad860e7277204ec6bc9f" Feb 20 10:15:50 crc kubenswrapper[4962]: I0220 10:15:50.962623 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-tbn8g" Feb 20 10:15:51 crc kubenswrapper[4962]: I0220 10:15:51.009089 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75wzt\" (UniqueName: \"kubernetes.io/projected/85565888-6622-4dfc-9198-8e9c5b05cc75-kube-api-access-75wzt\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:51 crc kubenswrapper[4962]: I0220 10:15:51.009145 4962 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/85565888-6622-4dfc-9198-8e9c5b05cc75-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:51 crc kubenswrapper[4962]: I0220 10:15:51.152542 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1b02597-c246-43dc-bd85-bebc40c70abf" path="/var/lib/kubelet/pods/b1b02597-c246-43dc-bd85-bebc40c70abf/volumes" Feb 20 10:15:55 crc kubenswrapper[4962]: I0220 10:15:55.974425 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-wbq67"] Feb 20 10:15:55 crc kubenswrapper[4962]: E0220 10:15:55.975912 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="032f830f-9636-4783-a048-00f9b7b22a3a" containerName="mariadb-account-create-update" Feb 20 10:15:55 crc kubenswrapper[4962]: I0220 10:15:55.975934 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="032f830f-9636-4783-a048-00f9b7b22a3a" containerName="mariadb-account-create-update" Feb 20 10:15:55 crc kubenswrapper[4962]: E0220 10:15:55.975951 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85565888-6622-4dfc-9198-8e9c5b05cc75" containerName="mariadb-account-create-update" Feb 20 10:15:55 crc kubenswrapper[4962]: I0220 10:15:55.975958 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="85565888-6622-4dfc-9198-8e9c5b05cc75" containerName="mariadb-account-create-update" Feb 20 10:15:55 crc kubenswrapper[4962]: E0220 10:15:55.975988 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7da93993-8b14-45f6-8d0b-8366becc762e" containerName="mariadb-database-create" Feb 20 10:15:55 crc kubenswrapper[4962]: I0220 10:15:55.975997 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="7da93993-8b14-45f6-8d0b-8366becc762e" containerName="mariadb-database-create" Feb 20 10:15:55 crc kubenswrapper[4962]: E0220 10:15:55.976009 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84f50d98-6178-44d4-8ac4-43a8df4e3339" containerName="mariadb-account-create-update" Feb 20 10:15:55 crc kubenswrapper[4962]: I0220 10:15:55.976020 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="84f50d98-6178-44d4-8ac4-43a8df4e3339" containerName="mariadb-account-create-update" Feb 20 10:15:55 crc kubenswrapper[4962]: E0220 10:15:55.976035 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1b02597-c246-43dc-bd85-bebc40c70abf" containerName="placement-log" Feb 20 10:15:55 crc kubenswrapper[4962]: I0220 10:15:55.976045 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1b02597-c246-43dc-bd85-bebc40c70abf" containerName="placement-log" Feb 20 10:15:55 crc kubenswrapper[4962]: E0220 10:15:55.976057 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f853840-0af1-40ee-b11b-a0a62f9f4ebf" containerName="mariadb-database-create" Feb 20 10:15:55 crc kubenswrapper[4962]: I0220 10:15:55.976065 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f853840-0af1-40ee-b11b-a0a62f9f4ebf" containerName="mariadb-database-create" Feb 20 10:15:55 crc 
kubenswrapper[4962]: E0220 10:15:55.976084 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79394db3-1fa2-4b8f-927a-1cf8085f1df4" containerName="mariadb-database-create" Feb 20 10:15:55 crc kubenswrapper[4962]: I0220 10:15:55.976093 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="79394db3-1fa2-4b8f-927a-1cf8085f1df4" containerName="mariadb-database-create" Feb 20 10:15:55 crc kubenswrapper[4962]: E0220 10:15:55.976108 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1b02597-c246-43dc-bd85-bebc40c70abf" containerName="placement-api" Feb 20 10:15:55 crc kubenswrapper[4962]: I0220 10:15:55.976116 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1b02597-c246-43dc-bd85-bebc40c70abf" containerName="placement-api" Feb 20 10:15:55 crc kubenswrapper[4962]: I0220 10:15:55.976356 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f853840-0af1-40ee-b11b-a0a62f9f4ebf" containerName="mariadb-database-create" Feb 20 10:15:55 crc kubenswrapper[4962]: I0220 10:15:55.976374 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1b02597-c246-43dc-bd85-bebc40c70abf" containerName="placement-log" Feb 20 10:15:55 crc kubenswrapper[4962]: I0220 10:15:55.976389 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1b02597-c246-43dc-bd85-bebc40c70abf" containerName="placement-api" Feb 20 10:15:55 crc kubenswrapper[4962]: I0220 10:15:55.976399 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="032f830f-9636-4783-a048-00f9b7b22a3a" containerName="mariadb-account-create-update" Feb 20 10:15:55 crc kubenswrapper[4962]: I0220 10:15:55.976412 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="85565888-6622-4dfc-9198-8e9c5b05cc75" containerName="mariadb-account-create-update" Feb 20 10:15:55 crc kubenswrapper[4962]: I0220 10:15:55.976431 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="79394db3-1fa2-4b8f-927a-1cf8085f1df4" containerName="mariadb-database-create" Feb 20 10:15:55 crc kubenswrapper[4962]: I0220 10:15:55.976441 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="84f50d98-6178-44d4-8ac4-43a8df4e3339" containerName="mariadb-account-create-update" Feb 20 10:15:55 crc kubenswrapper[4962]: I0220 10:15:55.976453 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="7da93993-8b14-45f6-8d0b-8366becc762e" containerName="mariadb-database-create" Feb 20 10:15:55 crc kubenswrapper[4962]: I0220 10:15:55.977380 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-wbq67" Feb 20 10:15:55 crc kubenswrapper[4962]: I0220 10:15:55.980571 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 20 10:15:55 crc kubenswrapper[4962]: I0220 10:15:55.980822 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-mb5nf" Feb 20 10:15:55 crc kubenswrapper[4962]: I0220 10:15:55.986033 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Feb 20 10:15:55 crc kubenswrapper[4962]: I0220 10:15:55.996282 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-wbq67"] Feb 20 10:15:56 crc kubenswrapper[4962]: I0220 10:15:56.048602 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20663c25-09a7-4a31-9994-450f507d4ff1-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-wbq67\" (UID: \"20663c25-09a7-4a31-9994-450f507d4ff1\") " pod="openstack/nova-cell0-conductor-db-sync-wbq67" Feb 20 10:15:56 crc kubenswrapper[4962]: I0220 10:15:56.048700 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20663c25-09a7-4a31-9994-450f507d4ff1-config-data\") pod \"nova-cell0-conductor-db-sync-wbq67\" (UID: \"20663c25-09a7-4a31-9994-450f507d4ff1\") " pod="openstack/nova-cell0-conductor-db-sync-wbq67" Feb 20 10:15:56 crc kubenswrapper[4962]: I0220 10:15:56.048740 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20663c25-09a7-4a31-9994-450f507d4ff1-scripts\") pod \"nova-cell0-conductor-db-sync-wbq67\" (UID: \"20663c25-09a7-4a31-9994-450f507d4ff1\") " pod="openstack/nova-cell0-conductor-db-sync-wbq67" Feb 20 10:15:56 crc kubenswrapper[4962]: I0220 10:15:56.048807 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5kqs\" (UniqueName: \"kubernetes.io/projected/20663c25-09a7-4a31-9994-450f507d4ff1-kube-api-access-r5kqs\") pod \"nova-cell0-conductor-db-sync-wbq67\" (UID: \"20663c25-09a7-4a31-9994-450f507d4ff1\") " pod="openstack/nova-cell0-conductor-db-sync-wbq67" Feb 20 10:15:56 crc kubenswrapper[4962]: I0220 10:15:56.149376 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20663c25-09a7-4a31-9994-450f507d4ff1-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-wbq67\" (UID: \"20663c25-09a7-4a31-9994-450f507d4ff1\") " pod="openstack/nova-cell0-conductor-db-sync-wbq67" Feb 20 10:15:56 crc kubenswrapper[4962]: I0220 10:15:56.149436 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20663c25-09a7-4a31-9994-450f507d4ff1-config-data\") pod \"nova-cell0-conductor-db-sync-wbq67\" (UID: \"20663c25-09a7-4a31-9994-450f507d4ff1\") " pod="openstack/nova-cell0-conductor-db-sync-wbq67" Feb 20 10:15:56 crc kubenswrapper[4962]: I0220 10:15:56.149473 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20663c25-09a7-4a31-9994-450f507d4ff1-scripts\") pod \"nova-cell0-conductor-db-sync-wbq67\" (UID: 
\"20663c25-09a7-4a31-9994-450f507d4ff1\") " pod="openstack/nova-cell0-conductor-db-sync-wbq67" Feb 20 10:15:56 crc kubenswrapper[4962]: I0220 10:15:56.149558 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5kqs\" (UniqueName: \"kubernetes.io/projected/20663c25-09a7-4a31-9994-450f507d4ff1-kube-api-access-r5kqs\") pod \"nova-cell0-conductor-db-sync-wbq67\" (UID: \"20663c25-09a7-4a31-9994-450f507d4ff1\") " pod="openstack/nova-cell0-conductor-db-sync-wbq67" Feb 20 10:15:56 crc kubenswrapper[4962]: I0220 10:15:56.159196 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20663c25-09a7-4a31-9994-450f507d4ff1-scripts\") pod \"nova-cell0-conductor-db-sync-wbq67\" (UID: \"20663c25-09a7-4a31-9994-450f507d4ff1\") " pod="openstack/nova-cell0-conductor-db-sync-wbq67" Feb 20 10:15:56 crc kubenswrapper[4962]: I0220 10:15:56.159376 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20663c25-09a7-4a31-9994-450f507d4ff1-config-data\") pod \"nova-cell0-conductor-db-sync-wbq67\" (UID: \"20663c25-09a7-4a31-9994-450f507d4ff1\") " pod="openstack/nova-cell0-conductor-db-sync-wbq67" Feb 20 10:15:56 crc kubenswrapper[4962]: I0220 10:15:56.159936 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20663c25-09a7-4a31-9994-450f507d4ff1-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-wbq67\" (UID: \"20663c25-09a7-4a31-9994-450f507d4ff1\") " pod="openstack/nova-cell0-conductor-db-sync-wbq67" Feb 20 10:15:56 crc kubenswrapper[4962]: I0220 10:15:56.171583 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5kqs\" (UniqueName: \"kubernetes.io/projected/20663c25-09a7-4a31-9994-450f507d4ff1-kube-api-access-r5kqs\") pod \"nova-cell0-conductor-db-sync-wbq67\" (UID: \"20663c25-09a7-4a31-9994-450f507d4ff1\") " pod="openstack/nova-cell0-conductor-db-sync-wbq67" Feb 20 10:15:56 crc kubenswrapper[4962]: I0220 10:15:56.296754 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-wbq67" Feb 20 10:15:56 crc kubenswrapper[4962]: W0220 10:15:56.834039 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod20663c25_09a7_4a31_9994_450f507d4ff1.slice/crio-2b76e69cf6b9d992b74e2baa8543568fc64a79a10dac0dbf45cd8ccaa97392d8 WatchSource:0}: Error finding container 2b76e69cf6b9d992b74e2baa8543568fc64a79a10dac0dbf45cd8ccaa97392d8: Status 404 returned error can't find the container with id 2b76e69cf6b9d992b74e2baa8543568fc64a79a10dac0dbf45cd8ccaa97392d8 Feb 20 10:15:56 crc kubenswrapper[4962]: I0220 10:15:56.834306 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-wbq67"] Feb 20 10:15:57 crc kubenswrapper[4962]: I0220 10:15:57.043100 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-wbq67" event={"ID":"20663c25-09a7-4a31-9994-450f507d4ff1","Type":"ContainerStarted","Data":"2b76e69cf6b9d992b74e2baa8543568fc64a79a10dac0dbf45cd8ccaa97392d8"} Feb 20 10:16:02 crc kubenswrapper[4962]: I0220 10:16:02.188947 4962 generic.go:334] "Generic (PLEG): container finished" podID="2263355d-2fa1-4b5a-bfc2-9f362df5739d" containerID="28df8a32fe5a1bd334afa755bb83b0ac292979f42bd8a6975cdf978af2b8b6b7" exitCode=0 Feb 20 10:16:02 crc kubenswrapper[4962]: I0220 10:16:02.189732 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2263355d-2fa1-4b5a-bfc2-9f362df5739d","Type":"ContainerDied","Data":"28df8a32fe5a1bd334afa755bb83b0ac292979f42bd8a6975cdf978af2b8b6b7"} Feb 20 10:16:06 crc kubenswrapper[4962]: I0220 10:16:06.235912 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2263355d-2fa1-4b5a-bfc2-9f362df5739d","Type":"ContainerDied","Data":"a3529eb7b3b629ef3c3b91bbe1d433d262412a55d8bda6f23a58ec282b63369e"} Feb 20 10:16:06 crc kubenswrapper[4962]: I0220 10:16:06.237134 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a3529eb7b3b629ef3c3b91bbe1d433d262412a55d8bda6f23a58ec282b63369e" Feb 20 10:16:06 crc kubenswrapper[4962]: I0220 10:16:06.389491 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 20 10:16:06 crc kubenswrapper[4962]: I0220 10:16:06.503351 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7px4x\" (UniqueName: \"kubernetes.io/projected/2263355d-2fa1-4b5a-bfc2-9f362df5739d-kube-api-access-7px4x\") pod \"2263355d-2fa1-4b5a-bfc2-9f362df5739d\" (UID: \"2263355d-2fa1-4b5a-bfc2-9f362df5739d\") " Feb 20 10:16:06 crc kubenswrapper[4962]: I0220 10:16:06.503483 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2263355d-2fa1-4b5a-bfc2-9f362df5739d-run-httpd\") pod \"2263355d-2fa1-4b5a-bfc2-9f362df5739d\" (UID: \"2263355d-2fa1-4b5a-bfc2-9f362df5739d\") " Feb 20 10:16:06 crc kubenswrapper[4962]: I0220 10:16:06.503546 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2263355d-2fa1-4b5a-bfc2-9f362df5739d-config-data\") pod \"2263355d-2fa1-4b5a-bfc2-9f362df5739d\" (UID: \"2263355d-2fa1-4b5a-bfc2-9f362df5739d\") " Feb 20 10:16:06 crc kubenswrapper[4962]: I0220 10:16:06.503738 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2263355d-2fa1-4b5a-bfc2-9f362df5739d-log-httpd\") pod \"2263355d-2fa1-4b5a-bfc2-9f362df5739d\" (UID: \"2263355d-2fa1-4b5a-bfc2-9f362df5739d\") " Feb 20 10:16:06 crc kubenswrapper[4962]: I0220 10:16:06.503900 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2263355d-2fa1-4b5a-bfc2-9f362df5739d-scripts\") pod \"2263355d-2fa1-4b5a-bfc2-9f362df5739d\" (UID: \"2263355d-2fa1-4b5a-bfc2-9f362df5739d\") " Feb 20 10:16:06 crc kubenswrapper[4962]: I0220 10:16:06.503940 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2263355d-2fa1-4b5a-bfc2-9f362df5739d-sg-core-conf-yaml\") pod \"2263355d-2fa1-4b5a-bfc2-9f362df5739d\" (UID: \"2263355d-2fa1-4b5a-bfc2-9f362df5739d\") " Feb 20 10:16:06 crc kubenswrapper[4962]: I0220 10:16:06.503980 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2263355d-2fa1-4b5a-bfc2-9f362df5739d-combined-ca-bundle\") pod \"2263355d-2fa1-4b5a-bfc2-9f362df5739d\" (UID: \"2263355d-2fa1-4b5a-bfc2-9f362df5739d\") " Feb 20 10:16:06 crc kubenswrapper[4962]: I0220 10:16:06.505329 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2263355d-2fa1-4b5a-bfc2-9f362df5739d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "2263355d-2fa1-4b5a-bfc2-9f362df5739d" (UID: "2263355d-2fa1-4b5a-bfc2-9f362df5739d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:16:06 crc kubenswrapper[4962]: I0220 10:16:06.505634 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2263355d-2fa1-4b5a-bfc2-9f362df5739d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "2263355d-2fa1-4b5a-bfc2-9f362df5739d" (UID: "2263355d-2fa1-4b5a-bfc2-9f362df5739d"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:16:06 crc kubenswrapper[4962]: I0220 10:16:06.513357 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2263355d-2fa1-4b5a-bfc2-9f362df5739d-scripts" (OuterVolumeSpecName: "scripts") pod "2263355d-2fa1-4b5a-bfc2-9f362df5739d" (UID: "2263355d-2fa1-4b5a-bfc2-9f362df5739d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:16:06 crc kubenswrapper[4962]: I0220 10:16:06.515795 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2263355d-2fa1-4b5a-bfc2-9f362df5739d-kube-api-access-7px4x" (OuterVolumeSpecName: "kube-api-access-7px4x") pod "2263355d-2fa1-4b5a-bfc2-9f362df5739d" (UID: "2263355d-2fa1-4b5a-bfc2-9f362df5739d"). InnerVolumeSpecName "kube-api-access-7px4x". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:16:06 crc kubenswrapper[4962]: I0220 10:16:06.534131 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2263355d-2fa1-4b5a-bfc2-9f362df5739d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "2263355d-2fa1-4b5a-bfc2-9f362df5739d" (UID: "2263355d-2fa1-4b5a-bfc2-9f362df5739d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:16:06 crc kubenswrapper[4962]: I0220 10:16:06.588618 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2263355d-2fa1-4b5a-bfc2-9f362df5739d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2263355d-2fa1-4b5a-bfc2-9f362df5739d" (UID: "2263355d-2fa1-4b5a-bfc2-9f362df5739d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:16:06 crc kubenswrapper[4962]: I0220 10:16:06.606277 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2263355d-2fa1-4b5a-bfc2-9f362df5739d-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:06 crc kubenswrapper[4962]: I0220 10:16:06.606315 4962 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2263355d-2fa1-4b5a-bfc2-9f362df5739d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:06 crc kubenswrapper[4962]: I0220 10:16:06.606329 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2263355d-2fa1-4b5a-bfc2-9f362df5739d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:06 crc kubenswrapper[4962]: I0220 10:16:06.606340 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7px4x\" (UniqueName: \"kubernetes.io/projected/2263355d-2fa1-4b5a-bfc2-9f362df5739d-kube-api-access-7px4x\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:06 crc kubenswrapper[4962]: I0220 10:16:06.606355 4962 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2263355d-2fa1-4b5a-bfc2-9f362df5739d-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:06 crc kubenswrapper[4962]: I0220 10:16:06.606363 4962 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2263355d-2fa1-4b5a-bfc2-9f362df5739d-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:06 crc kubenswrapper[4962]: I0220 10:16:06.628005 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/secret/2263355d-2fa1-4b5a-bfc2-9f362df5739d-config-data" (OuterVolumeSpecName: "config-data") pod "2263355d-2fa1-4b5a-bfc2-9f362df5739d" (UID: "2263355d-2fa1-4b5a-bfc2-9f362df5739d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:16:06 crc kubenswrapper[4962]: I0220 10:16:06.708215 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2263355d-2fa1-4b5a-bfc2-9f362df5739d-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:07 crc kubenswrapper[4962]: I0220 10:16:07.258227 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-wbq67" event={"ID":"20663c25-09a7-4a31-9994-450f507d4ff1","Type":"ContainerStarted","Data":"3cc79122882da35c12762f52d1de73bf1a9ef430f240e775b44faf92fe147dab"} Feb 20 10:16:07 crc kubenswrapper[4962]: I0220 10:16:07.258313 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 20 10:16:07 crc kubenswrapper[4962]: I0220 10:16:07.290045 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-wbq67" podStartSLOduration=2.875283514 podStartE2EDuration="12.290023844s" podCreationTimestamp="2026-02-20 10:15:55 +0000 UTC" firstStartedPulling="2026-02-20 10:15:56.840373314 +0000 UTC m=+1248.422845160" lastFinishedPulling="2026-02-20 10:16:06.255113604 +0000 UTC m=+1257.837585490" observedRunningTime="2026-02-20 10:16:07.28046589 +0000 UTC m=+1258.862937746" watchObservedRunningTime="2026-02-20 10:16:07.290023844 +0000 UTC m=+1258.872495700" Feb 20 10:16:07 crc kubenswrapper[4962]: I0220 10:16:07.310670 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 20 10:16:07 crc kubenswrapper[4962]: I0220 10:16:07.317549 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 20 10:16:07 crc kubenswrapper[4962]: I0220 10:16:07.351268 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 20 10:16:07 crc kubenswrapper[4962]: E0220 10:16:07.351815 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2263355d-2fa1-4b5a-bfc2-9f362df5739d" containerName="ceilometer-notification-agent" Feb 20 10:16:07 crc kubenswrapper[4962]: I0220 10:16:07.351835 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2263355d-2fa1-4b5a-bfc2-9f362df5739d" containerName="ceilometer-notification-agent" Feb 20 10:16:07 crc kubenswrapper[4962]: E0220 10:16:07.351858 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2263355d-2fa1-4b5a-bfc2-9f362df5739d" containerName="ceilometer-central-agent" Feb 20 10:16:07 crc kubenswrapper[4962]: I0220 10:16:07.351867 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2263355d-2fa1-4b5a-bfc2-9f362df5739d" containerName="ceilometer-central-agent" Feb 20 10:16:07 crc kubenswrapper[4962]: E0220 10:16:07.351891 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2263355d-2fa1-4b5a-bfc2-9f362df5739d" containerName="sg-core" Feb 20 10:16:07 crc kubenswrapper[4962]: I0220 10:16:07.351897 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2263355d-2fa1-4b5a-bfc2-9f362df5739d" containerName="sg-core" Feb 20 10:16:07 crc kubenswrapper[4962]: E0220 10:16:07.351922 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2263355d-2fa1-4b5a-bfc2-9f362df5739d" containerName="proxy-httpd" Feb 20 
10:16:07 crc kubenswrapper[4962]: I0220 10:16:07.351928 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2263355d-2fa1-4b5a-bfc2-9f362df5739d" containerName="proxy-httpd" Feb 20 10:16:07 crc kubenswrapper[4962]: I0220 10:16:07.352147 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="2263355d-2fa1-4b5a-bfc2-9f362df5739d" containerName="proxy-httpd" Feb 20 10:16:07 crc kubenswrapper[4962]: I0220 10:16:07.352169 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="2263355d-2fa1-4b5a-bfc2-9f362df5739d" containerName="ceilometer-central-agent" Feb 20 10:16:07 crc kubenswrapper[4962]: I0220 10:16:07.352186 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="2263355d-2fa1-4b5a-bfc2-9f362df5739d" containerName="ceilometer-notification-agent" Feb 20 10:16:07 crc kubenswrapper[4962]: I0220 10:16:07.352198 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="2263355d-2fa1-4b5a-bfc2-9f362df5739d" containerName="sg-core" Feb 20 10:16:07 crc kubenswrapper[4962]: I0220 10:16:07.355767 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 20 10:16:07 crc kubenswrapper[4962]: I0220 10:16:07.360712 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 20 10:16:07 crc kubenswrapper[4962]: I0220 10:16:07.360776 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 20 10:16:07 crc kubenswrapper[4962]: I0220 10:16:07.382801 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 20 10:16:07 crc kubenswrapper[4962]: I0220 10:16:07.532089 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/457c772c-a7b8-40ea-8573-c483915687be-config-data\") pod \"ceilometer-0\" (UID: \"457c772c-a7b8-40ea-8573-c483915687be\") " pod="openstack/ceilometer-0" Feb 20 10:16:07 crc kubenswrapper[4962]: I0220 10:16:07.532195 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pg46p\" (UniqueName: \"kubernetes.io/projected/457c772c-a7b8-40ea-8573-c483915687be-kube-api-access-pg46p\") pod \"ceilometer-0\" (UID: \"457c772c-a7b8-40ea-8573-c483915687be\") " pod="openstack/ceilometer-0" Feb 20 10:16:07 crc kubenswrapper[4962]: I0220 10:16:07.532292 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/457c772c-a7b8-40ea-8573-c483915687be-scripts\") pod \"ceilometer-0\" (UID: \"457c772c-a7b8-40ea-8573-c483915687be\") " pod="openstack/ceilometer-0" Feb 20 10:16:07 crc kubenswrapper[4962]: I0220 10:16:07.532372 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/457c772c-a7b8-40ea-8573-c483915687be-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"457c772c-a7b8-40ea-8573-c483915687be\") " pod="openstack/ceilometer-0" Feb 20 10:16:07 crc kubenswrapper[4962]: I0220 10:16:07.532454 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/457c772c-a7b8-40ea-8573-c483915687be-log-httpd\") pod \"ceilometer-0\" (UID: \"457c772c-a7b8-40ea-8573-c483915687be\") " pod="openstack/ceilometer-0" Feb 20 10:16:07 crc kubenswrapper[4962]: 
I0220 10:16:07.532499 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/457c772c-a7b8-40ea-8573-c483915687be-run-httpd\") pod \"ceilometer-0\" (UID: \"457c772c-a7b8-40ea-8573-c483915687be\") " pod="openstack/ceilometer-0" Feb 20 10:16:07 crc kubenswrapper[4962]: I0220 10:16:07.532542 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/457c772c-a7b8-40ea-8573-c483915687be-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"457c772c-a7b8-40ea-8573-c483915687be\") " pod="openstack/ceilometer-0" Feb 20 10:16:07 crc kubenswrapper[4962]: I0220 10:16:07.634744 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/457c772c-a7b8-40ea-8573-c483915687be-config-data\") pod \"ceilometer-0\" (UID: \"457c772c-a7b8-40ea-8573-c483915687be\") " pod="openstack/ceilometer-0" Feb 20 10:16:07 crc kubenswrapper[4962]: I0220 10:16:07.634845 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pg46p\" (UniqueName: \"kubernetes.io/projected/457c772c-a7b8-40ea-8573-c483915687be-kube-api-access-pg46p\") pod \"ceilometer-0\" (UID: \"457c772c-a7b8-40ea-8573-c483915687be\") " pod="openstack/ceilometer-0" Feb 20 10:16:07 crc kubenswrapper[4962]: I0220 10:16:07.634934 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/457c772c-a7b8-40ea-8573-c483915687be-scripts\") pod \"ceilometer-0\" (UID: \"457c772c-a7b8-40ea-8573-c483915687be\") " pod="openstack/ceilometer-0" Feb 20 10:16:07 crc kubenswrapper[4962]: I0220 10:16:07.634989 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/457c772c-a7b8-40ea-8573-c483915687be-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"457c772c-a7b8-40ea-8573-c483915687be\") " pod="openstack/ceilometer-0" Feb 20 10:16:07 crc kubenswrapper[4962]: I0220 10:16:07.635206 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/457c772c-a7b8-40ea-8573-c483915687be-log-httpd\") pod \"ceilometer-0\" (UID: \"457c772c-a7b8-40ea-8573-c483915687be\") " pod="openstack/ceilometer-0" Feb 20 10:16:07 crc kubenswrapper[4962]: I0220 10:16:07.635269 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/457c772c-a7b8-40ea-8573-c483915687be-run-httpd\") pod \"ceilometer-0\" (UID: \"457c772c-a7b8-40ea-8573-c483915687be\") " pod="openstack/ceilometer-0" Feb 20 10:16:07 crc kubenswrapper[4962]: I0220 10:16:07.635314 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/457c772c-a7b8-40ea-8573-c483915687be-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"457c772c-a7b8-40ea-8573-c483915687be\") " pod="openstack/ceilometer-0" Feb 20 10:16:07 crc kubenswrapper[4962]: I0220 10:16:07.636332 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/457c772c-a7b8-40ea-8573-c483915687be-log-httpd\") pod \"ceilometer-0\" (UID: \"457c772c-a7b8-40ea-8573-c483915687be\") " pod="openstack/ceilometer-0" Feb 20 10:16:07 crc 
kubenswrapper[4962]: I0220 10:16:07.636481 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/457c772c-a7b8-40ea-8573-c483915687be-run-httpd\") pod \"ceilometer-0\" (UID: \"457c772c-a7b8-40ea-8573-c483915687be\") " pod="openstack/ceilometer-0" Feb 20 10:16:07 crc kubenswrapper[4962]: I0220 10:16:07.645621 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/457c772c-a7b8-40ea-8573-c483915687be-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"457c772c-a7b8-40ea-8573-c483915687be\") " pod="openstack/ceilometer-0" Feb 20 10:16:07 crc kubenswrapper[4962]: I0220 10:16:07.646263 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/457c772c-a7b8-40ea-8573-c483915687be-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"457c772c-a7b8-40ea-8573-c483915687be\") " pod="openstack/ceilometer-0" Feb 20 10:16:07 crc kubenswrapper[4962]: I0220 10:16:07.657401 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/457c772c-a7b8-40ea-8573-c483915687be-scripts\") pod \"ceilometer-0\" (UID: \"457c772c-a7b8-40ea-8573-c483915687be\") " pod="openstack/ceilometer-0" Feb 20 10:16:07 crc kubenswrapper[4962]: I0220 10:16:07.659775 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/457c772c-a7b8-40ea-8573-c483915687be-config-data\") pod \"ceilometer-0\" (UID: \"457c772c-a7b8-40ea-8573-c483915687be\") " pod="openstack/ceilometer-0" Feb 20 10:16:07 crc kubenswrapper[4962]: I0220 10:16:07.662759 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pg46p\" (UniqueName: \"kubernetes.io/projected/457c772c-a7b8-40ea-8573-c483915687be-kube-api-access-pg46p\") pod \"ceilometer-0\" (UID: \"457c772c-a7b8-40ea-8573-c483915687be\") " pod="openstack/ceilometer-0" Feb 20 10:16:07 crc kubenswrapper[4962]: I0220 10:16:07.675645 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 20 10:16:08 crc kubenswrapper[4962]: W0220 10:16:08.263900 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod457c772c_a7b8_40ea_8573_c483915687be.slice/crio-beaabcfa9360010688b1a11e6bc4e4b1e737fa90ac38511bb0966d375d500981 WatchSource:0}: Error finding container beaabcfa9360010688b1a11e6bc4e4b1e737fa90ac38511bb0966d375d500981: Status 404 returned error can't find the container with id beaabcfa9360010688b1a11e6bc4e4b1e737fa90ac38511bb0966d375d500981 Feb 20 10:16:08 crc kubenswrapper[4962]: I0220 10:16:08.270006 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 20 10:16:09 crc kubenswrapper[4962]: I0220 10:16:09.155288 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2263355d-2fa1-4b5a-bfc2-9f362df5739d" path="/var/lib/kubelet/pods/2263355d-2fa1-4b5a-bfc2-9f362df5739d/volumes" Feb 20 10:16:09 crc kubenswrapper[4962]: I0220 10:16:09.347337 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"457c772c-a7b8-40ea-8573-c483915687be","Type":"ContainerStarted","Data":"3a1c7b6152c78c256920cd6dc450ddbd613db243bb2b91fc78e1eaf94400c2d6"} Feb 20 10:16:09 crc kubenswrapper[4962]: I0220 10:16:09.349114 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"457c772c-a7b8-40ea-8573-c483915687be","Type":"ContainerStarted","Data":"beaabcfa9360010688b1a11e6bc4e4b1e737fa90ac38511bb0966d375d500981"} Feb 20 10:16:10 crc kubenswrapper[4962]: I0220 10:16:10.365082 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"457c772c-a7b8-40ea-8573-c483915687be","Type":"ContainerStarted","Data":"97e11a6c5e404ce1f4a50771ae7056ddfe4362a2679a8f06158ae96d84b4a250"} Feb 20 10:16:11 crc kubenswrapper[4962]: I0220 10:16:11.381497 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"457c772c-a7b8-40ea-8573-c483915687be","Type":"ContainerStarted","Data":"e237fa9e8086988f9e78412b6b078a289e696808959ab9d107880a825fcf14c5"} Feb 20 10:16:11 crc kubenswrapper[4962]: I0220 10:16:11.508195 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 10:16:11 crc kubenswrapper[4962]: I0220 10:16:11.508642 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 10:16:12 crc kubenswrapper[4962]: I0220 10:16:12.396179 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"457c772c-a7b8-40ea-8573-c483915687be","Type":"ContainerStarted","Data":"93b1b3489e3c8062e10a81096649fbd732605d1f78d868a69f146c92e37a74fa"} Feb 20 10:16:12 crc kubenswrapper[4962]: I0220 10:16:12.396838 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 20 10:16:12 crc kubenswrapper[4962]: I0220 10:16:12.439420 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ceilometer-0" podStartSLOduration=2.135795244 podStartE2EDuration="5.439395976s" podCreationTimestamp="2026-02-20 10:16:07 +0000 UTC" firstStartedPulling="2026-02-20 10:16:08.267859256 +0000 UTC m=+1259.850331142" lastFinishedPulling="2026-02-20 10:16:11.571460018 +0000 UTC m=+1263.153931874" observedRunningTime="2026-02-20 10:16:12.434568967 +0000 UTC m=+1264.017040823" watchObservedRunningTime="2026-02-20 10:16:12.439395976 +0000 UTC m=+1264.021867832" Feb 20 10:16:18 crc kubenswrapper[4962]: I0220 10:16:18.468326 4962 generic.go:334] "Generic (PLEG): container finished" podID="20663c25-09a7-4a31-9994-450f507d4ff1" containerID="3cc79122882da35c12762f52d1de73bf1a9ef430f240e775b44faf92fe147dab" exitCode=0 Feb 20 10:16:18 crc kubenswrapper[4962]: I0220 10:16:18.468411 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-wbq67" event={"ID":"20663c25-09a7-4a31-9994-450f507d4ff1","Type":"ContainerDied","Data":"3cc79122882da35c12762f52d1de73bf1a9ef430f240e775b44faf92fe147dab"} Feb 20 10:16:19 crc kubenswrapper[4962]: I0220 10:16:19.931343 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-wbq67" Feb 20 10:16:20 crc kubenswrapper[4962]: I0220 10:16:20.081864 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20663c25-09a7-4a31-9994-450f507d4ff1-config-data\") pod \"20663c25-09a7-4a31-9994-450f507d4ff1\" (UID: \"20663c25-09a7-4a31-9994-450f507d4ff1\") " Feb 20 10:16:20 crc kubenswrapper[4962]: I0220 10:16:20.081969 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20663c25-09a7-4a31-9994-450f507d4ff1-combined-ca-bundle\") pod \"20663c25-09a7-4a31-9994-450f507d4ff1\" (UID: \"20663c25-09a7-4a31-9994-450f507d4ff1\") " Feb 20 10:16:20 crc kubenswrapper[4962]: I0220 10:16:20.082227 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r5kqs\" (UniqueName: \"kubernetes.io/projected/20663c25-09a7-4a31-9994-450f507d4ff1-kube-api-access-r5kqs\") pod \"20663c25-09a7-4a31-9994-450f507d4ff1\" (UID: \"20663c25-09a7-4a31-9994-450f507d4ff1\") " Feb 20 10:16:20 crc kubenswrapper[4962]: I0220 10:16:20.082302 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20663c25-09a7-4a31-9994-450f507d4ff1-scripts\") pod \"20663c25-09a7-4a31-9994-450f507d4ff1\" (UID: \"20663c25-09a7-4a31-9994-450f507d4ff1\") " Feb 20 10:16:20 crc kubenswrapper[4962]: I0220 10:16:20.095202 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20663c25-09a7-4a31-9994-450f507d4ff1-scripts" (OuterVolumeSpecName: "scripts") pod "20663c25-09a7-4a31-9994-450f507d4ff1" (UID: "20663c25-09a7-4a31-9994-450f507d4ff1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:16:20 crc kubenswrapper[4962]: I0220 10:16:20.095408 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20663c25-09a7-4a31-9994-450f507d4ff1-kube-api-access-r5kqs" (OuterVolumeSpecName: "kube-api-access-r5kqs") pod "20663c25-09a7-4a31-9994-450f507d4ff1" (UID: "20663c25-09a7-4a31-9994-450f507d4ff1"). InnerVolumeSpecName "kube-api-access-r5kqs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:16:20 crc kubenswrapper[4962]: I0220 10:16:20.119308 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20663c25-09a7-4a31-9994-450f507d4ff1-config-data" (OuterVolumeSpecName: "config-data") pod "20663c25-09a7-4a31-9994-450f507d4ff1" (UID: "20663c25-09a7-4a31-9994-450f507d4ff1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:16:20 crc kubenswrapper[4962]: I0220 10:16:20.139776 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20663c25-09a7-4a31-9994-450f507d4ff1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "20663c25-09a7-4a31-9994-450f507d4ff1" (UID: "20663c25-09a7-4a31-9994-450f507d4ff1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:16:20 crc kubenswrapper[4962]: I0220 10:16:20.185797 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20663c25-09a7-4a31-9994-450f507d4ff1-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:20 crc kubenswrapper[4962]: I0220 10:16:20.185854 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20663c25-09a7-4a31-9994-450f507d4ff1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:20 crc kubenswrapper[4962]: I0220 10:16:20.185879 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r5kqs\" (UniqueName: \"kubernetes.io/projected/20663c25-09a7-4a31-9994-450f507d4ff1-kube-api-access-r5kqs\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:20 crc kubenswrapper[4962]: I0220 10:16:20.185901 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20663c25-09a7-4a31-9994-450f507d4ff1-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:20 crc kubenswrapper[4962]: I0220 10:16:20.496129 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-wbq67" event={"ID":"20663c25-09a7-4a31-9994-450f507d4ff1","Type":"ContainerDied","Data":"2b76e69cf6b9d992b74e2baa8543568fc64a79a10dac0dbf45cd8ccaa97392d8"} Feb 20 10:16:20 crc kubenswrapper[4962]: I0220 10:16:20.496184 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b76e69cf6b9d992b74e2baa8543568fc64a79a10dac0dbf45cd8ccaa97392d8" Feb 20 10:16:20 crc kubenswrapper[4962]: I0220 10:16:20.496261 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-wbq67" Feb 20 10:16:20 crc kubenswrapper[4962]: I0220 10:16:20.716988 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 20 10:16:20 crc kubenswrapper[4962]: E0220 10:16:20.717732 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20663c25-09a7-4a31-9994-450f507d4ff1" containerName="nova-cell0-conductor-db-sync" Feb 20 10:16:20 crc kubenswrapper[4962]: I0220 10:16:20.717752 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="20663c25-09a7-4a31-9994-450f507d4ff1" containerName="nova-cell0-conductor-db-sync" Feb 20 10:16:20 crc kubenswrapper[4962]: I0220 10:16:20.717987 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="20663c25-09a7-4a31-9994-450f507d4ff1" containerName="nova-cell0-conductor-db-sync" Feb 20 10:16:20 crc kubenswrapper[4962]: I0220 10:16:20.718878 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 20 10:16:20 crc kubenswrapper[4962]: I0220 10:16:20.727675 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 20 10:16:20 crc kubenswrapper[4962]: I0220 10:16:20.728056 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-mb5nf" Feb 20 10:16:20 crc kubenswrapper[4962]: I0220 10:16:20.734171 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 20 10:16:20 crc kubenswrapper[4962]: I0220 10:16:20.901527 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/815f0ef8-a30a-4467-bb56-ff8499a4be44-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"815f0ef8-a30a-4467-bb56-ff8499a4be44\") " pod="openstack/nova-cell0-conductor-0" Feb 20 10:16:20 crc kubenswrapper[4962]: I0220 10:16:20.901706 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hql7h\" (UniqueName: \"kubernetes.io/projected/815f0ef8-a30a-4467-bb56-ff8499a4be44-kube-api-access-hql7h\") pod \"nova-cell0-conductor-0\" (UID: \"815f0ef8-a30a-4467-bb56-ff8499a4be44\") " pod="openstack/nova-cell0-conductor-0" Feb 20 10:16:20 crc kubenswrapper[4962]: I0220 10:16:20.903158 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/815f0ef8-a30a-4467-bb56-ff8499a4be44-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"815f0ef8-a30a-4467-bb56-ff8499a4be44\") " pod="openstack/nova-cell0-conductor-0" Feb 20 10:16:21 crc kubenswrapper[4962]: I0220 10:16:21.006296 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/815f0ef8-a30a-4467-bb56-ff8499a4be44-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"815f0ef8-a30a-4467-bb56-ff8499a4be44\") " pod="openstack/nova-cell0-conductor-0" Feb 20 10:16:21 crc kubenswrapper[4962]: I0220 10:16:21.006947 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hql7h\" (UniqueName: \"kubernetes.io/projected/815f0ef8-a30a-4467-bb56-ff8499a4be44-kube-api-access-hql7h\") pod \"nova-cell0-conductor-0\" (UID: \"815f0ef8-a30a-4467-bb56-ff8499a4be44\") " pod="openstack/nova-cell0-conductor-0" Feb 20 10:16:21 crc 
kubenswrapper[4962]: I0220 10:16:21.007080 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/815f0ef8-a30a-4467-bb56-ff8499a4be44-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"815f0ef8-a30a-4467-bb56-ff8499a4be44\") " pod="openstack/nova-cell0-conductor-0" Feb 20 10:16:21 crc kubenswrapper[4962]: I0220 10:16:21.014641 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/815f0ef8-a30a-4467-bb56-ff8499a4be44-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"815f0ef8-a30a-4467-bb56-ff8499a4be44\") " pod="openstack/nova-cell0-conductor-0" Feb 20 10:16:21 crc kubenswrapper[4962]: I0220 10:16:21.025007 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/815f0ef8-a30a-4467-bb56-ff8499a4be44-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"815f0ef8-a30a-4467-bb56-ff8499a4be44\") " pod="openstack/nova-cell0-conductor-0" Feb 20 10:16:21 crc kubenswrapper[4962]: I0220 10:16:21.039061 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hql7h\" (UniqueName: \"kubernetes.io/projected/815f0ef8-a30a-4467-bb56-ff8499a4be44-kube-api-access-hql7h\") pod \"nova-cell0-conductor-0\" (UID: \"815f0ef8-a30a-4467-bb56-ff8499a4be44\") " pod="openstack/nova-cell0-conductor-0" Feb 20 10:16:21 crc kubenswrapper[4962]: I0220 10:16:21.061821 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 20 10:16:21 crc kubenswrapper[4962]: I0220 10:16:21.608002 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 20 10:16:22 crc kubenswrapper[4962]: I0220 10:16:22.529022 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"815f0ef8-a30a-4467-bb56-ff8499a4be44","Type":"ContainerStarted","Data":"5986cb792b03a6e15f31fe7f4e91ccaa3ff2a4c360820798809c00e91587dc69"} Feb 20 10:16:22 crc kubenswrapper[4962]: I0220 10:16:22.529617 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"815f0ef8-a30a-4467-bb56-ff8499a4be44","Type":"ContainerStarted","Data":"5925bb54309b7a0a7036656c54ac3f8deef63680ce4f7825beb5965502489453"} Feb 20 10:16:22 crc kubenswrapper[4962]: I0220 10:16:22.530141 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Feb 20 10:16:22 crc kubenswrapper[4962]: I0220 10:16:22.560211 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.5601851289999997 podStartE2EDuration="2.560185129s" podCreationTimestamp="2026-02-20 10:16:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:16:22.552012868 +0000 UTC m=+1274.134484724" watchObservedRunningTime="2026-02-20 10:16:22.560185129 +0000 UTC m=+1274.142656975" Feb 20 10:16:26 crc kubenswrapper[4962]: I0220 10:16:26.098309 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Feb 20 10:16:26 crc kubenswrapper[4962]: I0220 10:16:26.788684 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-kjq4f"] Feb 20 10:16:26 crc kubenswrapper[4962]: I0220 
10:16:26.791068 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-kjq4f" Feb 20 10:16:26 crc kubenswrapper[4962]: I0220 10:16:26.796009 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Feb 20 10:16:26 crc kubenswrapper[4962]: I0220 10:16:26.797935 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Feb 20 10:16:26 crc kubenswrapper[4962]: I0220 10:16:26.811319 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-kjq4f"] Feb 20 10:16:26 crc kubenswrapper[4962]: I0220 10:16:26.889903 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28bfacb3-7247-41ad-bf30-47c81427487b-scripts\") pod \"nova-cell0-cell-mapping-kjq4f\" (UID: \"28bfacb3-7247-41ad-bf30-47c81427487b\") " pod="openstack/nova-cell0-cell-mapping-kjq4f" Feb 20 10:16:26 crc kubenswrapper[4962]: I0220 10:16:26.889980 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28bfacb3-7247-41ad-bf30-47c81427487b-config-data\") pod \"nova-cell0-cell-mapping-kjq4f\" (UID: \"28bfacb3-7247-41ad-bf30-47c81427487b\") " pod="openstack/nova-cell0-cell-mapping-kjq4f" Feb 20 10:16:26 crc kubenswrapper[4962]: I0220 10:16:26.890038 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28bfacb3-7247-41ad-bf30-47c81427487b-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-kjq4f\" (UID: \"28bfacb3-7247-41ad-bf30-47c81427487b\") " pod="openstack/nova-cell0-cell-mapping-kjq4f" Feb 20 10:16:26 crc kubenswrapper[4962]: I0220 10:16:26.890091 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxmdn\" (UniqueName: \"kubernetes.io/projected/28bfacb3-7247-41ad-bf30-47c81427487b-kube-api-access-fxmdn\") pod \"nova-cell0-cell-mapping-kjq4f\" (UID: \"28bfacb3-7247-41ad-bf30-47c81427487b\") " pod="openstack/nova-cell0-cell-mapping-kjq4f" Feb 20 10:16:26 crc kubenswrapper[4962]: I0220 10:16:26.988924 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 20 10:16:26 crc kubenswrapper[4962]: I0220 10:16:26.990347 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 20 10:16:26 crc kubenswrapper[4962]: I0220 10:16:26.992246 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28bfacb3-7247-41ad-bf30-47c81427487b-config-data\") pod \"nova-cell0-cell-mapping-kjq4f\" (UID: \"28bfacb3-7247-41ad-bf30-47c81427487b\") " pod="openstack/nova-cell0-cell-mapping-kjq4f" Feb 20 10:16:26 crc kubenswrapper[4962]: I0220 10:16:26.992308 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28bfacb3-7247-41ad-bf30-47c81427487b-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-kjq4f\" (UID: \"28bfacb3-7247-41ad-bf30-47c81427487b\") " pod="openstack/nova-cell0-cell-mapping-kjq4f" Feb 20 10:16:26 crc kubenswrapper[4962]: I0220 10:16:26.992354 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxmdn\" (UniqueName: \"kubernetes.io/projected/28bfacb3-7247-41ad-bf30-47c81427487b-kube-api-access-fxmdn\") pod \"nova-cell0-cell-mapping-kjq4f\" (UID: \"28bfacb3-7247-41ad-bf30-47c81427487b\") " pod="openstack/nova-cell0-cell-mapping-kjq4f" Feb 20 10:16:26 crc kubenswrapper[4962]: I0220 10:16:26.992435 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28bfacb3-7247-41ad-bf30-47c81427487b-scripts\") pod \"nova-cell0-cell-mapping-kjq4f\" (UID: \"28bfacb3-7247-41ad-bf30-47c81427487b\") " pod="openstack/nova-cell0-cell-mapping-kjq4f" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.010096 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.011321 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28bfacb3-7247-41ad-bf30-47c81427487b-config-data\") pod \"nova-cell0-cell-mapping-kjq4f\" (UID: \"28bfacb3-7247-41ad-bf30-47c81427487b\") " pod="openstack/nova-cell0-cell-mapping-kjq4f" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.013557 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28bfacb3-7247-41ad-bf30-47c81427487b-scripts\") pod \"nova-cell0-cell-mapping-kjq4f\" (UID: \"28bfacb3-7247-41ad-bf30-47c81427487b\") " pod="openstack/nova-cell0-cell-mapping-kjq4f" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.015230 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28bfacb3-7247-41ad-bf30-47c81427487b-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-kjq4f\" (UID: \"28bfacb3-7247-41ad-bf30-47c81427487b\") " pod="openstack/nova-cell0-cell-mapping-kjq4f" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.025674 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.034625 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxmdn\" (UniqueName: \"kubernetes.io/projected/28bfacb3-7247-41ad-bf30-47c81427487b-kube-api-access-fxmdn\") pod \"nova-cell0-cell-mapping-kjq4f\" (UID: \"28bfacb3-7247-41ad-bf30-47c81427487b\") " pod="openstack/nova-cell0-cell-mapping-kjq4f" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.079301 4962 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.080700 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.090441 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.100381 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.106876 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d60578e-e3d0-4ae9-8539-9dfd84ebf836-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"1d60578e-e3d0-4ae9-8539-9dfd84ebf836\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.107180 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d60578e-e3d0-4ae9-8539-9dfd84ebf836-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"1d60578e-e3d0-4ae9-8539-9dfd84ebf836\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.107355 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8chs7\" (UniqueName: \"kubernetes.io/projected/1d60578e-e3d0-4ae9-8539-9dfd84ebf836-kube-api-access-8chs7\") pod \"nova-cell1-novncproxy-0\" (UID: \"1d60578e-e3d0-4ae9-8539-9dfd84ebf836\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.123024 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-kjq4f" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.187162 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.189142 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.208964 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.225420 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.244378 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d60578e-e3d0-4ae9-8539-9dfd84ebf836-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"1d60578e-e3d0-4ae9-8539-9dfd84ebf836\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.244499 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea\") " pod="openstack/nova-scheduler-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.244530 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab0c66b4-1ce3-4594-8780-2effddad7043-config-data\") pod \"nova-metadata-0\" (UID: \"ab0c66b4-1ce3-4594-8780-2effddad7043\") " pod="openstack/nova-metadata-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.244583 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d60578e-e3d0-4ae9-8539-9dfd84ebf836-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"1d60578e-e3d0-4ae9-8539-9dfd84ebf836\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.244628 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s26dw\" (UniqueName: \"kubernetes.io/projected/f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea-kube-api-access-s26dw\") pod \"nova-scheduler-0\" (UID: \"f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea\") " pod="openstack/nova-scheduler-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.244718 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8chs7\" (UniqueName: \"kubernetes.io/projected/1d60578e-e3d0-4ae9-8539-9dfd84ebf836-kube-api-access-8chs7\") pod \"nova-cell1-novncproxy-0\" (UID: \"1d60578e-e3d0-4ae9-8539-9dfd84ebf836\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.244762 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab0c66b4-1ce3-4594-8780-2effddad7043-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ab0c66b4-1ce3-4594-8780-2effddad7043\") " pod="openstack/nova-metadata-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.244818 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea-config-data\") pod \"nova-scheduler-0\" (UID: \"f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea\") " pod="openstack/nova-scheduler-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.244865 4962 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab0c66b4-1ce3-4594-8780-2effddad7043-logs\") pod \"nova-metadata-0\" (UID: \"ab0c66b4-1ce3-4594-8780-2effddad7043\") " pod="openstack/nova-metadata-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.244903 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjcbz\" (UniqueName: \"kubernetes.io/projected/ab0c66b4-1ce3-4594-8780-2effddad7043-kube-api-access-jjcbz\") pod \"nova-metadata-0\" (UID: \"ab0c66b4-1ce3-4594-8780-2effddad7043\") " pod="openstack/nova-metadata-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.252531 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d60578e-e3d0-4ae9-8539-9dfd84ebf836-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"1d60578e-e3d0-4ae9-8539-9dfd84ebf836\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.257481 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.259461 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d60578e-e3d0-4ae9-8539-9dfd84ebf836-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"1d60578e-e3d0-4ae9-8539-9dfd84ebf836\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.259553 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.274103 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.299161 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8chs7\" (UniqueName: \"kubernetes.io/projected/1d60578e-e3d0-4ae9-8539-9dfd84ebf836-kube-api-access-8chs7\") pod \"nova-cell1-novncproxy-0\" (UID: \"1d60578e-e3d0-4ae9-8539-9dfd84ebf836\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.342864 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.349143 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab0c66b4-1ce3-4594-8780-2effddad7043-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ab0c66b4-1ce3-4594-8780-2effddad7043\") " pod="openstack/nova-metadata-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.349221 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea-config-data\") pod \"nova-scheduler-0\" (UID: \"f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea\") " pod="openstack/nova-scheduler-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.349257 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab0c66b4-1ce3-4594-8780-2effddad7043-logs\") pod \"nova-metadata-0\" (UID: \"ab0c66b4-1ce3-4594-8780-2effddad7043\") " pod="openstack/nova-metadata-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.349291 4962 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjcbz\" (UniqueName: \"kubernetes.io/projected/ab0c66b4-1ce3-4594-8780-2effddad7043-kube-api-access-jjcbz\") pod \"nova-metadata-0\" (UID: \"ab0c66b4-1ce3-4594-8780-2effddad7043\") " pod="openstack/nova-metadata-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.349355 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea\") " pod="openstack/nova-scheduler-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.349375 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab0c66b4-1ce3-4594-8780-2effddad7043-config-data\") pod \"nova-metadata-0\" (UID: \"ab0c66b4-1ce3-4594-8780-2effddad7043\") " pod="openstack/nova-metadata-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.349405 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s26dw\" (UniqueName: \"kubernetes.io/projected/f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea-kube-api-access-s26dw\") pod \"nova-scheduler-0\" (UID: \"f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea\") " pod="openstack/nova-scheduler-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.370132 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab0c66b4-1ce3-4594-8780-2effddad7043-logs\") pod \"nova-metadata-0\" (UID: \"ab0c66b4-1ce3-4594-8780-2effddad7043\") " pod="openstack/nova-metadata-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.378102 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab0c66b4-1ce3-4594-8780-2effddad7043-config-data\") pod \"nova-metadata-0\" (UID: \"ab0c66b4-1ce3-4594-8780-2effddad7043\") " pod="openstack/nova-metadata-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.378155 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-849fff7679-6w4jk"] Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.378288 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea\") " pod="openstack/nova-scheduler-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.380043 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-849fff7679-6w4jk" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.384653 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab0c66b4-1ce3-4594-8780-2effddad7043-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ab0c66b4-1ce3-4594-8780-2effddad7043\") " pod="openstack/nova-metadata-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.384837 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s26dw\" (UniqueName: \"kubernetes.io/projected/f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea-kube-api-access-s26dw\") pod \"nova-scheduler-0\" (UID: \"f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea\") " pod="openstack/nova-scheduler-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.395248 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjcbz\" (UniqueName: \"kubernetes.io/projected/ab0c66b4-1ce3-4594-8780-2effddad7043-kube-api-access-jjcbz\") pod \"nova-metadata-0\" (UID: \"ab0c66b4-1ce3-4594-8780-2effddad7043\") " pod="openstack/nova-metadata-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.399355 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea-config-data\") pod \"nova-scheduler-0\" (UID: \"f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea\") " pod="openstack/nova-scheduler-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.419289 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.423902 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.435738 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-849fff7679-6w4jk"] Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.472778 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnccj\" (UniqueName: \"kubernetes.io/projected/619a1578-177c-476f-a471-e39ec43ebf20-kube-api-access-fnccj\") pod \"dnsmasq-dns-849fff7679-6w4jk\" (UID: \"619a1578-177c-476f-a471-e39ec43ebf20\") " pod="openstack/dnsmasq-dns-849fff7679-6w4jk" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.472941 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/619a1578-177c-476f-a471-e39ec43ebf20-dns-svc\") pod \"dnsmasq-dns-849fff7679-6w4jk\" (UID: \"619a1578-177c-476f-a471-e39ec43ebf20\") " pod="openstack/dnsmasq-dns-849fff7679-6w4jk" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.472967 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/619a1578-177c-476f-a471-e39ec43ebf20-config\") pod \"dnsmasq-dns-849fff7679-6w4jk\" (UID: \"619a1578-177c-476f-a471-e39ec43ebf20\") " pod="openstack/dnsmasq-dns-849fff7679-6w4jk" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.472992 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/619a1578-177c-476f-a471-e39ec43ebf20-dns-swift-storage-0\") pod \"dnsmasq-dns-849fff7679-6w4jk\" (UID: \"619a1578-177c-476f-a471-e39ec43ebf20\") " pod="openstack/dnsmasq-dns-849fff7679-6w4jk" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.473080 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5989ba7a-f1ca-4a25-a94a-3fea17f16eca-logs\") pod \"nova-api-0\" (UID: \"5989ba7a-f1ca-4a25-a94a-3fea17f16eca\") " pod="openstack/nova-api-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.473100 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5989ba7a-f1ca-4a25-a94a-3fea17f16eca-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5989ba7a-f1ca-4a25-a94a-3fea17f16eca\") " pod="openstack/nova-api-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.473164 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/619a1578-177c-476f-a471-e39ec43ebf20-ovsdbserver-sb\") pod \"dnsmasq-dns-849fff7679-6w4jk\" (UID: \"619a1578-177c-476f-a471-e39ec43ebf20\") " pod="openstack/dnsmasq-dns-849fff7679-6w4jk" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.473201 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/619a1578-177c-476f-a471-e39ec43ebf20-ovsdbserver-nb\") pod \"dnsmasq-dns-849fff7679-6w4jk\" (UID: \"619a1578-177c-476f-a471-e39ec43ebf20\") " pod="openstack/dnsmasq-dns-849fff7679-6w4jk" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.473222 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5989ba7a-f1ca-4a25-a94a-3fea17f16eca-config-data\") pod \"nova-api-0\" (UID: \"5989ba7a-f1ca-4a25-a94a-3fea17f16eca\") " pod="openstack/nova-api-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.473266 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckgt6\" (UniqueName: \"kubernetes.io/projected/5989ba7a-f1ca-4a25-a94a-3fea17f16eca-kube-api-access-ckgt6\") pod \"nova-api-0\" (UID: \"5989ba7a-f1ca-4a25-a94a-3fea17f16eca\") " pod="openstack/nova-api-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.576196 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/619a1578-177c-476f-a471-e39ec43ebf20-dns-svc\") pod \"dnsmasq-dns-849fff7679-6w4jk\" (UID: \"619a1578-177c-476f-a471-e39ec43ebf20\") " pod="openstack/dnsmasq-dns-849fff7679-6w4jk" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.576249 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/619a1578-177c-476f-a471-e39ec43ebf20-config\") pod \"dnsmasq-dns-849fff7679-6w4jk\" (UID: \"619a1578-177c-476f-a471-e39ec43ebf20\") " pod="openstack/dnsmasq-dns-849fff7679-6w4jk" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.576276 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/619a1578-177c-476f-a471-e39ec43ebf20-dns-swift-storage-0\") pod \"dnsmasq-dns-849fff7679-6w4jk\" (UID: \"619a1578-177c-476f-a471-e39ec43ebf20\") " pod="openstack/dnsmasq-dns-849fff7679-6w4jk" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.576339 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5989ba7a-f1ca-4a25-a94a-3fea17f16eca-logs\") pod \"nova-api-0\" (UID: \"5989ba7a-f1ca-4a25-a94a-3fea17f16eca\") " pod="openstack/nova-api-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.576362 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5989ba7a-f1ca-4a25-a94a-3fea17f16eca-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5989ba7a-f1ca-4a25-a94a-3fea17f16eca\") " pod="openstack/nova-api-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.576390 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/619a1578-177c-476f-a471-e39ec43ebf20-ovsdbserver-sb\") pod \"dnsmasq-dns-849fff7679-6w4jk\" (UID: \"619a1578-177c-476f-a471-e39ec43ebf20\") " pod="openstack/dnsmasq-dns-849fff7679-6w4jk" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.576421 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/619a1578-177c-476f-a471-e39ec43ebf20-ovsdbserver-nb\") pod \"dnsmasq-dns-849fff7679-6w4jk\" (UID: \"619a1578-177c-476f-a471-e39ec43ebf20\") " pod="openstack/dnsmasq-dns-849fff7679-6w4jk" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.576442 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5989ba7a-f1ca-4a25-a94a-3fea17f16eca-config-data\") pod \"nova-api-0\" (UID: \"5989ba7a-f1ca-4a25-a94a-3fea17f16eca\") " pod="openstack/nova-api-0" 
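The pod_startup_latency_tracker.go entries in this log record two figures per pod: podStartSLOduration (which, judging from the ceilometer-0 entry above, is the end-to-end duration minus the image-pull window between firstStartedPulling and lastFinishedPulling) and podStartE2EDuration (wall-clock time from podCreationTimestamp to observedRunningTime). Below is a minimal, illustrative sketch for pulling those figures out of a kubelet log capture such as this one; it is not part of the log or of kubelet itself. The regular expression is written against the exact field layout shown in these entries, and the default file name kubelet.log is an assumption.

#!/usr/bin/env python3
# Sketch (assumption-laden): extract "Observed pod startup duration" records
# from a kubelet log capture. Point it at any plain-text kubelet log.
import re
import sys

# Matches entries of the form seen in this log:
#   ... "Observed pod startup duration" pod="openstack/nova-api-0"
#   podStartSLOduration=3.475433414 podStartE2EDuration="6.946085741s" ...
PATTERN = re.compile(
    r'"Observed pod startup duration" pod="(?P<pod>[^"]+)"'
    r' podStartSLOduration=(?P<slo>[0-9.]+)'
    r' podStartE2EDuration="(?P<e2e>[0-9.]+)s"'
)

def startup_durations(lines):
    """Yield (pod, slo_seconds, e2e_seconds) for each startup-latency entry."""
    for line in lines:
        m = PATTERN.search(line)
        if m:
            yield m.group("pod"), float(m.group("slo")), float(m.group("e2e"))

if __name__ == "__main__":
    # "kubelet.log" is an assumed default path, not taken from this capture.
    path = sys.argv[1] if len(sys.argv) > 1 else "kubelet.log"
    with open(path, errors="replace") as fh:   # tolerate any binary debris
        for pod, slo, e2e in startup_durations(fh):
            print(f"{pod}: SLO={slo:.3f}s E2E={e2e:.3f}s")

Run against this capture it would report, for example, openstack/nova-cell0-conductor-0 with SLO and E2E both at roughly 2.560s, matching the entry logged at 10:16:22 above.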
Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.576474 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckgt6\" (UniqueName: \"kubernetes.io/projected/5989ba7a-f1ca-4a25-a94a-3fea17f16eca-kube-api-access-ckgt6\") pod \"nova-api-0\" (UID: \"5989ba7a-f1ca-4a25-a94a-3fea17f16eca\") " pod="openstack/nova-api-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.576497 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnccj\" (UniqueName: \"kubernetes.io/projected/619a1578-177c-476f-a471-e39ec43ebf20-kube-api-access-fnccj\") pod \"dnsmasq-dns-849fff7679-6w4jk\" (UID: \"619a1578-177c-476f-a471-e39ec43ebf20\") " pod="openstack/dnsmasq-dns-849fff7679-6w4jk" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.578478 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/619a1578-177c-476f-a471-e39ec43ebf20-dns-svc\") pod \"dnsmasq-dns-849fff7679-6w4jk\" (UID: \"619a1578-177c-476f-a471-e39ec43ebf20\") " pod="openstack/dnsmasq-dns-849fff7679-6w4jk" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.579038 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/619a1578-177c-476f-a471-e39ec43ebf20-config\") pod \"dnsmasq-dns-849fff7679-6w4jk\" (UID: \"619a1578-177c-476f-a471-e39ec43ebf20\") " pod="openstack/dnsmasq-dns-849fff7679-6w4jk" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.579581 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/619a1578-177c-476f-a471-e39ec43ebf20-dns-swift-storage-0\") pod \"dnsmasq-dns-849fff7679-6w4jk\" (UID: \"619a1578-177c-476f-a471-e39ec43ebf20\") " pod="openstack/dnsmasq-dns-849fff7679-6w4jk" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.580031 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5989ba7a-f1ca-4a25-a94a-3fea17f16eca-logs\") pod \"nova-api-0\" (UID: \"5989ba7a-f1ca-4a25-a94a-3fea17f16eca\") " pod="openstack/nova-api-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.587980 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/619a1578-177c-476f-a471-e39ec43ebf20-ovsdbserver-nb\") pod \"dnsmasq-dns-849fff7679-6w4jk\" (UID: \"619a1578-177c-476f-a471-e39ec43ebf20\") " pod="openstack/dnsmasq-dns-849fff7679-6w4jk" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.603250 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5989ba7a-f1ca-4a25-a94a-3fea17f16eca-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5989ba7a-f1ca-4a25-a94a-3fea17f16eca\") " pod="openstack/nova-api-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.605674 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/619a1578-177c-476f-a471-e39ec43ebf20-ovsdbserver-sb\") pod \"dnsmasq-dns-849fff7679-6w4jk\" (UID: \"619a1578-177c-476f-a471-e39ec43ebf20\") " pod="openstack/dnsmasq-dns-849fff7679-6w4jk" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.612296 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnccj\" (UniqueName: 
\"kubernetes.io/projected/619a1578-177c-476f-a471-e39ec43ebf20-kube-api-access-fnccj\") pod \"dnsmasq-dns-849fff7679-6w4jk\" (UID: \"619a1578-177c-476f-a471-e39ec43ebf20\") " pod="openstack/dnsmasq-dns-849fff7679-6w4jk" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.621273 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckgt6\" (UniqueName: \"kubernetes.io/projected/5989ba7a-f1ca-4a25-a94a-3fea17f16eca-kube-api-access-ckgt6\") pod \"nova-api-0\" (UID: \"5989ba7a-f1ca-4a25-a94a-3fea17f16eca\") " pod="openstack/nova-api-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.625478 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5989ba7a-f1ca-4a25-a94a-3fea17f16eca-config-data\") pod \"nova-api-0\" (UID: \"5989ba7a-f1ca-4a25-a94a-3fea17f16eca\") " pod="openstack/nova-api-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.670304 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.713373 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 20 10:16:27 crc kubenswrapper[4962]: I0220 10:16:27.749088 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-849fff7679-6w4jk" Feb 20 10:16:28 crc kubenswrapper[4962]: I0220 10:16:28.105770 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-kjq4f"] Feb 20 10:16:28 crc kubenswrapper[4962]: I0220 10:16:28.333081 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-ct4qz"] Feb 20 10:16:28 crc kubenswrapper[4962]: I0220 10:16:28.334953 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-ct4qz" Feb 20 10:16:28 crc kubenswrapper[4962]: I0220 10:16:28.341371 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 20 10:16:28 crc kubenswrapper[4962]: I0220 10:16:28.341639 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Feb 20 10:16:28 crc kubenswrapper[4962]: I0220 10:16:28.396263 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-ct4qz"] Feb 20 10:16:28 crc kubenswrapper[4962]: I0220 10:16:28.417102 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 10:16:28 crc kubenswrapper[4962]: W0220 10:16:28.468171 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1d60578e_e3d0_4ae9_8539_9dfd84ebf836.slice/crio-bc1cbee44de6a5e482a759f896e04c3479b2c9a764a90e45d0331d2a12691504 WatchSource:0}: Error finding container bc1cbee44de6a5e482a759f896e04c3479b2c9a764a90e45d0331d2a12691504: Status 404 returned error can't find the container with id bc1cbee44de6a5e482a759f896e04c3479b2c9a764a90e45d0331d2a12691504 Feb 20 10:16:28 crc kubenswrapper[4962]: I0220 10:16:28.469422 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 20 10:16:28 crc kubenswrapper[4962]: I0220 10:16:28.506013 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 10:16:28 crc kubenswrapper[4962]: I0220 10:16:28.531127 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d-scripts\") pod \"nova-cell1-conductor-db-sync-ct4qz\" (UID: \"05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d\") " pod="openstack/nova-cell1-conductor-db-sync-ct4qz" Feb 20 10:16:28 crc kubenswrapper[4962]: I0220 10:16:28.531222 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5662\" (UniqueName: \"kubernetes.io/projected/05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d-kube-api-access-d5662\") pod \"nova-cell1-conductor-db-sync-ct4qz\" (UID: \"05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d\") " pod="openstack/nova-cell1-conductor-db-sync-ct4qz" Feb 20 10:16:28 crc kubenswrapper[4962]: I0220 10:16:28.531396 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d-config-data\") pod \"nova-cell1-conductor-db-sync-ct4qz\" (UID: \"05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d\") " pod="openstack/nova-cell1-conductor-db-sync-ct4qz" Feb 20 10:16:28 crc kubenswrapper[4962]: I0220 10:16:28.531540 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-ct4qz\" (UID: \"05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d\") " pod="openstack/nova-cell1-conductor-db-sync-ct4qz" Feb 20 10:16:28 crc kubenswrapper[4962]: I0220 10:16:28.633155 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d-combined-ca-bundle\") pod 
\"nova-cell1-conductor-db-sync-ct4qz\" (UID: \"05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d\") " pod="openstack/nova-cell1-conductor-db-sync-ct4qz" Feb 20 10:16:28 crc kubenswrapper[4962]: I0220 10:16:28.633749 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d-scripts\") pod \"nova-cell1-conductor-db-sync-ct4qz\" (UID: \"05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d\") " pod="openstack/nova-cell1-conductor-db-sync-ct4qz" Feb 20 10:16:28 crc kubenswrapper[4962]: I0220 10:16:28.633786 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5662\" (UniqueName: \"kubernetes.io/projected/05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d-kube-api-access-d5662\") pod \"nova-cell1-conductor-db-sync-ct4qz\" (UID: \"05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d\") " pod="openstack/nova-cell1-conductor-db-sync-ct4qz" Feb 20 10:16:28 crc kubenswrapper[4962]: I0220 10:16:28.633820 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d-config-data\") pod \"nova-cell1-conductor-db-sync-ct4qz\" (UID: \"05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d\") " pod="openstack/nova-cell1-conductor-db-sync-ct4qz" Feb 20 10:16:28 crc kubenswrapper[4962]: I0220 10:16:28.648456 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d-config-data\") pod \"nova-cell1-conductor-db-sync-ct4qz\" (UID: \"05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d\") " pod="openstack/nova-cell1-conductor-db-sync-ct4qz" Feb 20 10:16:28 crc kubenswrapper[4962]: I0220 10:16:28.648520 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d-scripts\") pod \"nova-cell1-conductor-db-sync-ct4qz\" (UID: \"05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d\") " pod="openstack/nova-cell1-conductor-db-sync-ct4qz" Feb 20 10:16:28 crc kubenswrapper[4962]: I0220 10:16:28.648844 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-ct4qz\" (UID: \"05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d\") " pod="openstack/nova-cell1-conductor-db-sync-ct4qz" Feb 20 10:16:28 crc kubenswrapper[4962]: I0220 10:16:28.653823 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-849fff7679-6w4jk"] Feb 20 10:16:28 crc kubenswrapper[4962]: I0220 10:16:28.658862 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5662\" (UniqueName: \"kubernetes.io/projected/05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d-kube-api-access-d5662\") pod \"nova-cell1-conductor-db-sync-ct4qz\" (UID: \"05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d\") " pod="openstack/nova-cell1-conductor-db-sync-ct4qz" Feb 20 10:16:28 crc kubenswrapper[4962]: I0220 10:16:28.689572 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-ct4qz" Feb 20 10:16:28 crc kubenswrapper[4962]: I0220 10:16:28.698246 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 20 10:16:28 crc kubenswrapper[4962]: I0220 10:16:28.731276 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-kjq4f" event={"ID":"28bfacb3-7247-41ad-bf30-47c81427487b","Type":"ContainerStarted","Data":"1eb9947e80af1012b6145dccb54cd11c0689239b2a15c94816fdca73015d8cfe"} Feb 20 10:16:28 crc kubenswrapper[4962]: I0220 10:16:28.731344 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-kjq4f" event={"ID":"28bfacb3-7247-41ad-bf30-47c81427487b","Type":"ContainerStarted","Data":"0f0bc1171f843d6af2226a5d8d968c25ad3b7d5cd1cf52d106ae84ff241ffecd"} Feb 20 10:16:28 crc kubenswrapper[4962]: I0220 10:16:28.732674 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ab0c66b4-1ce3-4594-8780-2effddad7043","Type":"ContainerStarted","Data":"9cb17bd0f831295fec6db28bdfcd5a1a3d6be987d43cd0f564f65a201a071dcf"} Feb 20 10:16:28 crc kubenswrapper[4962]: I0220 10:16:28.733950 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea","Type":"ContainerStarted","Data":"503b6078fa852a67bd1c7bbebbc0925a0a08be36053fb3fee5407b0136117e50"} Feb 20 10:16:28 crc kubenswrapper[4962]: I0220 10:16:28.735364 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"1d60578e-e3d0-4ae9-8539-9dfd84ebf836","Type":"ContainerStarted","Data":"bc1cbee44de6a5e482a759f896e04c3479b2c9a764a90e45d0331d2a12691504"} Feb 20 10:16:28 crc kubenswrapper[4962]: I0220 10:16:28.736661 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5989ba7a-f1ca-4a25-a94a-3fea17f16eca","Type":"ContainerStarted","Data":"a7e2d55de2a0689981aefd86aae03373f74c3435dacca5e0c076361b42cb5da4"} Feb 20 10:16:28 crc kubenswrapper[4962]: I0220 10:16:28.738093 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-849fff7679-6w4jk" event={"ID":"619a1578-177c-476f-a471-e39ec43ebf20","Type":"ContainerStarted","Data":"323c0906415aaaf20c526ccc0a5760d7fccfd56336d524a538bc100ce5c3c6b2"} Feb 20 10:16:28 crc kubenswrapper[4962]: I0220 10:16:28.752635 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-kjq4f" podStartSLOduration=2.7526164509999997 podStartE2EDuration="2.752616451s" podCreationTimestamp="2026-02-20 10:16:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:16:28.747514104 +0000 UTC m=+1280.329985950" watchObservedRunningTime="2026-02-20 10:16:28.752616451 +0000 UTC m=+1280.335088297" Feb 20 10:16:28 crc kubenswrapper[4962]: I0220 10:16:28.981895 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-ct4qz"] Feb 20 10:16:29 crc kubenswrapper[4962]: I0220 10:16:29.754258 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-ct4qz" event={"ID":"05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d","Type":"ContainerStarted","Data":"c43d694b0ea8172a2db698ac63ac57a6cb364529c6c87ffb777fc946029b6b2f"} Feb 20 10:16:29 crc kubenswrapper[4962]: I0220 10:16:29.754732 4962 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/nova-cell1-conductor-db-sync-ct4qz" event={"ID":"05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d","Type":"ContainerStarted","Data":"864d77909ed3ce2537874b3a198bcea7df1e3a117b50ccd248c61b191ba8d805"} Feb 20 10:16:29 crc kubenswrapper[4962]: I0220 10:16:29.794499 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-ct4qz" podStartSLOduration=1.7943583410000001 podStartE2EDuration="1.794358341s" podCreationTimestamp="2026-02-20 10:16:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:16:29.784346642 +0000 UTC m=+1281.366818488" watchObservedRunningTime="2026-02-20 10:16:29.794358341 +0000 UTC m=+1281.376830187" Feb 20 10:16:29 crc kubenswrapper[4962]: I0220 10:16:29.797225 4962 generic.go:334] "Generic (PLEG): container finished" podID="619a1578-177c-476f-a471-e39ec43ebf20" containerID="6fd2452641fc166c659c0aca31c5f68dd1c22702b25d2cc7444e617e2b88482a" exitCode=0 Feb 20 10:16:29 crc kubenswrapper[4962]: I0220 10:16:29.798746 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-849fff7679-6w4jk" event={"ID":"619a1578-177c-476f-a471-e39ec43ebf20","Type":"ContainerDied","Data":"6fd2452641fc166c659c0aca31c5f68dd1c22702b25d2cc7444e617e2b88482a"} Feb 20 10:16:31 crc kubenswrapper[4962]: I0220 10:16:31.285308 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 10:16:31 crc kubenswrapper[4962]: I0220 10:16:31.293893 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 20 10:16:32 crc kubenswrapper[4962]: I0220 10:16:32.840805 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-849fff7679-6w4jk" event={"ID":"619a1578-177c-476f-a471-e39ec43ebf20","Type":"ContainerStarted","Data":"3630a9744e496f0e77b1e7fe2b46b1296585e37a881d13f49c24da70a4548858"} Feb 20 10:16:32 crc kubenswrapper[4962]: I0220 10:16:32.841524 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-849fff7679-6w4jk" Feb 20 10:16:32 crc kubenswrapper[4962]: I0220 10:16:32.843152 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ab0c66b4-1ce3-4594-8780-2effddad7043","Type":"ContainerStarted","Data":"398ce78cdc2fc552726d146a95046f92240d58a6a2364f8373a866dc6c8d4c13"} Feb 20 10:16:32 crc kubenswrapper[4962]: I0220 10:16:32.847668 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea","Type":"ContainerStarted","Data":"e3508ab1e6394d844c8e66595b9b39cdfd62065415f11f38c66de88511cb92c3"} Feb 20 10:16:32 crc kubenswrapper[4962]: I0220 10:16:32.849640 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"1d60578e-e3d0-4ae9-8539-9dfd84ebf836","Type":"ContainerStarted","Data":"a895a025c8e5f05c893f35ce1699e5268fa4ccdd49969d23e922563dcf7606a8"} Feb 20 10:16:32 crc kubenswrapper[4962]: I0220 10:16:32.849895 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="1d60578e-e3d0-4ae9-8539-9dfd84ebf836" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://a895a025c8e5f05c893f35ce1699e5268fa4ccdd49969d23e922563dcf7606a8" gracePeriod=30 Feb 20 10:16:32 crc kubenswrapper[4962]: I0220 10:16:32.859204 4962 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5989ba7a-f1ca-4a25-a94a-3fea17f16eca","Type":"ContainerStarted","Data":"7c4df354c2a2dd24abe490f4720cd2638f9d36b40005018cdb03e89ec0d07cff"} Feb 20 10:16:32 crc kubenswrapper[4962]: I0220 10:16:32.894156 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-849fff7679-6w4jk" podStartSLOduration=5.8941290859999995 podStartE2EDuration="5.894129086s" podCreationTimestamp="2026-02-20 10:16:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:16:32.871891541 +0000 UTC m=+1284.454363387" watchObservedRunningTime="2026-02-20 10:16:32.894129086 +0000 UTC m=+1284.476600922" Feb 20 10:16:32 crc kubenswrapper[4962]: I0220 10:16:32.894841 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.133593093 podStartE2EDuration="5.894836118s" podCreationTimestamp="2026-02-20 10:16:27 +0000 UTC" firstStartedPulling="2026-02-20 10:16:28.418517423 +0000 UTC m=+1280.000989269" lastFinishedPulling="2026-02-20 10:16:32.179760448 +0000 UTC m=+1283.762232294" observedRunningTime="2026-02-20 10:16:32.887883915 +0000 UTC m=+1284.470355761" watchObservedRunningTime="2026-02-20 10:16:32.894836118 +0000 UTC m=+1284.477307964" Feb 20 10:16:32 crc kubenswrapper[4962]: I0220 10:16:32.918849 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.214600108 podStartE2EDuration="6.918824337s" podCreationTimestamp="2026-02-20 10:16:26 +0000 UTC" firstStartedPulling="2026-02-20 10:16:28.474665602 +0000 UTC m=+1280.057137448" lastFinishedPulling="2026-02-20 10:16:32.178889821 +0000 UTC m=+1283.761361677" observedRunningTime="2026-02-20 10:16:32.910565163 +0000 UTC m=+1284.493037009" watchObservedRunningTime="2026-02-20 10:16:32.918824337 +0000 UTC m=+1284.501296183" Feb 20 10:16:33 crc kubenswrapper[4962]: I0220 10:16:33.881119 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ab0c66b4-1ce3-4594-8780-2effddad7043","Type":"ContainerStarted","Data":"82ffc55481641557dc07c98521f98b528e5b0c6b683aee7d0299fc0b0e9d9b66"} Feb 20 10:16:33 crc kubenswrapper[4962]: I0220 10:16:33.881230 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="ab0c66b4-1ce3-4594-8780-2effddad7043" containerName="nova-metadata-log" containerID="cri-o://398ce78cdc2fc552726d146a95046f92240d58a6a2364f8373a866dc6c8d4c13" gracePeriod=30 Feb 20 10:16:33 crc kubenswrapper[4962]: I0220 10:16:33.881316 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="ab0c66b4-1ce3-4594-8780-2effddad7043" containerName="nova-metadata-metadata" containerID="cri-o://82ffc55481641557dc07c98521f98b528e5b0c6b683aee7d0299fc0b0e9d9b66" gracePeriod=30 Feb 20 10:16:33 crc kubenswrapper[4962]: I0220 10:16:33.894475 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5989ba7a-f1ca-4a25-a94a-3fea17f16eca","Type":"ContainerStarted","Data":"9a07dac43407609ca971ac0024f06768bd8740aa492e302e9d410517ce47da40"} Feb 20 10:16:33 crc kubenswrapper[4962]: I0220 10:16:33.921193 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.241551942 
podStartE2EDuration="6.921164434s" podCreationTimestamp="2026-02-20 10:16:27 +0000 UTC" firstStartedPulling="2026-02-20 10:16:28.4992947 +0000 UTC m=+1280.081766546" lastFinishedPulling="2026-02-20 10:16:32.178907192 +0000 UTC m=+1283.761379038" observedRunningTime="2026-02-20 10:16:33.90840371 +0000 UTC m=+1285.490875556" watchObservedRunningTime="2026-02-20 10:16:33.921164434 +0000 UTC m=+1285.503636280" Feb 20 10:16:33 crc kubenswrapper[4962]: I0220 10:16:33.946112 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.475433414 podStartE2EDuration="6.946085741s" podCreationTimestamp="2026-02-20 10:16:27 +0000 UTC" firstStartedPulling="2026-02-20 10:16:28.70615241 +0000 UTC m=+1280.288624256" lastFinishedPulling="2026-02-20 10:16:32.176804727 +0000 UTC m=+1283.759276583" observedRunningTime="2026-02-20 10:16:33.941042056 +0000 UTC m=+1285.523513892" watchObservedRunningTime="2026-02-20 10:16:33.946085741 +0000 UTC m=+1285.528557587" Feb 20 10:16:34 crc kubenswrapper[4962]: I0220 10:16:34.514341 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 20 10:16:34 crc kubenswrapper[4962]: I0220 10:16:34.710397 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab0c66b4-1ce3-4594-8780-2effddad7043-config-data\") pod \"ab0c66b4-1ce3-4594-8780-2effddad7043\" (UID: \"ab0c66b4-1ce3-4594-8780-2effddad7043\") " Feb 20 10:16:34 crc kubenswrapper[4962]: I0220 10:16:34.710481 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab0c66b4-1ce3-4594-8780-2effddad7043-logs\") pod \"ab0c66b4-1ce3-4594-8780-2effddad7043\" (UID: \"ab0c66b4-1ce3-4594-8780-2effddad7043\") " Feb 20 10:16:34 crc kubenswrapper[4962]: I0220 10:16:34.710599 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jjcbz\" (UniqueName: \"kubernetes.io/projected/ab0c66b4-1ce3-4594-8780-2effddad7043-kube-api-access-jjcbz\") pod \"ab0c66b4-1ce3-4594-8780-2effddad7043\" (UID: \"ab0c66b4-1ce3-4594-8780-2effddad7043\") " Feb 20 10:16:34 crc kubenswrapper[4962]: I0220 10:16:34.710726 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab0c66b4-1ce3-4594-8780-2effddad7043-combined-ca-bundle\") pod \"ab0c66b4-1ce3-4594-8780-2effddad7043\" (UID: \"ab0c66b4-1ce3-4594-8780-2effddad7043\") " Feb 20 10:16:34 crc kubenswrapper[4962]: I0220 10:16:34.711497 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab0c66b4-1ce3-4594-8780-2effddad7043-logs" (OuterVolumeSpecName: "logs") pod "ab0c66b4-1ce3-4594-8780-2effddad7043" (UID: "ab0c66b4-1ce3-4594-8780-2effddad7043"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:16:34 crc kubenswrapper[4962]: I0220 10:16:34.729875 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab0c66b4-1ce3-4594-8780-2effddad7043-kube-api-access-jjcbz" (OuterVolumeSpecName: "kube-api-access-jjcbz") pod "ab0c66b4-1ce3-4594-8780-2effddad7043" (UID: "ab0c66b4-1ce3-4594-8780-2effddad7043"). InnerVolumeSpecName "kube-api-access-jjcbz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:16:34 crc kubenswrapper[4962]: I0220 10:16:34.765809 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab0c66b4-1ce3-4594-8780-2effddad7043-config-data" (OuterVolumeSpecName: "config-data") pod "ab0c66b4-1ce3-4594-8780-2effddad7043" (UID: "ab0c66b4-1ce3-4594-8780-2effddad7043"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:16:34 crc kubenswrapper[4962]: I0220 10:16:34.767367 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab0c66b4-1ce3-4594-8780-2effddad7043-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ab0c66b4-1ce3-4594-8780-2effddad7043" (UID: "ab0c66b4-1ce3-4594-8780-2effddad7043"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:16:34 crc kubenswrapper[4962]: I0220 10:16:34.813444 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab0c66b4-1ce3-4594-8780-2effddad7043-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:34 crc kubenswrapper[4962]: I0220 10:16:34.813495 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab0c66b4-1ce3-4594-8780-2effddad7043-logs\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:34 crc kubenswrapper[4962]: I0220 10:16:34.813508 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jjcbz\" (UniqueName: \"kubernetes.io/projected/ab0c66b4-1ce3-4594-8780-2effddad7043-kube-api-access-jjcbz\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:34 crc kubenswrapper[4962]: I0220 10:16:34.813522 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab0c66b4-1ce3-4594-8780-2effddad7043-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:34 crc kubenswrapper[4962]: I0220 10:16:34.905658 4962 generic.go:334] "Generic (PLEG): container finished" podID="ab0c66b4-1ce3-4594-8780-2effddad7043" containerID="82ffc55481641557dc07c98521f98b528e5b0c6b683aee7d0299fc0b0e9d9b66" exitCode=0 Feb 20 10:16:34 crc kubenswrapper[4962]: I0220 10:16:34.905702 4962 generic.go:334] "Generic (PLEG): container finished" podID="ab0c66b4-1ce3-4594-8780-2effddad7043" containerID="398ce78cdc2fc552726d146a95046f92240d58a6a2364f8373a866dc6c8d4c13" exitCode=143 Feb 20 10:16:34 crc kubenswrapper[4962]: I0220 10:16:34.905736 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 20 10:16:34 crc kubenswrapper[4962]: I0220 10:16:34.905765 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ab0c66b4-1ce3-4594-8780-2effddad7043","Type":"ContainerDied","Data":"82ffc55481641557dc07c98521f98b528e5b0c6b683aee7d0299fc0b0e9d9b66"} Feb 20 10:16:34 crc kubenswrapper[4962]: I0220 10:16:34.905831 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ab0c66b4-1ce3-4594-8780-2effddad7043","Type":"ContainerDied","Data":"398ce78cdc2fc552726d146a95046f92240d58a6a2364f8373a866dc6c8d4c13"} Feb 20 10:16:34 crc kubenswrapper[4962]: I0220 10:16:34.905846 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ab0c66b4-1ce3-4594-8780-2effddad7043","Type":"ContainerDied","Data":"9cb17bd0f831295fec6db28bdfcd5a1a3d6be987d43cd0f564f65a201a071dcf"} Feb 20 10:16:34 crc kubenswrapper[4962]: I0220 10:16:34.905876 4962 scope.go:117] "RemoveContainer" containerID="82ffc55481641557dc07c98521f98b528e5b0c6b683aee7d0299fc0b0e9d9b66" Feb 20 10:16:34 crc kubenswrapper[4962]: I0220 10:16:34.943733 4962 scope.go:117] "RemoveContainer" containerID="398ce78cdc2fc552726d146a95046f92240d58a6a2364f8373a866dc6c8d4c13" Feb 20 10:16:34 crc kubenswrapper[4962]: I0220 10:16:34.948659 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 10:16:34 crc kubenswrapper[4962]: I0220 10:16:34.960898 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 10:16:34 crc kubenswrapper[4962]: I0220 10:16:34.986882 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 20 10:16:34 crc kubenswrapper[4962]: E0220 10:16:34.987388 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab0c66b4-1ce3-4594-8780-2effddad7043" containerName="nova-metadata-metadata" Feb 20 10:16:34 crc kubenswrapper[4962]: I0220 10:16:34.987409 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab0c66b4-1ce3-4594-8780-2effddad7043" containerName="nova-metadata-metadata" Feb 20 10:16:34 crc kubenswrapper[4962]: E0220 10:16:34.987436 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab0c66b4-1ce3-4594-8780-2effddad7043" containerName="nova-metadata-log" Feb 20 10:16:34 crc kubenswrapper[4962]: I0220 10:16:34.987444 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab0c66b4-1ce3-4594-8780-2effddad7043" containerName="nova-metadata-log" Feb 20 10:16:34 crc kubenswrapper[4962]: I0220 10:16:34.987627 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab0c66b4-1ce3-4594-8780-2effddad7043" containerName="nova-metadata-metadata" Feb 20 10:16:34 crc kubenswrapper[4962]: I0220 10:16:34.987658 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab0c66b4-1ce3-4594-8780-2effddad7043" containerName="nova-metadata-log" Feb 20 10:16:34 crc kubenswrapper[4962]: I0220 10:16:34.989220 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 20 10:16:34 crc kubenswrapper[4962]: I0220 10:16:34.996232 4962 scope.go:117] "RemoveContainer" containerID="82ffc55481641557dc07c98521f98b528e5b0c6b683aee7d0299fc0b0e9d9b66" Feb 20 10:16:34 crc kubenswrapper[4962]: E0220 10:16:34.997902 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82ffc55481641557dc07c98521f98b528e5b0c6b683aee7d0299fc0b0e9d9b66\": container with ID starting with 82ffc55481641557dc07c98521f98b528e5b0c6b683aee7d0299fc0b0e9d9b66 not found: ID does not exist" containerID="82ffc55481641557dc07c98521f98b528e5b0c6b683aee7d0299fc0b0e9d9b66" Feb 20 10:16:34 crc kubenswrapper[4962]: I0220 10:16:34.997963 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82ffc55481641557dc07c98521f98b528e5b0c6b683aee7d0299fc0b0e9d9b66"} err="failed to get container status \"82ffc55481641557dc07c98521f98b528e5b0c6b683aee7d0299fc0b0e9d9b66\": rpc error: code = NotFound desc = could not find container \"82ffc55481641557dc07c98521f98b528e5b0c6b683aee7d0299fc0b0e9d9b66\": container with ID starting with 82ffc55481641557dc07c98521f98b528e5b0c6b683aee7d0299fc0b0e9d9b66 not found: ID does not exist" Feb 20 10:16:34 crc kubenswrapper[4962]: I0220 10:16:34.998003 4962 scope.go:117] "RemoveContainer" containerID="398ce78cdc2fc552726d146a95046f92240d58a6a2364f8373a866dc6c8d4c13" Feb 20 10:16:34 crc kubenswrapper[4962]: E0220 10:16:34.998447 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"398ce78cdc2fc552726d146a95046f92240d58a6a2364f8373a866dc6c8d4c13\": container with ID starting with 398ce78cdc2fc552726d146a95046f92240d58a6a2364f8373a866dc6c8d4c13 not found: ID does not exist" containerID="398ce78cdc2fc552726d146a95046f92240d58a6a2364f8373a866dc6c8d4c13" Feb 20 10:16:34 crc kubenswrapper[4962]: I0220 10:16:34.998499 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"398ce78cdc2fc552726d146a95046f92240d58a6a2364f8373a866dc6c8d4c13"} err="failed to get container status \"398ce78cdc2fc552726d146a95046f92240d58a6a2364f8373a866dc6c8d4c13\": rpc error: code = NotFound desc = could not find container \"398ce78cdc2fc552726d146a95046f92240d58a6a2364f8373a866dc6c8d4c13\": container with ID starting with 398ce78cdc2fc552726d146a95046f92240d58a6a2364f8373a866dc6c8d4c13 not found: ID does not exist" Feb 20 10:16:34 crc kubenswrapper[4962]: I0220 10:16:34.998539 4962 scope.go:117] "RemoveContainer" containerID="82ffc55481641557dc07c98521f98b528e5b0c6b683aee7d0299fc0b0e9d9b66" Feb 20 10:16:34 crc kubenswrapper[4962]: I0220 10:16:34.999037 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82ffc55481641557dc07c98521f98b528e5b0c6b683aee7d0299fc0b0e9d9b66"} err="failed to get container status \"82ffc55481641557dc07c98521f98b528e5b0c6b683aee7d0299fc0b0e9d9b66\": rpc error: code = NotFound desc = could not find container \"82ffc55481641557dc07c98521f98b528e5b0c6b683aee7d0299fc0b0e9d9b66\": container with ID starting with 82ffc55481641557dc07c98521f98b528e5b0c6b683aee7d0299fc0b0e9d9b66 not found: ID does not exist" Feb 20 10:16:34 crc kubenswrapper[4962]: I0220 10:16:34.999058 4962 scope.go:117] "RemoveContainer" containerID="398ce78cdc2fc552726d146a95046f92240d58a6a2364f8373a866dc6c8d4c13" Feb 20 10:16:34 crc kubenswrapper[4962]: I0220 10:16:34.999284 4962 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"398ce78cdc2fc552726d146a95046f92240d58a6a2364f8373a866dc6c8d4c13"} err="failed to get container status \"398ce78cdc2fc552726d146a95046f92240d58a6a2364f8373a866dc6c8d4c13\": rpc error: code = NotFound desc = could not find container \"398ce78cdc2fc552726d146a95046f92240d58a6a2364f8373a866dc6c8d4c13\": container with ID starting with 398ce78cdc2fc552726d146a95046f92240d58a6a2364f8373a866dc6c8d4c13 not found: ID does not exist" Feb 20 10:16:35 crc kubenswrapper[4962]: I0220 10:16:35.000664 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 20 10:16:35 crc kubenswrapper[4962]: I0220 10:16:35.000666 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 20 10:16:35 crc kubenswrapper[4962]: I0220 10:16:35.011794 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 10:16:35 crc kubenswrapper[4962]: I0220 10:16:35.120046 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd-config-data\") pod \"nova-metadata-0\" (UID: \"d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd\") " pod="openstack/nova-metadata-0" Feb 20 10:16:35 crc kubenswrapper[4962]: I0220 10:16:35.120257 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd-logs\") pod \"nova-metadata-0\" (UID: \"d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd\") " pod="openstack/nova-metadata-0" Feb 20 10:16:35 crc kubenswrapper[4962]: I0220 10:16:35.120294 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tld9l\" (UniqueName: \"kubernetes.io/projected/d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd-kube-api-access-tld9l\") pod \"nova-metadata-0\" (UID: \"d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd\") " pod="openstack/nova-metadata-0" Feb 20 10:16:35 crc kubenswrapper[4962]: I0220 10:16:35.120314 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd\") " pod="openstack/nova-metadata-0" Feb 20 10:16:35 crc kubenswrapper[4962]: I0220 10:16:35.120339 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd\") " pod="openstack/nova-metadata-0" Feb 20 10:16:35 crc kubenswrapper[4962]: I0220 10:16:35.163879 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab0c66b4-1ce3-4594-8780-2effddad7043" path="/var/lib/kubelet/pods/ab0c66b4-1ce3-4594-8780-2effddad7043/volumes" Feb 20 10:16:35 crc kubenswrapper[4962]: I0220 10:16:35.222861 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd-config-data\") pod \"nova-metadata-0\" (UID: \"d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd\") " pod="openstack/nova-metadata-0" Feb 20 
10:16:35 crc kubenswrapper[4962]: I0220 10:16:35.223022 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd-logs\") pod \"nova-metadata-0\" (UID: \"d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd\") " pod="openstack/nova-metadata-0" Feb 20 10:16:35 crc kubenswrapper[4962]: I0220 10:16:35.223052 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tld9l\" (UniqueName: \"kubernetes.io/projected/d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd-kube-api-access-tld9l\") pod \"nova-metadata-0\" (UID: \"d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd\") " pod="openstack/nova-metadata-0" Feb 20 10:16:35 crc kubenswrapper[4962]: I0220 10:16:35.223082 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd\") " pod="openstack/nova-metadata-0" Feb 20 10:16:35 crc kubenswrapper[4962]: I0220 10:16:35.223114 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd\") " pod="openstack/nova-metadata-0" Feb 20 10:16:35 crc kubenswrapper[4962]: I0220 10:16:35.225575 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd-logs\") pod \"nova-metadata-0\" (UID: \"d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd\") " pod="openstack/nova-metadata-0" Feb 20 10:16:35 crc kubenswrapper[4962]: I0220 10:16:35.230551 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd\") " pod="openstack/nova-metadata-0" Feb 20 10:16:35 crc kubenswrapper[4962]: I0220 10:16:35.233666 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd\") " pod="openstack/nova-metadata-0" Feb 20 10:16:35 crc kubenswrapper[4962]: I0220 10:16:35.236032 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd-config-data\") pod \"nova-metadata-0\" (UID: \"d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd\") " pod="openstack/nova-metadata-0" Feb 20 10:16:35 crc kubenswrapper[4962]: I0220 10:16:35.245575 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tld9l\" (UniqueName: \"kubernetes.io/projected/d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd-kube-api-access-tld9l\") pod \"nova-metadata-0\" (UID: \"d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd\") " pod="openstack/nova-metadata-0" Feb 20 10:16:35 crc kubenswrapper[4962]: I0220 10:16:35.340124 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 20 10:16:35 crc kubenswrapper[4962]: W0220 10:16:35.907370 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd2aa0dbd_0022_4ee1_8bb9_81a20d6a4abd.slice/crio-69e12a1e1496627bbe3adb2526f1f1562b02a5a3863a278268713ef978db8081 WatchSource:0}: Error finding container 69e12a1e1496627bbe3adb2526f1f1562b02a5a3863a278268713ef978db8081: Status 404 returned error can't find the container with id 69e12a1e1496627bbe3adb2526f1f1562b02a5a3863a278268713ef978db8081 Feb 20 10:16:35 crc kubenswrapper[4962]: I0220 10:16:35.918067 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 10:16:36 crc kubenswrapper[4962]: I0220 10:16:36.942358 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd","Type":"ContainerStarted","Data":"f43ff895f125404ffd2cb5094fc64a24a87bc9f55f64acd2825ce51f46637e81"} Feb 20 10:16:36 crc kubenswrapper[4962]: I0220 10:16:36.943017 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd","Type":"ContainerStarted","Data":"385d294ad925ff6a7f64008c9e42bd73f55a3c1dd6c1a61df30f589bcf465078"} Feb 20 10:16:36 crc kubenswrapper[4962]: I0220 10:16:36.943037 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd","Type":"ContainerStarted","Data":"69e12a1e1496627bbe3adb2526f1f1562b02a5a3863a278268713ef978db8081"} Feb 20 10:16:36 crc kubenswrapper[4962]: I0220 10:16:36.945470 4962 generic.go:334] "Generic (PLEG): container finished" podID="28bfacb3-7247-41ad-bf30-47c81427487b" containerID="1eb9947e80af1012b6145dccb54cd11c0689239b2a15c94816fdca73015d8cfe" exitCode=0 Feb 20 10:16:36 crc kubenswrapper[4962]: I0220 10:16:36.945533 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-kjq4f" event={"ID":"28bfacb3-7247-41ad-bf30-47c81427487b","Type":"ContainerDied","Data":"1eb9947e80af1012b6145dccb54cd11c0689239b2a15c94816fdca73015d8cfe"} Feb 20 10:16:36 crc kubenswrapper[4962]: I0220 10:16:36.973542 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.973516809 podStartE2EDuration="2.973516809s" podCreationTimestamp="2026-02-20 10:16:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:16:36.967959947 +0000 UTC m=+1288.550431803" watchObservedRunningTime="2026-02-20 10:16:36.973516809 +0000 UTC m=+1288.555988665" Feb 20 10:16:37 crc kubenswrapper[4962]: I0220 10:16:37.420689 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 20 10:16:37 crc kubenswrapper[4962]: I0220 10:16:37.424953 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 20 10:16:37 crc kubenswrapper[4962]: I0220 10:16:37.424981 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 20 10:16:37 crc kubenswrapper[4962]: I0220 10:16:37.464356 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 20 10:16:37 crc kubenswrapper[4962]: I0220 10:16:37.691689 4962 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 20 10:16:37 crc kubenswrapper[4962]: I0220 10:16:37.714730 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 20 10:16:37 crc kubenswrapper[4962]: I0220 10:16:37.714848 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 20 10:16:37 crc kubenswrapper[4962]: I0220 10:16:37.751765 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-849fff7679-6w4jk" Feb 20 10:16:37 crc kubenswrapper[4962]: I0220 10:16:37.825904 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c69c79c7f-fkwk8"] Feb 20 10:16:37 crc kubenswrapper[4962]: I0220 10:16:37.826150 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6c69c79c7f-fkwk8" podUID="b97f91e3-f497-47ad-8d3d-f9945b3bdc34" containerName="dnsmasq-dns" containerID="cri-o://8064c9f4f1fa3fedd418dd367b2c5bee617312041e75d25a0c61bc72b1e5d8dc" gracePeriod=10 Feb 20 10:16:37 crc kubenswrapper[4962]: I0220 10:16:37.969323 4962 generic.go:334] "Generic (PLEG): container finished" podID="b97f91e3-f497-47ad-8d3d-f9945b3bdc34" containerID="8064c9f4f1fa3fedd418dd367b2c5bee617312041e75d25a0c61bc72b1e5d8dc" exitCode=0 Feb 20 10:16:37 crc kubenswrapper[4962]: I0220 10:16:37.969409 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c69c79c7f-fkwk8" event={"ID":"b97f91e3-f497-47ad-8d3d-f9945b3bdc34","Type":"ContainerDied","Data":"8064c9f4f1fa3fedd418dd367b2c5bee617312041e75d25a0c61bc72b1e5d8dc"} Feb 20 10:16:38 crc kubenswrapper[4962]: I0220 10:16:38.007993 4962 generic.go:334] "Generic (PLEG): container finished" podID="05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d" containerID="c43d694b0ea8172a2db698ac63ac57a6cb364529c6c87ffb777fc946029b6b2f" exitCode=0 Feb 20 10:16:38 crc kubenswrapper[4962]: I0220 10:16:38.008296 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-ct4qz" event={"ID":"05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d","Type":"ContainerDied","Data":"c43d694b0ea8172a2db698ac63ac57a6cb364529c6c87ffb777fc946029b6b2f"} Feb 20 10:16:38 crc kubenswrapper[4962]: I0220 10:16:38.361481 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 20 10:16:38 crc kubenswrapper[4962]: I0220 10:16:38.599631 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-kjq4f" Feb 20 10:16:38 crc kubenswrapper[4962]: I0220 10:16:38.735817 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28bfacb3-7247-41ad-bf30-47c81427487b-combined-ca-bundle\") pod \"28bfacb3-7247-41ad-bf30-47c81427487b\" (UID: \"28bfacb3-7247-41ad-bf30-47c81427487b\") " Feb 20 10:16:38 crc kubenswrapper[4962]: I0220 10:16:38.735902 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28bfacb3-7247-41ad-bf30-47c81427487b-scripts\") pod \"28bfacb3-7247-41ad-bf30-47c81427487b\" (UID: \"28bfacb3-7247-41ad-bf30-47c81427487b\") " Feb 20 10:16:38 crc kubenswrapper[4962]: I0220 10:16:38.735962 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fxmdn\" (UniqueName: \"kubernetes.io/projected/28bfacb3-7247-41ad-bf30-47c81427487b-kube-api-access-fxmdn\") pod \"28bfacb3-7247-41ad-bf30-47c81427487b\" (UID: \"28bfacb3-7247-41ad-bf30-47c81427487b\") " Feb 20 10:16:38 crc kubenswrapper[4962]: I0220 10:16:38.736013 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28bfacb3-7247-41ad-bf30-47c81427487b-config-data\") pod \"28bfacb3-7247-41ad-bf30-47c81427487b\" (UID: \"28bfacb3-7247-41ad-bf30-47c81427487b\") " Feb 20 10:16:38 crc kubenswrapper[4962]: I0220 10:16:38.743723 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28bfacb3-7247-41ad-bf30-47c81427487b-scripts" (OuterVolumeSpecName: "scripts") pod "28bfacb3-7247-41ad-bf30-47c81427487b" (UID: "28bfacb3-7247-41ad-bf30-47c81427487b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:16:38 crc kubenswrapper[4962]: I0220 10:16:38.748064 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28bfacb3-7247-41ad-bf30-47c81427487b-kube-api-access-fxmdn" (OuterVolumeSpecName: "kube-api-access-fxmdn") pod "28bfacb3-7247-41ad-bf30-47c81427487b" (UID: "28bfacb3-7247-41ad-bf30-47c81427487b"). InnerVolumeSpecName "kube-api-access-fxmdn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:16:38 crc kubenswrapper[4962]: I0220 10:16:38.755846 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c69c79c7f-fkwk8" Feb 20 10:16:38 crc kubenswrapper[4962]: I0220 10:16:38.769420 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28bfacb3-7247-41ad-bf30-47c81427487b-config-data" (OuterVolumeSpecName: "config-data") pod "28bfacb3-7247-41ad-bf30-47c81427487b" (UID: "28bfacb3-7247-41ad-bf30-47c81427487b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:16:38 crc kubenswrapper[4962]: I0220 10:16:38.784999 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28bfacb3-7247-41ad-bf30-47c81427487b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "28bfacb3-7247-41ad-bf30-47c81427487b" (UID: "28bfacb3-7247-41ad-bf30-47c81427487b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:16:38 crc kubenswrapper[4962]: I0220 10:16:38.804086 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="5989ba7a-f1ca-4a25-a94a-3fea17f16eca" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.188:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 20 10:16:38 crc kubenswrapper[4962]: I0220 10:16:38.804152 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="5989ba7a-f1ca-4a25-a94a-3fea17f16eca" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.188:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 20 10:16:38 crc kubenswrapper[4962]: I0220 10:16:38.838108 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b97f91e3-f497-47ad-8d3d-f9945b3bdc34-config\") pod \"b97f91e3-f497-47ad-8d3d-f9945b3bdc34\" (UID: \"b97f91e3-f497-47ad-8d3d-f9945b3bdc34\") " Feb 20 10:16:38 crc kubenswrapper[4962]: I0220 10:16:38.838242 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b97f91e3-f497-47ad-8d3d-f9945b3bdc34-dns-swift-storage-0\") pod \"b97f91e3-f497-47ad-8d3d-f9945b3bdc34\" (UID: \"b97f91e3-f497-47ad-8d3d-f9945b3bdc34\") " Feb 20 10:16:38 crc kubenswrapper[4962]: I0220 10:16:38.838299 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b97f91e3-f497-47ad-8d3d-f9945b3bdc34-dns-svc\") pod \"b97f91e3-f497-47ad-8d3d-f9945b3bdc34\" (UID: \"b97f91e3-f497-47ad-8d3d-f9945b3bdc34\") " Feb 20 10:16:38 crc kubenswrapper[4962]: I0220 10:16:38.838389 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b97f91e3-f497-47ad-8d3d-f9945b3bdc34-ovsdbserver-nb\") pod \"b97f91e3-f497-47ad-8d3d-f9945b3bdc34\" (UID: \"b97f91e3-f497-47ad-8d3d-f9945b3bdc34\") " Feb 20 10:16:38 crc kubenswrapper[4962]: I0220 10:16:38.838777 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54bp5\" (UniqueName: \"kubernetes.io/projected/b97f91e3-f497-47ad-8d3d-f9945b3bdc34-kube-api-access-54bp5\") pod \"b97f91e3-f497-47ad-8d3d-f9945b3bdc34\" (UID: \"b97f91e3-f497-47ad-8d3d-f9945b3bdc34\") " Feb 20 10:16:38 crc kubenswrapper[4962]: I0220 10:16:38.838813 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b97f91e3-f497-47ad-8d3d-f9945b3bdc34-ovsdbserver-sb\") pod \"b97f91e3-f497-47ad-8d3d-f9945b3bdc34\" (UID: \"b97f91e3-f497-47ad-8d3d-f9945b3bdc34\") " Feb 20 10:16:38 crc kubenswrapper[4962]: I0220 10:16:38.839302 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28bfacb3-7247-41ad-bf30-47c81427487b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:38 crc kubenswrapper[4962]: I0220 10:16:38.839315 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28bfacb3-7247-41ad-bf30-47c81427487b-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:38 crc kubenswrapper[4962]: I0220 10:16:38.839324 4962 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-fxmdn\" (UniqueName: \"kubernetes.io/projected/28bfacb3-7247-41ad-bf30-47c81427487b-kube-api-access-fxmdn\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:38 crc kubenswrapper[4962]: I0220 10:16:38.839335 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28bfacb3-7247-41ad-bf30-47c81427487b-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:38 crc kubenswrapper[4962]: I0220 10:16:38.847584 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b97f91e3-f497-47ad-8d3d-f9945b3bdc34-kube-api-access-54bp5" (OuterVolumeSpecName: "kube-api-access-54bp5") pod "b97f91e3-f497-47ad-8d3d-f9945b3bdc34" (UID: "b97f91e3-f497-47ad-8d3d-f9945b3bdc34"). InnerVolumeSpecName "kube-api-access-54bp5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:16:38 crc kubenswrapper[4962]: I0220 10:16:38.900711 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b97f91e3-f497-47ad-8d3d-f9945b3bdc34-config" (OuterVolumeSpecName: "config") pod "b97f91e3-f497-47ad-8d3d-f9945b3bdc34" (UID: "b97f91e3-f497-47ad-8d3d-f9945b3bdc34"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:16:38 crc kubenswrapper[4962]: I0220 10:16:38.910907 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b97f91e3-f497-47ad-8d3d-f9945b3bdc34-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b97f91e3-f497-47ad-8d3d-f9945b3bdc34" (UID: "b97f91e3-f497-47ad-8d3d-f9945b3bdc34"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:16:38 crc kubenswrapper[4962]: I0220 10:16:38.912345 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b97f91e3-f497-47ad-8d3d-f9945b3bdc34-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b97f91e3-f497-47ad-8d3d-f9945b3bdc34" (UID: "b97f91e3-f497-47ad-8d3d-f9945b3bdc34"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:16:38 crc kubenswrapper[4962]: I0220 10:16:38.915119 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b97f91e3-f497-47ad-8d3d-f9945b3bdc34-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b97f91e3-f497-47ad-8d3d-f9945b3bdc34" (UID: "b97f91e3-f497-47ad-8d3d-f9945b3bdc34"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:16:38 crc kubenswrapper[4962]: I0220 10:16:38.926321 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b97f91e3-f497-47ad-8d3d-f9945b3bdc34-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b97f91e3-f497-47ad-8d3d-f9945b3bdc34" (UID: "b97f91e3-f497-47ad-8d3d-f9945b3bdc34"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:16:38 crc kubenswrapper[4962]: I0220 10:16:38.943953 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b97f91e3-f497-47ad-8d3d-f9945b3bdc34-config\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:38 crc kubenswrapper[4962]: I0220 10:16:38.944003 4962 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b97f91e3-f497-47ad-8d3d-f9945b3bdc34-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:38 crc kubenswrapper[4962]: I0220 10:16:38.944017 4962 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b97f91e3-f497-47ad-8d3d-f9945b3bdc34-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:38 crc kubenswrapper[4962]: I0220 10:16:38.944027 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b97f91e3-f497-47ad-8d3d-f9945b3bdc34-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:38 crc kubenswrapper[4962]: I0220 10:16:38.944037 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-54bp5\" (UniqueName: \"kubernetes.io/projected/b97f91e3-f497-47ad-8d3d-f9945b3bdc34-kube-api-access-54bp5\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:38 crc kubenswrapper[4962]: I0220 10:16:38.944049 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b97f91e3-f497-47ad-8d3d-f9945b3bdc34-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:39 crc kubenswrapper[4962]: I0220 10:16:39.018224 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-kjq4f" event={"ID":"28bfacb3-7247-41ad-bf30-47c81427487b","Type":"ContainerDied","Data":"0f0bc1171f843d6af2226a5d8d968c25ad3b7d5cd1cf52d106ae84ff241ffecd"} Feb 20 10:16:39 crc kubenswrapper[4962]: I0220 10:16:39.018279 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-kjq4f" Feb 20 10:16:39 crc kubenswrapper[4962]: I0220 10:16:39.018281 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f0bc1171f843d6af2226a5d8d968c25ad3b7d5cd1cf52d106ae84ff241ffecd" Feb 20 10:16:39 crc kubenswrapper[4962]: I0220 10:16:39.020660 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c69c79c7f-fkwk8" event={"ID":"b97f91e3-f497-47ad-8d3d-f9945b3bdc34","Type":"ContainerDied","Data":"a843159467f3af1797e47fcec1255f25b565f911c5d9a7e1acd289df047ed115"} Feb 20 10:16:39 crc kubenswrapper[4962]: I0220 10:16:39.020739 4962 scope.go:117] "RemoveContainer" containerID="8064c9f4f1fa3fedd418dd367b2c5bee617312041e75d25a0c61bc72b1e5d8dc" Feb 20 10:16:39 crc kubenswrapper[4962]: I0220 10:16:39.020685 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c69c79c7f-fkwk8" Feb 20 10:16:39 crc kubenswrapper[4962]: I0220 10:16:39.072639 4962 scope.go:117] "RemoveContainer" containerID="b058ae21e0210458ea10ea644bf00ea0438ea36899818d84b992ed449c70fc86" Feb 20 10:16:39 crc kubenswrapper[4962]: I0220 10:16:39.085381 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c69c79c7f-fkwk8"] Feb 20 10:16:39 crc kubenswrapper[4962]: I0220 10:16:39.097411 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6c69c79c7f-fkwk8"] Feb 20 10:16:39 crc kubenswrapper[4962]: I0220 10:16:39.237581 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b97f91e3-f497-47ad-8d3d-f9945b3bdc34" path="/var/lib/kubelet/pods/b97f91e3-f497-47ad-8d3d-f9945b3bdc34/volumes" Feb 20 10:16:39 crc kubenswrapper[4962]: I0220 10:16:39.254692 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 20 10:16:39 crc kubenswrapper[4962]: I0220 10:16:39.255106 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="5989ba7a-f1ca-4a25-a94a-3fea17f16eca" containerName="nova-api-log" containerID="cri-o://7c4df354c2a2dd24abe490f4720cd2638f9d36b40005018cdb03e89ec0d07cff" gracePeriod=30 Feb 20 10:16:39 crc kubenswrapper[4962]: I0220 10:16:39.255889 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="5989ba7a-f1ca-4a25-a94a-3fea17f16eca" containerName="nova-api-api" containerID="cri-o://9a07dac43407609ca971ac0024f06768bd8740aa492e302e9d410517ce47da40" gracePeriod=30 Feb 20 10:16:39 crc kubenswrapper[4962]: I0220 10:16:39.269567 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 10:16:39 crc kubenswrapper[4962]: I0220 10:16:39.270025 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd" containerName="nova-metadata-log" containerID="cri-o://385d294ad925ff6a7f64008c9e42bd73f55a3c1dd6c1a61df30f589bcf465078" gracePeriod=30 Feb 20 10:16:39 crc kubenswrapper[4962]: I0220 10:16:39.270780 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd" containerName="nova-metadata-metadata" containerID="cri-o://f43ff895f125404ffd2cb5094fc64a24a87bc9f55f64acd2825ce51f46637e81" gracePeriod=30 Feb 20 10:16:39 crc kubenswrapper[4962]: I0220 10:16:39.569448 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 10:16:39 crc kubenswrapper[4962]: I0220 10:16:39.576426 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-ct4qz" Feb 20 10:16:39 crc kubenswrapper[4962]: I0220 10:16:39.671280 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5662\" (UniqueName: \"kubernetes.io/projected/05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d-kube-api-access-d5662\") pod \"05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d\" (UID: \"05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d\") " Feb 20 10:16:39 crc kubenswrapper[4962]: I0220 10:16:39.671408 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d-config-data\") pod \"05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d\" (UID: \"05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d\") " Feb 20 10:16:39 crc kubenswrapper[4962]: I0220 10:16:39.671629 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d-scripts\") pod \"05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d\" (UID: \"05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d\") " Feb 20 10:16:39 crc kubenswrapper[4962]: I0220 10:16:39.671696 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d-combined-ca-bundle\") pod \"05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d\" (UID: \"05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d\") " Feb 20 10:16:39 crc kubenswrapper[4962]: I0220 10:16:39.678009 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d-scripts" (OuterVolumeSpecName: "scripts") pod "05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d" (UID: "05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:16:39 crc kubenswrapper[4962]: I0220 10:16:39.682256 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d-kube-api-access-d5662" (OuterVolumeSpecName: "kube-api-access-d5662") pod "05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d" (UID: "05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d"). InnerVolumeSpecName "kube-api-access-d5662". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:16:39 crc kubenswrapper[4962]: I0220 10:16:39.717692 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d" (UID: "05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:16:39 crc kubenswrapper[4962]: I0220 10:16:39.719239 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d-config-data" (OuterVolumeSpecName: "config-data") pod "05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d" (UID: "05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:16:39 crc kubenswrapper[4962]: I0220 10:16:39.775952 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d5662\" (UniqueName: \"kubernetes.io/projected/05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d-kube-api-access-d5662\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:39 crc kubenswrapper[4962]: I0220 10:16:39.775985 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:39 crc kubenswrapper[4962]: I0220 10:16:39.775996 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:39 crc kubenswrapper[4962]: I0220 10:16:39.776007 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:39 crc kubenswrapper[4962]: I0220 10:16:39.832209 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 20 10:16:39 crc kubenswrapper[4962]: I0220 10:16:39.881527 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd-nova-metadata-tls-certs\") pod \"d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd\" (UID: \"d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd\") " Feb 20 10:16:39 crc kubenswrapper[4962]: I0220 10:16:39.881639 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd-logs\") pod \"d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd\" (UID: \"d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd\") " Feb 20 10:16:39 crc kubenswrapper[4962]: I0220 10:16:39.881717 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd-config-data\") pod \"d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd\" (UID: \"d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd\") " Feb 20 10:16:39 crc kubenswrapper[4962]: I0220 10:16:39.881916 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd-combined-ca-bundle\") pod \"d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd\" (UID: \"d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd\") " Feb 20 10:16:39 crc kubenswrapper[4962]: I0220 10:16:39.882123 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tld9l\" (UniqueName: \"kubernetes.io/projected/d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd-kube-api-access-tld9l\") pod \"d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd\" (UID: \"d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd\") " Feb 20 10:16:39 crc kubenswrapper[4962]: I0220 10:16:39.884379 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd-logs" (OuterVolumeSpecName: "logs") pod "d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd" (UID: "d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:16:39 crc kubenswrapper[4962]: I0220 10:16:39.888533 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd-kube-api-access-tld9l" (OuterVolumeSpecName: "kube-api-access-tld9l") pod "d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd" (UID: "d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd"). InnerVolumeSpecName "kube-api-access-tld9l". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:16:39 crc kubenswrapper[4962]: I0220 10:16:39.922721 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd" (UID: "d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:16:39 crc kubenswrapper[4962]: I0220 10:16:39.922771 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd-config-data" (OuterVolumeSpecName: "config-data") pod "d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd" (UID: "d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:16:39 crc kubenswrapper[4962]: I0220 10:16:39.960050 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd" (UID: "d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:16:39 crc kubenswrapper[4962]: I0220 10:16:39.984741 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd-logs\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:39 crc kubenswrapper[4962]: I0220 10:16:39.984803 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:39 crc kubenswrapper[4962]: I0220 10:16:39.984821 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:39 crc kubenswrapper[4962]: I0220 10:16:39.984837 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tld9l\" (UniqueName: \"kubernetes.io/projected/d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd-kube-api-access-tld9l\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:39 crc kubenswrapper[4962]: I0220 10:16:39.984850 4962 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.031780 4962 generic.go:334] "Generic (PLEG): container finished" podID="d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd" containerID="f43ff895f125404ffd2cb5094fc64a24a87bc9f55f64acd2825ce51f46637e81" exitCode=0 Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.031820 4962 generic.go:334] "Generic (PLEG): container finished" podID="d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd" containerID="385d294ad925ff6a7f64008c9e42bd73f55a3c1dd6c1a61df30f589bcf465078" exitCode=143 Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.031857 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd","Type":"ContainerDied","Data":"f43ff895f125404ffd2cb5094fc64a24a87bc9f55f64acd2825ce51f46637e81"} Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.031889 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd","Type":"ContainerDied","Data":"385d294ad925ff6a7f64008c9e42bd73f55a3c1dd6c1a61df30f589bcf465078"} Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.031900 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd","Type":"ContainerDied","Data":"69e12a1e1496627bbe3adb2526f1f1562b02a5a3863a278268713ef978db8081"} Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.031917 4962 scope.go:117] "RemoveContainer" containerID="f43ff895f125404ffd2cb5094fc64a24a87bc9f55f64acd2825ce51f46637e81" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.032119 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.033786 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-ct4qz" event={"ID":"05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d","Type":"ContainerDied","Data":"864d77909ed3ce2537874b3a198bcea7df1e3a117b50ccd248c61b191ba8d805"} Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.033815 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-ct4qz" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.033840 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="864d77909ed3ce2537874b3a198bcea7df1e3a117b50ccd248c61b191ba8d805" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.038559 4962 generic.go:334] "Generic (PLEG): container finished" podID="5989ba7a-f1ca-4a25-a94a-3fea17f16eca" containerID="7c4df354c2a2dd24abe490f4720cd2638f9d36b40005018cdb03e89ec0d07cff" exitCode=143 Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.038640 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5989ba7a-f1ca-4a25-a94a-3fea17f16eca","Type":"ContainerDied","Data":"7c4df354c2a2dd24abe490f4720cd2638f9d36b40005018cdb03e89ec0d07cff"} Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.041052 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea" containerName="nova-scheduler-scheduler" containerID="cri-o://e3508ab1e6394d844c8e66595b9b39cdfd62065415f11f38c66de88511cb92c3" gracePeriod=30 Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.085634 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.093375 4962 scope.go:117] "RemoveContainer" containerID="385d294ad925ff6a7f64008c9e42bd73f55a3c1dd6c1a61df30f589bcf465078" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.130277 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.138002 4962 scope.go:117] "RemoveContainer" containerID="f43ff895f125404ffd2cb5094fc64a24a87bc9f55f64acd2825ce51f46637e81" Feb 20 10:16:40 crc kubenswrapper[4962]: E0220 10:16:40.145711 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f43ff895f125404ffd2cb5094fc64a24a87bc9f55f64acd2825ce51f46637e81\": container with ID starting with f43ff895f125404ffd2cb5094fc64a24a87bc9f55f64acd2825ce51f46637e81 not found: ID does not exist" containerID="f43ff895f125404ffd2cb5094fc64a24a87bc9f55f64acd2825ce51f46637e81" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.145764 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f43ff895f125404ffd2cb5094fc64a24a87bc9f55f64acd2825ce51f46637e81"} err="failed to get container status \"f43ff895f125404ffd2cb5094fc64a24a87bc9f55f64acd2825ce51f46637e81\": rpc error: code = NotFound desc = could not find container \"f43ff895f125404ffd2cb5094fc64a24a87bc9f55f64acd2825ce51f46637e81\": container with ID starting with f43ff895f125404ffd2cb5094fc64a24a87bc9f55f64acd2825ce51f46637e81 not found: ID does not exist" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.145842 4962 scope.go:117] "RemoveContainer" 
containerID="385d294ad925ff6a7f64008c9e42bd73f55a3c1dd6c1a61df30f589bcf465078" Feb 20 10:16:40 crc kubenswrapper[4962]: E0220 10:16:40.146462 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"385d294ad925ff6a7f64008c9e42bd73f55a3c1dd6c1a61df30f589bcf465078\": container with ID starting with 385d294ad925ff6a7f64008c9e42bd73f55a3c1dd6c1a61df30f589bcf465078 not found: ID does not exist" containerID="385d294ad925ff6a7f64008c9e42bd73f55a3c1dd6c1a61df30f589bcf465078" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.146487 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"385d294ad925ff6a7f64008c9e42bd73f55a3c1dd6c1a61df30f589bcf465078"} err="failed to get container status \"385d294ad925ff6a7f64008c9e42bd73f55a3c1dd6c1a61df30f589bcf465078\": rpc error: code = NotFound desc = could not find container \"385d294ad925ff6a7f64008c9e42bd73f55a3c1dd6c1a61df30f589bcf465078\": container with ID starting with 385d294ad925ff6a7f64008c9e42bd73f55a3c1dd6c1a61df30f589bcf465078 not found: ID does not exist" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.146505 4962 scope.go:117] "RemoveContainer" containerID="f43ff895f125404ffd2cb5094fc64a24a87bc9f55f64acd2825ce51f46637e81" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.147024 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f43ff895f125404ffd2cb5094fc64a24a87bc9f55f64acd2825ce51f46637e81"} err="failed to get container status \"f43ff895f125404ffd2cb5094fc64a24a87bc9f55f64acd2825ce51f46637e81\": rpc error: code = NotFound desc = could not find container \"f43ff895f125404ffd2cb5094fc64a24a87bc9f55f64acd2825ce51f46637e81\": container with ID starting with f43ff895f125404ffd2cb5094fc64a24a87bc9f55f64acd2825ce51f46637e81 not found: ID does not exist" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.147056 4962 scope.go:117] "RemoveContainer" containerID="385d294ad925ff6a7f64008c9e42bd73f55a3c1dd6c1a61df30f589bcf465078" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.147844 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"385d294ad925ff6a7f64008c9e42bd73f55a3c1dd6c1a61df30f589bcf465078"} err="failed to get container status \"385d294ad925ff6a7f64008c9e42bd73f55a3c1dd6c1a61df30f589bcf465078\": rpc error: code = NotFound desc = could not find container \"385d294ad925ff6a7f64008c9e42bd73f55a3c1dd6c1a61df30f589bcf465078\": container with ID starting with 385d294ad925ff6a7f64008c9e42bd73f55a3c1dd6c1a61df30f589bcf465078 not found: ID does not exist" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.166144 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 20 10:16:40 crc kubenswrapper[4962]: E0220 10:16:40.166552 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28bfacb3-7247-41ad-bf30-47c81427487b" containerName="nova-manage" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.166574 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="28bfacb3-7247-41ad-bf30-47c81427487b" containerName="nova-manage" Feb 20 10:16:40 crc kubenswrapper[4962]: E0220 10:16:40.166586 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b97f91e3-f497-47ad-8d3d-f9945b3bdc34" containerName="dnsmasq-dns" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.166611 4962 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="b97f91e3-f497-47ad-8d3d-f9945b3bdc34" containerName="dnsmasq-dns" Feb 20 10:16:40 crc kubenswrapper[4962]: E0220 10:16:40.166635 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd" containerName="nova-metadata-metadata" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.166644 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd" containerName="nova-metadata-metadata" Feb 20 10:16:40 crc kubenswrapper[4962]: E0220 10:16:40.166661 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d" containerName="nova-cell1-conductor-db-sync" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.166669 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d" containerName="nova-cell1-conductor-db-sync" Feb 20 10:16:40 crc kubenswrapper[4962]: E0220 10:16:40.166681 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b97f91e3-f497-47ad-8d3d-f9945b3bdc34" containerName="init" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.166687 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="b97f91e3-f497-47ad-8d3d-f9945b3bdc34" containerName="init" Feb 20 10:16:40 crc kubenswrapper[4962]: E0220 10:16:40.166702 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd" containerName="nova-metadata-log" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.166710 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd" containerName="nova-metadata-log" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.166884 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd" containerName="nova-metadata-metadata" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.166894 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="b97f91e3-f497-47ad-8d3d-f9945b3bdc34" containerName="dnsmasq-dns" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.166905 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd" containerName="nova-metadata-log" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.166918 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="28bfacb3-7247-41ad-bf30-47c81427487b" containerName="nova-manage" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.166931 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d" containerName="nova-cell1-conductor-db-sync" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.167920 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.170562 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.170837 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.199226 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.217760 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.219143 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.221897 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.222742 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.292799 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf680b24-e6dc-40a4-9ee4-521343fd9a28-config-data\") pod \"nova-metadata-0\" (UID: \"bf680b24-e6dc-40a4-9ee4-521343fd9a28\") " pod="openstack/nova-metadata-0" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.292932 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-px4mp\" (UniqueName: \"kubernetes.io/projected/ce62af15-166f-4f74-a244-2de5147a4b2f-kube-api-access-px4mp\") pod \"nova-cell1-conductor-0\" (UID: \"ce62af15-166f-4f74-a244-2de5147a4b2f\") " pod="openstack/nova-cell1-conductor-0" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.292961 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf680b24-e6dc-40a4-9ee4-521343fd9a28-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"bf680b24-e6dc-40a4-9ee4-521343fd9a28\") " pod="openstack/nova-metadata-0" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.293043 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58msc\" (UniqueName: \"kubernetes.io/projected/bf680b24-e6dc-40a4-9ee4-521343fd9a28-kube-api-access-58msc\") pod \"nova-metadata-0\" (UID: \"bf680b24-e6dc-40a4-9ee4-521343fd9a28\") " pod="openstack/nova-metadata-0" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.293086 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce62af15-166f-4f74-a244-2de5147a4b2f-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"ce62af15-166f-4f74-a244-2de5147a4b2f\") " pod="openstack/nova-cell1-conductor-0" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.293127 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce62af15-166f-4f74-a244-2de5147a4b2f-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"ce62af15-166f-4f74-a244-2de5147a4b2f\") " 
pod="openstack/nova-cell1-conductor-0" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.293337 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf680b24-e6dc-40a4-9ee4-521343fd9a28-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"bf680b24-e6dc-40a4-9ee4-521343fd9a28\") " pod="openstack/nova-metadata-0" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.293410 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf680b24-e6dc-40a4-9ee4-521343fd9a28-logs\") pod \"nova-metadata-0\" (UID: \"bf680b24-e6dc-40a4-9ee4-521343fd9a28\") " pod="openstack/nova-metadata-0" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.395970 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf680b24-e6dc-40a4-9ee4-521343fd9a28-logs\") pod \"nova-metadata-0\" (UID: \"bf680b24-e6dc-40a4-9ee4-521343fd9a28\") " pod="openstack/nova-metadata-0" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.396105 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf680b24-e6dc-40a4-9ee4-521343fd9a28-config-data\") pod \"nova-metadata-0\" (UID: \"bf680b24-e6dc-40a4-9ee4-521343fd9a28\") " pod="openstack/nova-metadata-0" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.396210 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-px4mp\" (UniqueName: \"kubernetes.io/projected/ce62af15-166f-4f74-a244-2de5147a4b2f-kube-api-access-px4mp\") pod \"nova-cell1-conductor-0\" (UID: \"ce62af15-166f-4f74-a244-2de5147a4b2f\") " pod="openstack/nova-cell1-conductor-0" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.396233 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf680b24-e6dc-40a4-9ee4-521343fd9a28-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"bf680b24-e6dc-40a4-9ee4-521343fd9a28\") " pod="openstack/nova-metadata-0" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.396267 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58msc\" (UniqueName: \"kubernetes.io/projected/bf680b24-e6dc-40a4-9ee4-521343fd9a28-kube-api-access-58msc\") pod \"nova-metadata-0\" (UID: \"bf680b24-e6dc-40a4-9ee4-521343fd9a28\") " pod="openstack/nova-metadata-0" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.396285 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce62af15-166f-4f74-a244-2de5147a4b2f-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"ce62af15-166f-4f74-a244-2de5147a4b2f\") " pod="openstack/nova-cell1-conductor-0" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.396309 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce62af15-166f-4f74-a244-2de5147a4b2f-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"ce62af15-166f-4f74-a244-2de5147a4b2f\") " pod="openstack/nova-cell1-conductor-0" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.396421 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/bf680b24-e6dc-40a4-9ee4-521343fd9a28-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"bf680b24-e6dc-40a4-9ee4-521343fd9a28\") " pod="openstack/nova-metadata-0" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.397030 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf680b24-e6dc-40a4-9ee4-521343fd9a28-logs\") pod \"nova-metadata-0\" (UID: \"bf680b24-e6dc-40a4-9ee4-521343fd9a28\") " pod="openstack/nova-metadata-0" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.403221 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf680b24-e6dc-40a4-9ee4-521343fd9a28-config-data\") pod \"nova-metadata-0\" (UID: \"bf680b24-e6dc-40a4-9ee4-521343fd9a28\") " pod="openstack/nova-metadata-0" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.403429 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce62af15-166f-4f74-a244-2de5147a4b2f-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"ce62af15-166f-4f74-a244-2de5147a4b2f\") " pod="openstack/nova-cell1-conductor-0" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.404964 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf680b24-e6dc-40a4-9ee4-521343fd9a28-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"bf680b24-e6dc-40a4-9ee4-521343fd9a28\") " pod="openstack/nova-metadata-0" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.408274 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf680b24-e6dc-40a4-9ee4-521343fd9a28-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"bf680b24-e6dc-40a4-9ee4-521343fd9a28\") " pod="openstack/nova-metadata-0" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.421373 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-px4mp\" (UniqueName: \"kubernetes.io/projected/ce62af15-166f-4f74-a244-2de5147a4b2f-kube-api-access-px4mp\") pod \"nova-cell1-conductor-0\" (UID: \"ce62af15-166f-4f74-a244-2de5147a4b2f\") " pod="openstack/nova-cell1-conductor-0" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.423002 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce62af15-166f-4f74-a244-2de5147a4b2f-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"ce62af15-166f-4f74-a244-2de5147a4b2f\") " pod="openstack/nova-cell1-conductor-0" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.425519 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58msc\" (UniqueName: \"kubernetes.io/projected/bf680b24-e6dc-40a4-9ee4-521343fd9a28-kube-api-access-58msc\") pod \"nova-metadata-0\" (UID: \"bf680b24-e6dc-40a4-9ee4-521343fd9a28\") " pod="openstack/nova-metadata-0" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.490799 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 20 10:16:40 crc kubenswrapper[4962]: I0220 10:16:40.539017 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 20 10:16:41 crc kubenswrapper[4962]: W0220 10:16:41.020888 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbf680b24_e6dc_40a4_9ee4_521343fd9a28.slice/crio-121ed408ee75be32f31a1f4dc7577730e020e69ee605e01bdd02274a3aab2f53 WatchSource:0}: Error finding container 121ed408ee75be32f31a1f4dc7577730e020e69ee605e01bdd02274a3aab2f53: Status 404 returned error can't find the container with id 121ed408ee75be32f31a1f4dc7577730e020e69ee605e01bdd02274a3aab2f53 Feb 20 10:16:41 crc kubenswrapper[4962]: I0220 10:16:41.031888 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 10:16:41 crc kubenswrapper[4962]: I0220 10:16:41.055081 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bf680b24-e6dc-40a4-9ee4-521343fd9a28","Type":"ContainerStarted","Data":"121ed408ee75be32f31a1f4dc7577730e020e69ee605e01bdd02274a3aab2f53"} Feb 20 10:16:41 crc kubenswrapper[4962]: I0220 10:16:41.136981 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 20 10:16:41 crc kubenswrapper[4962]: I0220 10:16:41.201199 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd" path="/var/lib/kubelet/pods/d2aa0dbd-0022-4ee1-8bb9-81a20d6a4abd/volumes" Feb 20 10:16:41 crc kubenswrapper[4962]: I0220 10:16:41.508439 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 10:16:41 crc kubenswrapper[4962]: I0220 10:16:41.509120 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 10:16:42 crc kubenswrapper[4962]: I0220 10:16:42.068117 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bf680b24-e6dc-40a4-9ee4-521343fd9a28","Type":"ContainerStarted","Data":"a2a1cf05c6cde763cbf0e416c61a88274821349634d663e3dc31de4c5f75317e"} Feb 20 10:16:42 crc kubenswrapper[4962]: I0220 10:16:42.068667 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bf680b24-e6dc-40a4-9ee4-521343fd9a28","Type":"ContainerStarted","Data":"0fc17421016c0a51f07f032e01424e9a427398b88ed8a196a9b8eaf3af4e366e"} Feb 20 10:16:42 crc kubenswrapper[4962]: I0220 10:16:42.073223 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"ce62af15-166f-4f74-a244-2de5147a4b2f","Type":"ContainerStarted","Data":"2c8825e8a9845de45acba0c5ed58a1b7ada6575701e9497362444d09cc2e5592"} Feb 20 10:16:42 crc kubenswrapper[4962]: I0220 10:16:42.073286 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"ce62af15-166f-4f74-a244-2de5147a4b2f","Type":"ContainerStarted","Data":"4fa1fbefe8085f86ec2949fb3171b5df9f6211664e4db89dbc2b776f71f19d88"} Feb 20 10:16:42 crc kubenswrapper[4962]: I0220 10:16:42.074349 4962 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Feb 20 10:16:42 crc kubenswrapper[4962]: I0220 10:16:42.097816 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.097788778 podStartE2EDuration="2.097788778s" podCreationTimestamp="2026-02-20 10:16:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:16:42.090337599 +0000 UTC m=+1293.672809465" watchObservedRunningTime="2026-02-20 10:16:42.097788778 +0000 UTC m=+1293.680260634" Feb 20 10:16:42 crc kubenswrapper[4962]: I0220 10:16:42.119288 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.119254518 podStartE2EDuration="2.119254518s" podCreationTimestamp="2026-02-20 10:16:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:16:42.117229366 +0000 UTC m=+1293.699701212" watchObservedRunningTime="2026-02-20 10:16:42.119254518 +0000 UTC m=+1293.701726374" Feb 20 10:16:42 crc kubenswrapper[4962]: E0220 10:16:42.427408 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e3508ab1e6394d844c8e66595b9b39cdfd62065415f11f38c66de88511cb92c3" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 20 10:16:42 crc kubenswrapper[4962]: E0220 10:16:42.429901 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e3508ab1e6394d844c8e66595b9b39cdfd62065415f11f38c66de88511cb92c3" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 20 10:16:42 crc kubenswrapper[4962]: E0220 10:16:42.432084 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e3508ab1e6394d844c8e66595b9b39cdfd62065415f11f38c66de88511cb92c3" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 20 10:16:42 crc kubenswrapper[4962]: E0220 10:16:42.432151 4962 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea" containerName="nova-scheduler-scheduler" Feb 20 10:16:42 crc kubenswrapper[4962]: I0220 10:16:42.828546 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 20 10:16:42 crc kubenswrapper[4962]: I0220 10:16:42.829075 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="b7df7b95-a5ed-4e4e-81f0-9f718bab0bcc" containerName="kube-state-metrics" containerID="cri-o://fb6cfde9cbec99e03a3f009355d709416019b4ef6eb2150c6b7e98f530e8b57a" gracePeriod=30 Feb 20 10:16:43 crc kubenswrapper[4962]: I0220 10:16:43.097905 4962 generic.go:334] "Generic (PLEG): container finished" podID="b7df7b95-a5ed-4e4e-81f0-9f718bab0bcc" containerID="fb6cfde9cbec99e03a3f009355d709416019b4ef6eb2150c6b7e98f530e8b57a" exitCode=2 Feb 20 10:16:43 crc kubenswrapper[4962]: 
I0220 10:16:43.098050 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"b7df7b95-a5ed-4e4e-81f0-9f718bab0bcc","Type":"ContainerDied","Data":"fb6cfde9cbec99e03a3f009355d709416019b4ef6eb2150c6b7e98f530e8b57a"} Feb 20 10:16:43 crc kubenswrapper[4962]: I0220 10:16:43.458352 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 20 10:16:43 crc kubenswrapper[4962]: I0220 10:16:43.567431 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kqfr2\" (UniqueName: \"kubernetes.io/projected/b7df7b95-a5ed-4e4e-81f0-9f718bab0bcc-kube-api-access-kqfr2\") pod \"b7df7b95-a5ed-4e4e-81f0-9f718bab0bcc\" (UID: \"b7df7b95-a5ed-4e4e-81f0-9f718bab0bcc\") " Feb 20 10:16:43 crc kubenswrapper[4962]: I0220 10:16:43.576672 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7df7b95-a5ed-4e4e-81f0-9f718bab0bcc-kube-api-access-kqfr2" (OuterVolumeSpecName: "kube-api-access-kqfr2") pod "b7df7b95-a5ed-4e4e-81f0-9f718bab0bcc" (UID: "b7df7b95-a5ed-4e4e-81f0-9f718bab0bcc"). InnerVolumeSpecName "kube-api-access-kqfr2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:16:43 crc kubenswrapper[4962]: I0220 10:16:43.670493 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kqfr2\" (UniqueName: \"kubernetes.io/projected/b7df7b95-a5ed-4e4e-81f0-9f718bab0bcc-kube-api-access-kqfr2\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:44 crc kubenswrapper[4962]: I0220 10:16:44.113041 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"b7df7b95-a5ed-4e4e-81f0-9f718bab0bcc","Type":"ContainerDied","Data":"13a063091fc03c82fcdfd2aec7fd6dc34c81370ecc1a377f3c146895e371bf9d"} Feb 20 10:16:44 crc kubenswrapper[4962]: I0220 10:16:44.113538 4962 scope.go:117] "RemoveContainer" containerID="fb6cfde9cbec99e03a3f009355d709416019b4ef6eb2150c6b7e98f530e8b57a" Feb 20 10:16:44 crc kubenswrapper[4962]: I0220 10:16:44.113054 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 20 10:16:44 crc kubenswrapper[4962]: I0220 10:16:44.158216 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 20 10:16:44 crc kubenswrapper[4962]: I0220 10:16:44.181453 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 20 10:16:44 crc kubenswrapper[4962]: I0220 10:16:44.193440 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 20 10:16:44 crc kubenswrapper[4962]: E0220 10:16:44.194348 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7df7b95-a5ed-4e4e-81f0-9f718bab0bcc" containerName="kube-state-metrics" Feb 20 10:16:44 crc kubenswrapper[4962]: I0220 10:16:44.194387 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7df7b95-a5ed-4e4e-81f0-9f718bab0bcc" containerName="kube-state-metrics" Feb 20 10:16:44 crc kubenswrapper[4962]: I0220 10:16:44.194801 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7df7b95-a5ed-4e4e-81f0-9f718bab0bcc" containerName="kube-state-metrics" Feb 20 10:16:44 crc kubenswrapper[4962]: I0220 10:16:44.200429 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 20 10:16:44 crc kubenswrapper[4962]: I0220 10:16:44.206088 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Feb 20 10:16:44 crc kubenswrapper[4962]: I0220 10:16:44.213145 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Feb 20 10:16:44 crc kubenswrapper[4962]: I0220 10:16:44.250828 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 20 10:16:44 crc kubenswrapper[4962]: I0220 10:16:44.285743 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cffca43e-3e19-4430-8fe2-ca7cfe6229b0-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"cffca43e-3e19-4430-8fe2-ca7cfe6229b0\") " pod="openstack/kube-state-metrics-0" Feb 20 10:16:44 crc kubenswrapper[4962]: I0220 10:16:44.285874 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/cffca43e-3e19-4430-8fe2-ca7cfe6229b0-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"cffca43e-3e19-4430-8fe2-ca7cfe6229b0\") " pod="openstack/kube-state-metrics-0" Feb 20 10:16:44 crc kubenswrapper[4962]: I0220 10:16:44.285913 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7dpm\" (UniqueName: \"kubernetes.io/projected/cffca43e-3e19-4430-8fe2-ca7cfe6229b0-kube-api-access-s7dpm\") pod \"kube-state-metrics-0\" (UID: \"cffca43e-3e19-4430-8fe2-ca7cfe6229b0\") " pod="openstack/kube-state-metrics-0" Feb 20 10:16:44 crc kubenswrapper[4962]: I0220 10:16:44.286016 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/cffca43e-3e19-4430-8fe2-ca7cfe6229b0-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"cffca43e-3e19-4430-8fe2-ca7cfe6229b0\") " pod="openstack/kube-state-metrics-0" Feb 20 10:16:44 crc kubenswrapper[4962]: I0220 10:16:44.387396 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/cffca43e-3e19-4430-8fe2-ca7cfe6229b0-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"cffca43e-3e19-4430-8fe2-ca7cfe6229b0\") " pod="openstack/kube-state-metrics-0" Feb 20 10:16:44 crc kubenswrapper[4962]: I0220 10:16:44.387507 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cffca43e-3e19-4430-8fe2-ca7cfe6229b0-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"cffca43e-3e19-4430-8fe2-ca7cfe6229b0\") " pod="openstack/kube-state-metrics-0" Feb 20 10:16:44 crc kubenswrapper[4962]: I0220 10:16:44.387569 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/cffca43e-3e19-4430-8fe2-ca7cfe6229b0-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"cffca43e-3e19-4430-8fe2-ca7cfe6229b0\") " pod="openstack/kube-state-metrics-0" Feb 20 10:16:44 crc kubenswrapper[4962]: I0220 10:16:44.387629 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7dpm\" 
(UniqueName: \"kubernetes.io/projected/cffca43e-3e19-4430-8fe2-ca7cfe6229b0-kube-api-access-s7dpm\") pod \"kube-state-metrics-0\" (UID: \"cffca43e-3e19-4430-8fe2-ca7cfe6229b0\") " pod="openstack/kube-state-metrics-0" Feb 20 10:16:44 crc kubenswrapper[4962]: I0220 10:16:44.393105 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/cffca43e-3e19-4430-8fe2-ca7cfe6229b0-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"cffca43e-3e19-4430-8fe2-ca7cfe6229b0\") " pod="openstack/kube-state-metrics-0" Feb 20 10:16:44 crc kubenswrapper[4962]: I0220 10:16:44.394492 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/cffca43e-3e19-4430-8fe2-ca7cfe6229b0-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"cffca43e-3e19-4430-8fe2-ca7cfe6229b0\") " pod="openstack/kube-state-metrics-0" Feb 20 10:16:44 crc kubenswrapper[4962]: I0220 10:16:44.402960 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cffca43e-3e19-4430-8fe2-ca7cfe6229b0-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"cffca43e-3e19-4430-8fe2-ca7cfe6229b0\") " pod="openstack/kube-state-metrics-0" Feb 20 10:16:44 crc kubenswrapper[4962]: I0220 10:16:44.406794 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7dpm\" (UniqueName: \"kubernetes.io/projected/cffca43e-3e19-4430-8fe2-ca7cfe6229b0-kube-api-access-s7dpm\") pod \"kube-state-metrics-0\" (UID: \"cffca43e-3e19-4430-8fe2-ca7cfe6229b0\") " pod="openstack/kube-state-metrics-0" Feb 20 10:16:44 crc kubenswrapper[4962]: I0220 10:16:44.553877 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 20 10:16:44 crc kubenswrapper[4962]: I0220 10:16:44.993189 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 20 10:16:44 crc kubenswrapper[4962]: I0220 10:16:44.993885 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="457c772c-a7b8-40ea-8573-c483915687be" containerName="ceilometer-central-agent" containerID="cri-o://3a1c7b6152c78c256920cd6dc450ddbd613db243bb2b91fc78e1eaf94400c2d6" gracePeriod=30 Feb 20 10:16:44 crc kubenswrapper[4962]: I0220 10:16:44.994068 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="457c772c-a7b8-40ea-8573-c483915687be" containerName="proxy-httpd" containerID="cri-o://93b1b3489e3c8062e10a81096649fbd732605d1f78d868a69f146c92e37a74fa" gracePeriod=30 Feb 20 10:16:44 crc kubenswrapper[4962]: I0220 10:16:44.994122 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="457c772c-a7b8-40ea-8573-c483915687be" containerName="sg-core" containerID="cri-o://e237fa9e8086988f9e78412b6b078a289e696808959ab9d107880a825fcf14c5" gracePeriod=30 Feb 20 10:16:44 crc kubenswrapper[4962]: I0220 10:16:44.994157 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="457c772c-a7b8-40ea-8573-c483915687be" containerName="ceilometer-notification-agent" containerID="cri-o://97e11a6c5e404ce1f4a50771ae7056ddfe4362a2679a8f06158ae96d84b4a250" gracePeriod=30 Feb 20 10:16:45 crc kubenswrapper[4962]: I0220 10:16:45.069716 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 20 10:16:45 crc kubenswrapper[4962]: I0220 10:16:45.163040 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7df7b95-a5ed-4e4e-81f0-9f718bab0bcc" path="/var/lib/kubelet/pods/b7df7b95-a5ed-4e4e-81f0-9f718bab0bcc/volumes" Feb 20 10:16:45 crc kubenswrapper[4962]: I0220 10:16:45.166416 4962 generic.go:334] "Generic (PLEG): container finished" podID="5989ba7a-f1ca-4a25-a94a-3fea17f16eca" containerID="9a07dac43407609ca971ac0024f06768bd8740aa492e302e9d410517ce47da40" exitCode=0 Feb 20 10:16:45 crc kubenswrapper[4962]: I0220 10:16:45.166616 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5989ba7a-f1ca-4a25-a94a-3fea17f16eca","Type":"ContainerDied","Data":"9a07dac43407609ca971ac0024f06768bd8740aa492e302e9d410517ce47da40"} Feb 20 10:16:45 crc kubenswrapper[4962]: I0220 10:16:45.173446 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"cffca43e-3e19-4430-8fe2-ca7cfe6229b0","Type":"ContainerStarted","Data":"2851b19111bcc172daacd941571725296e0313b2b3496256066714262e7d3b9a"} Feb 20 10:16:45 crc kubenswrapper[4962]: I0220 10:16:45.174782 4962 generic.go:334] "Generic (PLEG): container finished" podID="f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea" containerID="e3508ab1e6394d844c8e66595b9b39cdfd62065415f11f38c66de88511cb92c3" exitCode=0 Feb 20 10:16:45 crc kubenswrapper[4962]: I0220 10:16:45.174814 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea","Type":"ContainerDied","Data":"e3508ab1e6394d844c8e66595b9b39cdfd62065415f11f38c66de88511cb92c3"} Feb 20 10:16:45 crc kubenswrapper[4962]: I0220 10:16:45.274881 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 20 10:16:45 crc kubenswrapper[4962]: I0220 10:16:45.312999 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 20 10:16:45 crc kubenswrapper[4962]: I0220 10:16:45.314315 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5989ba7a-f1ca-4a25-a94a-3fea17f16eca-logs\") pod \"5989ba7a-f1ca-4a25-a94a-3fea17f16eca\" (UID: \"5989ba7a-f1ca-4a25-a94a-3fea17f16eca\") " Feb 20 10:16:45 crc kubenswrapper[4962]: I0220 10:16:45.314380 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5989ba7a-f1ca-4a25-a94a-3fea17f16eca-config-data\") pod \"5989ba7a-f1ca-4a25-a94a-3fea17f16eca\" (UID: \"5989ba7a-f1ca-4a25-a94a-3fea17f16eca\") " Feb 20 10:16:45 crc kubenswrapper[4962]: I0220 10:16:45.314659 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckgt6\" (UniqueName: \"kubernetes.io/projected/5989ba7a-f1ca-4a25-a94a-3fea17f16eca-kube-api-access-ckgt6\") pod \"5989ba7a-f1ca-4a25-a94a-3fea17f16eca\" (UID: \"5989ba7a-f1ca-4a25-a94a-3fea17f16eca\") " Feb 20 10:16:45 crc kubenswrapper[4962]: I0220 10:16:45.314740 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5989ba7a-f1ca-4a25-a94a-3fea17f16eca-combined-ca-bundle\") pod \"5989ba7a-f1ca-4a25-a94a-3fea17f16eca\" (UID: \"5989ba7a-f1ca-4a25-a94a-3fea17f16eca\") " Feb 20 10:16:45 crc kubenswrapper[4962]: I0220 10:16:45.315074 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5989ba7a-f1ca-4a25-a94a-3fea17f16eca-logs" (OuterVolumeSpecName: "logs") pod "5989ba7a-f1ca-4a25-a94a-3fea17f16eca" (UID: "5989ba7a-f1ca-4a25-a94a-3fea17f16eca"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:16:45 crc kubenswrapper[4962]: I0220 10:16:45.315320 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5989ba7a-f1ca-4a25-a94a-3fea17f16eca-logs\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:45 crc kubenswrapper[4962]: I0220 10:16:45.330189 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5989ba7a-f1ca-4a25-a94a-3fea17f16eca-kube-api-access-ckgt6" (OuterVolumeSpecName: "kube-api-access-ckgt6") pod "5989ba7a-f1ca-4a25-a94a-3fea17f16eca" (UID: "5989ba7a-f1ca-4a25-a94a-3fea17f16eca"). InnerVolumeSpecName "kube-api-access-ckgt6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:16:45 crc kubenswrapper[4962]: I0220 10:16:45.373494 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5989ba7a-f1ca-4a25-a94a-3fea17f16eca-config-data" (OuterVolumeSpecName: "config-data") pod "5989ba7a-f1ca-4a25-a94a-3fea17f16eca" (UID: "5989ba7a-f1ca-4a25-a94a-3fea17f16eca"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:16:45 crc kubenswrapper[4962]: I0220 10:16:45.420654 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckgt6\" (UniqueName: \"kubernetes.io/projected/5989ba7a-f1ca-4a25-a94a-3fea17f16eca-kube-api-access-ckgt6\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:45 crc kubenswrapper[4962]: I0220 10:16:45.422399 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5989ba7a-f1ca-4a25-a94a-3fea17f16eca-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:45 crc kubenswrapper[4962]: I0220 10:16:45.454754 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5989ba7a-f1ca-4a25-a94a-3fea17f16eca-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5989ba7a-f1ca-4a25-a94a-3fea17f16eca" (UID: "5989ba7a-f1ca-4a25-a94a-3fea17f16eca"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:16:45 crc kubenswrapper[4962]: I0220 10:16:45.491677 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 20 10:16:45 crc kubenswrapper[4962]: I0220 10:16:45.493885 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 20 10:16:45 crc kubenswrapper[4962]: I0220 10:16:45.525261 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s26dw\" (UniqueName: \"kubernetes.io/projected/f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea-kube-api-access-s26dw\") pod \"f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea\" (UID: \"f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea\") " Feb 20 10:16:45 crc kubenswrapper[4962]: I0220 10:16:45.525658 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea-config-data\") pod \"f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea\" (UID: \"f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea\") " Feb 20 10:16:45 crc kubenswrapper[4962]: I0220 10:16:45.525753 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea-combined-ca-bundle\") pod \"f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea\" (UID: \"f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea\") " Feb 20 10:16:45 crc kubenswrapper[4962]: I0220 10:16:45.526179 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5989ba7a-f1ca-4a25-a94a-3fea17f16eca-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:45 crc kubenswrapper[4962]: I0220 10:16:45.528650 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea-kube-api-access-s26dw" (OuterVolumeSpecName: "kube-api-access-s26dw") pod "f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea" (UID: "f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea"). InnerVolumeSpecName "kube-api-access-s26dw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:16:45 crc kubenswrapper[4962]: E0220 10:16:45.553571 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea-combined-ca-bundle podName:f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea nodeName:}" failed. 
No retries permitted until 2026-02-20 10:16:46.053537966 +0000 UTC m=+1297.636009812 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea-combined-ca-bundle") pod "f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea" (UID: "f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea") : error deleting /var/lib/kubelet/pods/f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea/volume-subpaths: remove /var/lib/kubelet/pods/f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea/volume-subpaths: no such file or directory Feb 20 10:16:45 crc kubenswrapper[4962]: I0220 10:16:45.556186 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea-config-data" (OuterVolumeSpecName: "config-data") pod "f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea" (UID: "f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:16:45 crc kubenswrapper[4962]: I0220 10:16:45.628725 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:45 crc kubenswrapper[4962]: I0220 10:16:45.628783 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s26dw\" (UniqueName: \"kubernetes.io/projected/f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea-kube-api-access-s26dw\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.143606 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea-combined-ca-bundle\") pod \"f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea\" (UID: \"f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea\") " Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.149126 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea" (UID: "f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.189625 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5989ba7a-f1ca-4a25-a94a-3fea17f16eca","Type":"ContainerDied","Data":"a7e2d55de2a0689981aefd86aae03373f74c3435dacca5e0c076361b42cb5da4"} Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.189695 4962 scope.go:117] "RemoveContainer" containerID="9a07dac43407609ca971ac0024f06768bd8740aa492e302e9d410517ce47da40" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.189821 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.198846 4962 generic.go:334] "Generic (PLEG): container finished" podID="457c772c-a7b8-40ea-8573-c483915687be" containerID="93b1b3489e3c8062e10a81096649fbd732605d1f78d868a69f146c92e37a74fa" exitCode=0 Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.198882 4962 generic.go:334] "Generic (PLEG): container finished" podID="457c772c-a7b8-40ea-8573-c483915687be" containerID="e237fa9e8086988f9e78412b6b078a289e696808959ab9d107880a825fcf14c5" exitCode=2 Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.198893 4962 generic.go:334] "Generic (PLEG): container finished" podID="457c772c-a7b8-40ea-8573-c483915687be" containerID="97e11a6c5e404ce1f4a50771ae7056ddfe4362a2679a8f06158ae96d84b4a250" exitCode=0 Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.198901 4962 generic.go:334] "Generic (PLEG): container finished" podID="457c772c-a7b8-40ea-8573-c483915687be" containerID="3a1c7b6152c78c256920cd6dc450ddbd613db243bb2b91fc78e1eaf94400c2d6" exitCode=0 Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.199002 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"457c772c-a7b8-40ea-8573-c483915687be","Type":"ContainerDied","Data":"93b1b3489e3c8062e10a81096649fbd732605d1f78d868a69f146c92e37a74fa"} Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.199107 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"457c772c-a7b8-40ea-8573-c483915687be","Type":"ContainerDied","Data":"e237fa9e8086988f9e78412b6b078a289e696808959ab9d107880a825fcf14c5"} Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.199124 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"457c772c-a7b8-40ea-8573-c483915687be","Type":"ContainerDied","Data":"97e11a6c5e404ce1f4a50771ae7056ddfe4362a2679a8f06158ae96d84b4a250"} Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.199157 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"457c772c-a7b8-40ea-8573-c483915687be","Type":"ContainerDied","Data":"3a1c7b6152c78c256920cd6dc450ddbd613db243bb2b91fc78e1eaf94400c2d6"} Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.201864 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.202732 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"cffca43e-3e19-4430-8fe2-ca7cfe6229b0","Type":"ContainerStarted","Data":"490c8746de0bc6e3f4ef0520b2658d4424532e972e69bd55a421dfcd9ed32cf4"} Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.203553 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.206794 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea","Type":"ContainerDied","Data":"503b6078fa852a67bd1c7bbebbc0925a0a08be36053fb3fee5407b0136117e50"} Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.206864 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.222130 4962 scope.go:117] "RemoveContainer" containerID="7c4df354c2a2dd24abe490f4720cd2638f9d36b40005018cdb03e89ec0d07cff" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.246710 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/457c772c-a7b8-40ea-8573-c483915687be-log-httpd\") pod \"457c772c-a7b8-40ea-8573-c483915687be\" (UID: \"457c772c-a7b8-40ea-8573-c483915687be\") " Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.246823 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/457c772c-a7b8-40ea-8573-c483915687be-scripts\") pod \"457c772c-a7b8-40ea-8573-c483915687be\" (UID: \"457c772c-a7b8-40ea-8573-c483915687be\") " Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.246911 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/457c772c-a7b8-40ea-8573-c483915687be-combined-ca-bundle\") pod \"457c772c-a7b8-40ea-8573-c483915687be\" (UID: \"457c772c-a7b8-40ea-8573-c483915687be\") " Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.247012 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/457c772c-a7b8-40ea-8573-c483915687be-config-data\") pod \"457c772c-a7b8-40ea-8573-c483915687be\" (UID: \"457c772c-a7b8-40ea-8573-c483915687be\") " Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.247035 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/457c772c-a7b8-40ea-8573-c483915687be-run-httpd\") pod \"457c772c-a7b8-40ea-8573-c483915687be\" (UID: \"457c772c-a7b8-40ea-8573-c483915687be\") " Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.247076 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pg46p\" (UniqueName: \"kubernetes.io/projected/457c772c-a7b8-40ea-8573-c483915687be-kube-api-access-pg46p\") pod \"457c772c-a7b8-40ea-8573-c483915687be\" (UID: \"457c772c-a7b8-40ea-8573-c483915687be\") " Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.247231 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/457c772c-a7b8-40ea-8573-c483915687be-sg-core-conf-yaml\") pod \"457c772c-a7b8-40ea-8573-c483915687be\" (UID: \"457c772c-a7b8-40ea-8573-c483915687be\") " Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.247953 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.249556 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/457c772c-a7b8-40ea-8573-c483915687be-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "457c772c-a7b8-40ea-8573-c483915687be" (UID: "457c772c-a7b8-40ea-8573-c483915687be"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.250694 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/457c772c-a7b8-40ea-8573-c483915687be-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "457c772c-a7b8-40ea-8573-c483915687be" (UID: "457c772c-a7b8-40ea-8573-c483915687be"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.261845 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/457c772c-a7b8-40ea-8573-c483915687be-scripts" (OuterVolumeSpecName: "scripts") pod "457c772c-a7b8-40ea-8573-c483915687be" (UID: "457c772c-a7b8-40ea-8573-c483915687be"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.274584 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.287881 4962 scope.go:117] "RemoveContainer" containerID="e3508ab1e6394d844c8e66595b9b39cdfd62065415f11f38c66de88511cb92c3" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.287918 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/457c772c-a7b8-40ea-8573-c483915687be-kube-api-access-pg46p" (OuterVolumeSpecName: "kube-api-access-pg46p") pod "457c772c-a7b8-40ea-8573-c483915687be" (UID: "457c772c-a7b8-40ea-8573-c483915687be"). InnerVolumeSpecName "kube-api-access-pg46p". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.302370 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.317760 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 20 10:16:46 crc kubenswrapper[4962]: E0220 10:16:46.318416 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea" containerName="nova-scheduler-scheduler" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.318436 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea" containerName="nova-scheduler-scheduler" Feb 20 10:16:46 crc kubenswrapper[4962]: E0220 10:16:46.318448 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="457c772c-a7b8-40ea-8573-c483915687be" containerName="proxy-httpd" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.318457 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="457c772c-a7b8-40ea-8573-c483915687be" containerName="proxy-httpd" Feb 20 10:16:46 crc kubenswrapper[4962]: E0220 10:16:46.318486 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="457c772c-a7b8-40ea-8573-c483915687be" containerName="sg-core" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.318498 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="457c772c-a7b8-40ea-8573-c483915687be" containerName="sg-core" Feb 20 10:16:46 crc kubenswrapper[4962]: E0220 10:16:46.318513 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5989ba7a-f1ca-4a25-a94a-3fea17f16eca" containerName="nova-api-log" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.318522 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="5989ba7a-f1ca-4a25-a94a-3fea17f16eca" containerName="nova-api-log" Feb 
20 10:16:46 crc kubenswrapper[4962]: E0220 10:16:46.318556 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5989ba7a-f1ca-4a25-a94a-3fea17f16eca" containerName="nova-api-api" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.318564 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="5989ba7a-f1ca-4a25-a94a-3fea17f16eca" containerName="nova-api-api" Feb 20 10:16:46 crc kubenswrapper[4962]: E0220 10:16:46.318579 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="457c772c-a7b8-40ea-8573-c483915687be" containerName="ceilometer-central-agent" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.318604 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="457c772c-a7b8-40ea-8573-c483915687be" containerName="ceilometer-central-agent" Feb 20 10:16:46 crc kubenswrapper[4962]: E0220 10:16:46.318623 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="457c772c-a7b8-40ea-8573-c483915687be" containerName="ceilometer-notification-agent" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.318634 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="457c772c-a7b8-40ea-8573-c483915687be" containerName="ceilometer-notification-agent" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.318919 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="457c772c-a7b8-40ea-8573-c483915687be" containerName="proxy-httpd" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.318939 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea" containerName="nova-scheduler-scheduler" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.318955 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="457c772c-a7b8-40ea-8573-c483915687be" containerName="sg-core" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.319005 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="457c772c-a7b8-40ea-8573-c483915687be" containerName="ceilometer-notification-agent" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.319023 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="457c772c-a7b8-40ea-8573-c483915687be" containerName="ceilometer-central-agent" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.319036 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="5989ba7a-f1ca-4a25-a94a-3fea17f16eca" containerName="nova-api-log" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.319047 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="5989ba7a-f1ca-4a25-a94a-3fea17f16eca" containerName="nova-api-api" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.320926 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.322831 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.334890 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/457c772c-a7b8-40ea-8573-c483915687be-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "457c772c-a7b8-40ea-8573-c483915687be" (UID: "457c772c-a7b8-40ea-8573-c483915687be"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.337054 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.692357641 podStartE2EDuration="2.337030953s" podCreationTimestamp="2026-02-20 10:16:44 +0000 UTC" firstStartedPulling="2026-02-20 10:16:45.077786785 +0000 UTC m=+1296.660258631" lastFinishedPulling="2026-02-20 10:16:45.722460077 +0000 UTC m=+1297.304931943" observedRunningTime="2026-02-20 10:16:46.288392575 +0000 UTC m=+1297.870864421" watchObservedRunningTime="2026-02-20 10:16:46.337030953 +0000 UTC m=+1297.919502799" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.351432 4962 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/457c772c-a7b8-40ea-8573-c483915687be-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.351469 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pg46p\" (UniqueName: \"kubernetes.io/projected/457c772c-a7b8-40ea-8573-c483915687be-kube-api-access-pg46p\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.351481 4962 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/457c772c-a7b8-40ea-8573-c483915687be-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.351489 4962 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/457c772c-a7b8-40ea-8573-c483915687be-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.351498 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/457c772c-a7b8-40ea-8573-c483915687be-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.374349 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.383552 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.397424 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.413851 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/457c772c-a7b8-40ea-8573-c483915687be-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "457c772c-a7b8-40ea-8573-c483915687be" (UID: "457c772c-a7b8-40ea-8573-c483915687be"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.418717 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.422332 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.425550 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.453624 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebe24b8d-7968-4806-a924-d932f167185f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ebe24b8d-7968-4806-a924-d932f167185f\") " pod="openstack/nova-api-0" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.453674 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ebe24b8d-7968-4806-a924-d932f167185f-logs\") pod \"nova-api-0\" (UID: \"ebe24b8d-7968-4806-a924-d932f167185f\") " pod="openstack/nova-api-0" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.453814 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzrbw\" (UniqueName: \"kubernetes.io/projected/ebe24b8d-7968-4806-a924-d932f167185f-kube-api-access-nzrbw\") pod \"nova-api-0\" (UID: \"ebe24b8d-7968-4806-a924-d932f167185f\") " pod="openstack/nova-api-0" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.453840 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebe24b8d-7968-4806-a924-d932f167185f-config-data\") pod \"nova-api-0\" (UID: \"ebe24b8d-7968-4806-a924-d932f167185f\") " pod="openstack/nova-api-0" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.453884 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/457c772c-a7b8-40ea-8573-c483915687be-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.454880 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/457c772c-a7b8-40ea-8573-c483915687be-config-data" (OuterVolumeSpecName: "config-data") pod "457c772c-a7b8-40ea-8573-c483915687be" (UID: "457c772c-a7b8-40ea-8573-c483915687be"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.460567 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.556733 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzrbw\" (UniqueName: \"kubernetes.io/projected/ebe24b8d-7968-4806-a924-d932f167185f-kube-api-access-nzrbw\") pod \"nova-api-0\" (UID: \"ebe24b8d-7968-4806-a924-d932f167185f\") " pod="openstack/nova-api-0" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.556807 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebe24b8d-7968-4806-a924-d932f167185f-config-data\") pod \"nova-api-0\" (UID: \"ebe24b8d-7968-4806-a924-d932f167185f\") " pod="openstack/nova-api-0" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.556870 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2xsh\" (UniqueName: \"kubernetes.io/projected/2a762e59-b6ef-4cdd-81f5-7f49dd78f810-kube-api-access-v2xsh\") pod \"nova-scheduler-0\" (UID: \"2a762e59-b6ef-4cdd-81f5-7f49dd78f810\") " pod="openstack/nova-scheduler-0" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.556924 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebe24b8d-7968-4806-a924-d932f167185f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ebe24b8d-7968-4806-a924-d932f167185f\") " pod="openstack/nova-api-0" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.556946 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ebe24b8d-7968-4806-a924-d932f167185f-logs\") pod \"nova-api-0\" (UID: \"ebe24b8d-7968-4806-a924-d932f167185f\") " pod="openstack/nova-api-0" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.556965 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a762e59-b6ef-4cdd-81f5-7f49dd78f810-config-data\") pod \"nova-scheduler-0\" (UID: \"2a762e59-b6ef-4cdd-81f5-7f49dd78f810\") " pod="openstack/nova-scheduler-0" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.557042 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a762e59-b6ef-4cdd-81f5-7f49dd78f810-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2a762e59-b6ef-4cdd-81f5-7f49dd78f810\") " pod="openstack/nova-scheduler-0" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.557148 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/457c772c-a7b8-40ea-8573-c483915687be-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.557691 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ebe24b8d-7968-4806-a924-d932f167185f-logs\") pod \"nova-api-0\" (UID: \"ebe24b8d-7968-4806-a924-d932f167185f\") " pod="openstack/nova-api-0" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.563891 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ebe24b8d-7968-4806-a924-d932f167185f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ebe24b8d-7968-4806-a924-d932f167185f\") " pod="openstack/nova-api-0" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.565279 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebe24b8d-7968-4806-a924-d932f167185f-config-data\") pod \"nova-api-0\" (UID: \"ebe24b8d-7968-4806-a924-d932f167185f\") " pod="openstack/nova-api-0" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.573363 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzrbw\" (UniqueName: \"kubernetes.io/projected/ebe24b8d-7968-4806-a924-d932f167185f-kube-api-access-nzrbw\") pod \"nova-api-0\" (UID: \"ebe24b8d-7968-4806-a924-d932f167185f\") " pod="openstack/nova-api-0" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.659074 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2xsh\" (UniqueName: \"kubernetes.io/projected/2a762e59-b6ef-4cdd-81f5-7f49dd78f810-kube-api-access-v2xsh\") pod \"nova-scheduler-0\" (UID: \"2a762e59-b6ef-4cdd-81f5-7f49dd78f810\") " pod="openstack/nova-scheduler-0" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.659172 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a762e59-b6ef-4cdd-81f5-7f49dd78f810-config-data\") pod \"nova-scheduler-0\" (UID: \"2a762e59-b6ef-4cdd-81f5-7f49dd78f810\") " pod="openstack/nova-scheduler-0" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.659240 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a762e59-b6ef-4cdd-81f5-7f49dd78f810-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2a762e59-b6ef-4cdd-81f5-7f49dd78f810\") " pod="openstack/nova-scheduler-0" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.665399 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a762e59-b6ef-4cdd-81f5-7f49dd78f810-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2a762e59-b6ef-4cdd-81f5-7f49dd78f810\") " pod="openstack/nova-scheduler-0" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.668904 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a762e59-b6ef-4cdd-81f5-7f49dd78f810-config-data\") pod \"nova-scheduler-0\" (UID: \"2a762e59-b6ef-4cdd-81f5-7f49dd78f810\") " pod="openstack/nova-scheduler-0" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.675800 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.677257 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2xsh\" (UniqueName: \"kubernetes.io/projected/2a762e59-b6ef-4cdd-81f5-7f49dd78f810-kube-api-access-v2xsh\") pod \"nova-scheduler-0\" (UID: \"2a762e59-b6ef-4cdd-81f5-7f49dd78f810\") " pod="openstack/nova-scheduler-0" Feb 20 10:16:46 crc kubenswrapper[4962]: I0220 10:16:46.812022 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 20 10:16:47 crc kubenswrapper[4962]: I0220 10:16:47.157265 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5989ba7a-f1ca-4a25-a94a-3fea17f16eca" path="/var/lib/kubelet/pods/5989ba7a-f1ca-4a25-a94a-3fea17f16eca/volumes" Feb 20 10:16:47 crc kubenswrapper[4962]: I0220 10:16:47.157987 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea" path="/var/lib/kubelet/pods/f8c4cdc9-4aeb-4249-9c40-2f9c6bcb4cea/volumes" Feb 20 10:16:47 crc kubenswrapper[4962]: I0220 10:16:47.158744 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 10:16:47 crc kubenswrapper[4962]: W0220 10:16:47.165285 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2a762e59_b6ef_4cdd_81f5_7f49dd78f810.slice/crio-483d238cd5d776f09a0edf06552294d8201c2f1a4a094bae70c03d01ceee2bb1 WatchSource:0}: Error finding container 483d238cd5d776f09a0edf06552294d8201c2f1a4a094bae70c03d01ceee2bb1: Status 404 returned error can't find the container with id 483d238cd5d776f09a0edf06552294d8201c2f1a4a094bae70c03d01ceee2bb1 Feb 20 10:16:47 crc kubenswrapper[4962]: W0220 10:16:47.196821 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podebe24b8d_7968_4806_a924_d932f167185f.slice/crio-e64c9159ecdad906d9c9962019b5392b9e58a7c0e3d51ef95e7459a8b2298a1e WatchSource:0}: Error finding container e64c9159ecdad906d9c9962019b5392b9e58a7c0e3d51ef95e7459a8b2298a1e: Status 404 returned error can't find the container with id e64c9159ecdad906d9c9962019b5392b9e58a7c0e3d51ef95e7459a8b2298a1e Feb 20 10:16:47 crc kubenswrapper[4962]: I0220 10:16:47.202464 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 20 10:16:47 crc kubenswrapper[4962]: I0220 10:16:47.220787 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"457c772c-a7b8-40ea-8573-c483915687be","Type":"ContainerDied","Data":"beaabcfa9360010688b1a11e6bc4e4b1e737fa90ac38511bb0966d375d500981"} Feb 20 10:16:47 crc kubenswrapper[4962]: I0220 10:16:47.220814 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 20 10:16:47 crc kubenswrapper[4962]: I0220 10:16:47.221158 4962 scope.go:117] "RemoveContainer" containerID="93b1b3489e3c8062e10a81096649fbd732605d1f78d868a69f146c92e37a74fa" Feb 20 10:16:47 crc kubenswrapper[4962]: I0220 10:16:47.223794 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2a762e59-b6ef-4cdd-81f5-7f49dd78f810","Type":"ContainerStarted","Data":"483d238cd5d776f09a0edf06552294d8201c2f1a4a094bae70c03d01ceee2bb1"} Feb 20 10:16:47 crc kubenswrapper[4962]: I0220 10:16:47.228005 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ebe24b8d-7968-4806-a924-d932f167185f","Type":"ContainerStarted","Data":"e64c9159ecdad906d9c9962019b5392b9e58a7c0e3d51ef95e7459a8b2298a1e"} Feb 20 10:16:47 crc kubenswrapper[4962]: I0220 10:16:47.246206 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 20 10:16:47 crc kubenswrapper[4962]: I0220 10:16:47.255894 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 20 10:16:47 crc kubenswrapper[4962]: I0220 10:16:47.259107 4962 scope.go:117] "RemoveContainer" containerID="e237fa9e8086988f9e78412b6b078a289e696808959ab9d107880a825fcf14c5" Feb 20 10:16:47 crc kubenswrapper[4962]: I0220 10:16:47.279758 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 20 10:16:47 crc kubenswrapper[4962]: I0220 10:16:47.282219 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 20 10:16:47 crc kubenswrapper[4962]: I0220 10:16:47.284830 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 20 10:16:47 crc kubenswrapper[4962]: I0220 10:16:47.289752 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 20 10:16:47 crc kubenswrapper[4962]: I0220 10:16:47.290671 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 20 10:16:47 crc kubenswrapper[4962]: I0220 10:16:47.309348 4962 scope.go:117] "RemoveContainer" containerID="97e11a6c5e404ce1f4a50771ae7056ddfe4362a2679a8f06158ae96d84b4a250" Feb 20 10:16:47 crc kubenswrapper[4962]: I0220 10:16:47.309351 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 20 10:16:47 crc kubenswrapper[4962]: I0220 10:16:47.376183 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c35c04f-5ec6-44c4-99d5-38a896dcae17-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9c35c04f-5ec6-44c4-99d5-38a896dcae17\") " pod="openstack/ceilometer-0" Feb 20 10:16:47 crc kubenswrapper[4962]: I0220 10:16:47.376240 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmccz\" (UniqueName: \"kubernetes.io/projected/9c35c04f-5ec6-44c4-99d5-38a896dcae17-kube-api-access-nmccz\") pod \"ceilometer-0\" (UID: \"9c35c04f-5ec6-44c4-99d5-38a896dcae17\") " pod="openstack/ceilometer-0" Feb 20 10:16:47 crc kubenswrapper[4962]: I0220 10:16:47.376274 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9c35c04f-5ec6-44c4-99d5-38a896dcae17-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"9c35c04f-5ec6-44c4-99d5-38a896dcae17\") " pod="openstack/ceilometer-0" Feb 20 10:16:47 crc kubenswrapper[4962]: I0220 10:16:47.376350 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c35c04f-5ec6-44c4-99d5-38a896dcae17-scripts\") pod \"ceilometer-0\" (UID: \"9c35c04f-5ec6-44c4-99d5-38a896dcae17\") " pod="openstack/ceilometer-0" Feb 20 10:16:47 crc kubenswrapper[4962]: I0220 10:16:47.376402 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9c35c04f-5ec6-44c4-99d5-38a896dcae17-run-httpd\") pod \"ceilometer-0\" (UID: \"9c35c04f-5ec6-44c4-99d5-38a896dcae17\") " pod="openstack/ceilometer-0" Feb 20 10:16:47 crc kubenswrapper[4962]: I0220 10:16:47.376441 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c35c04f-5ec6-44c4-99d5-38a896dcae17-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"9c35c04f-5ec6-44c4-99d5-38a896dcae17\") " pod="openstack/ceilometer-0" Feb 20 10:16:47 crc kubenswrapper[4962]: I0220 10:16:47.376463 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9c35c04f-5ec6-44c4-99d5-38a896dcae17-log-httpd\") pod \"ceilometer-0\" (UID: \"9c35c04f-5ec6-44c4-99d5-38a896dcae17\") " pod="openstack/ceilometer-0" Feb 20 10:16:47 crc kubenswrapper[4962]: I0220 10:16:47.376504 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c35c04f-5ec6-44c4-99d5-38a896dcae17-config-data\") pod \"ceilometer-0\" (UID: \"9c35c04f-5ec6-44c4-99d5-38a896dcae17\") " pod="openstack/ceilometer-0" Feb 20 10:16:47 crc kubenswrapper[4962]: I0220 10:16:47.377976 4962 scope.go:117] "RemoveContainer" containerID="3a1c7b6152c78c256920cd6dc450ddbd613db243bb2b91fc78e1eaf94400c2d6" Feb 20 10:16:47 crc kubenswrapper[4962]: I0220 10:16:47.477999 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9c35c04f-5ec6-44c4-99d5-38a896dcae17-run-httpd\") pod \"ceilometer-0\" (UID: \"9c35c04f-5ec6-44c4-99d5-38a896dcae17\") " pod="openstack/ceilometer-0" Feb 20 10:16:47 crc kubenswrapper[4962]: I0220 10:16:47.478074 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c35c04f-5ec6-44c4-99d5-38a896dcae17-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"9c35c04f-5ec6-44c4-99d5-38a896dcae17\") " pod="openstack/ceilometer-0" Feb 20 10:16:47 crc kubenswrapper[4962]: I0220 10:16:47.478109 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9c35c04f-5ec6-44c4-99d5-38a896dcae17-log-httpd\") pod \"ceilometer-0\" (UID: \"9c35c04f-5ec6-44c4-99d5-38a896dcae17\") " pod="openstack/ceilometer-0" Feb 20 10:16:47 crc kubenswrapper[4962]: I0220 10:16:47.478135 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c35c04f-5ec6-44c4-99d5-38a896dcae17-config-data\") pod \"ceilometer-0\" (UID: \"9c35c04f-5ec6-44c4-99d5-38a896dcae17\") " pod="openstack/ceilometer-0" Feb 20 10:16:47 crc kubenswrapper[4962]: I0220 
10:16:47.478167 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c35c04f-5ec6-44c4-99d5-38a896dcae17-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9c35c04f-5ec6-44c4-99d5-38a896dcae17\") " pod="openstack/ceilometer-0" Feb 20 10:16:47 crc kubenswrapper[4962]: I0220 10:16:47.478197 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmccz\" (UniqueName: \"kubernetes.io/projected/9c35c04f-5ec6-44c4-99d5-38a896dcae17-kube-api-access-nmccz\") pod \"ceilometer-0\" (UID: \"9c35c04f-5ec6-44c4-99d5-38a896dcae17\") " pod="openstack/ceilometer-0" Feb 20 10:16:47 crc kubenswrapper[4962]: I0220 10:16:47.478224 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9c35c04f-5ec6-44c4-99d5-38a896dcae17-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9c35c04f-5ec6-44c4-99d5-38a896dcae17\") " pod="openstack/ceilometer-0" Feb 20 10:16:47 crc kubenswrapper[4962]: I0220 10:16:47.478292 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c35c04f-5ec6-44c4-99d5-38a896dcae17-scripts\") pod \"ceilometer-0\" (UID: \"9c35c04f-5ec6-44c4-99d5-38a896dcae17\") " pod="openstack/ceilometer-0" Feb 20 10:16:47 crc kubenswrapper[4962]: I0220 10:16:47.478493 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9c35c04f-5ec6-44c4-99d5-38a896dcae17-run-httpd\") pod \"ceilometer-0\" (UID: \"9c35c04f-5ec6-44c4-99d5-38a896dcae17\") " pod="openstack/ceilometer-0" Feb 20 10:16:47 crc kubenswrapper[4962]: I0220 10:16:47.478715 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9c35c04f-5ec6-44c4-99d5-38a896dcae17-log-httpd\") pod \"ceilometer-0\" (UID: \"9c35c04f-5ec6-44c4-99d5-38a896dcae17\") " pod="openstack/ceilometer-0" Feb 20 10:16:47 crc kubenswrapper[4962]: I0220 10:16:47.482042 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c35c04f-5ec6-44c4-99d5-38a896dcae17-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"9c35c04f-5ec6-44c4-99d5-38a896dcae17\") " pod="openstack/ceilometer-0" Feb 20 10:16:47 crc kubenswrapper[4962]: I0220 10:16:47.483663 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c35c04f-5ec6-44c4-99d5-38a896dcae17-scripts\") pod \"ceilometer-0\" (UID: \"9c35c04f-5ec6-44c4-99d5-38a896dcae17\") " pod="openstack/ceilometer-0" Feb 20 10:16:47 crc kubenswrapper[4962]: I0220 10:16:47.483962 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c35c04f-5ec6-44c4-99d5-38a896dcae17-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9c35c04f-5ec6-44c4-99d5-38a896dcae17\") " pod="openstack/ceilometer-0" Feb 20 10:16:47 crc kubenswrapper[4962]: I0220 10:16:47.493928 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c35c04f-5ec6-44c4-99d5-38a896dcae17-config-data\") pod \"ceilometer-0\" (UID: \"9c35c04f-5ec6-44c4-99d5-38a896dcae17\") " pod="openstack/ceilometer-0" Feb 20 10:16:47 crc kubenswrapper[4962]: I0220 10:16:47.499170 4962 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9c35c04f-5ec6-44c4-99d5-38a896dcae17-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9c35c04f-5ec6-44c4-99d5-38a896dcae17\") " pod="openstack/ceilometer-0" Feb 20 10:16:47 crc kubenswrapper[4962]: I0220 10:16:47.503201 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmccz\" (UniqueName: \"kubernetes.io/projected/9c35c04f-5ec6-44c4-99d5-38a896dcae17-kube-api-access-nmccz\") pod \"ceilometer-0\" (UID: \"9c35c04f-5ec6-44c4-99d5-38a896dcae17\") " pod="openstack/ceilometer-0" Feb 20 10:16:47 crc kubenswrapper[4962]: I0220 10:16:47.628139 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 20 10:16:48 crc kubenswrapper[4962]: I0220 10:16:48.171851 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 20 10:16:48 crc kubenswrapper[4962]: W0220 10:16:48.188945 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9c35c04f_5ec6_44c4_99d5_38a896dcae17.slice/crio-9fccdbfe279f8ca88785ee163c57e992e8a04cb5c4252ba831418d3d1e9460d1 WatchSource:0}: Error finding container 9fccdbfe279f8ca88785ee163c57e992e8a04cb5c4252ba831418d3d1e9460d1: Status 404 returned error can't find the container with id 9fccdbfe279f8ca88785ee163c57e992e8a04cb5c4252ba831418d3d1e9460d1 Feb 20 10:16:48 crc kubenswrapper[4962]: I0220 10:16:48.253440 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9c35c04f-5ec6-44c4-99d5-38a896dcae17","Type":"ContainerStarted","Data":"9fccdbfe279f8ca88785ee163c57e992e8a04cb5c4252ba831418d3d1e9460d1"} Feb 20 10:16:48 crc kubenswrapper[4962]: I0220 10:16:48.279962 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2a762e59-b6ef-4cdd-81f5-7f49dd78f810","Type":"ContainerStarted","Data":"8a4e849a67446e74bb7d851638980bf9f47f764932e6e5dbd43efc378ccb51b6"} Feb 20 10:16:48 crc kubenswrapper[4962]: I0220 10:16:48.301542 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ebe24b8d-7968-4806-a924-d932f167185f","Type":"ContainerStarted","Data":"a4781260e567499e59c61c3395733568454a97751d695160f009970b7852f29d"} Feb 20 10:16:48 crc kubenswrapper[4962]: I0220 10:16:48.301611 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ebe24b8d-7968-4806-a924-d932f167185f","Type":"ContainerStarted","Data":"16ac13db628828db750794374323308f29b229d310a915dc23b1545983fc0046"} Feb 20 10:16:48 crc kubenswrapper[4962]: I0220 10:16:48.307899 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.307879074 podStartE2EDuration="2.307879074s" podCreationTimestamp="2026-02-20 10:16:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:16:48.305969434 +0000 UTC m=+1299.888441280" watchObservedRunningTime="2026-02-20 10:16:48.307879074 +0000 UTC m=+1299.890350920" Feb 20 10:16:48 crc kubenswrapper[4962]: I0220 10:16:48.331984 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.331957955 podStartE2EDuration="2.331957955s" podCreationTimestamp="2026-02-20 10:16:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:16:48.327076335 +0000 UTC m=+1299.909548181" watchObservedRunningTime="2026-02-20 10:16:48.331957955 +0000 UTC m=+1299.914429801" Feb 20 10:16:49 crc kubenswrapper[4962]: I0220 10:16:49.157108 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="457c772c-a7b8-40ea-8573-c483915687be" path="/var/lib/kubelet/pods/457c772c-a7b8-40ea-8573-c483915687be/volumes" Feb 20 10:16:49 crc kubenswrapper[4962]: I0220 10:16:49.327361 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9c35c04f-5ec6-44c4-99d5-38a896dcae17","Type":"ContainerStarted","Data":"99c6b0cf82c82c0b8a8f1fc52a81d6bcfa256ef193c37dfdf2b76832d2104fcc"} Feb 20 10:16:50 crc kubenswrapper[4962]: I0220 10:16:50.343961 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9c35c04f-5ec6-44c4-99d5-38a896dcae17","Type":"ContainerStarted","Data":"22e835daa3a58e115e60f7fd625eca37d4503a19c9fbbde6d0ac69f24df79613"} Feb 20 10:16:50 crc kubenswrapper[4962]: I0220 10:16:50.491992 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 20 10:16:50 crc kubenswrapper[4962]: I0220 10:16:50.492094 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 20 10:16:50 crc kubenswrapper[4962]: I0220 10:16:50.585013 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Feb 20 10:16:51 crc kubenswrapper[4962]: I0220 10:16:51.368559 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9c35c04f-5ec6-44c4-99d5-38a896dcae17","Type":"ContainerStarted","Data":"0b888a786161fde438d48858a9d1b62d5a26425c6466484506b8de22270a664a"} Feb 20 10:16:51 crc kubenswrapper[4962]: I0220 10:16:51.508961 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="bf680b24-e6dc-40a4-9ee4-521343fd9a28" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.192:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 20 10:16:51 crc kubenswrapper[4962]: I0220 10:16:51.508975 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="bf680b24-e6dc-40a4-9ee4-521343fd9a28" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.192:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 20 10:16:51 crc kubenswrapper[4962]: I0220 10:16:51.812908 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 20 10:16:52 crc kubenswrapper[4962]: I0220 10:16:52.395729 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9c35c04f-5ec6-44c4-99d5-38a896dcae17","Type":"ContainerStarted","Data":"caaa74f92b5d0e328873252c56fead45ce48335008d190f32b745ac289e38766"} Feb 20 10:16:52 crc kubenswrapper[4962]: I0220 10:16:52.397183 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 20 10:16:52 crc kubenswrapper[4962]: I0220 10:16:52.426062 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.856780937 podStartE2EDuration="5.42604068s" podCreationTimestamp="2026-02-20 10:16:47 +0000 UTC" 
firstStartedPulling="2026-02-20 10:16:48.195874565 +0000 UTC m=+1299.778346411" lastFinishedPulling="2026-02-20 10:16:51.765134308 +0000 UTC m=+1303.347606154" observedRunningTime="2026-02-20 10:16:52.421526801 +0000 UTC m=+1304.003998667" watchObservedRunningTime="2026-02-20 10:16:52.42604068 +0000 UTC m=+1304.008512526" Feb 20 10:16:54 crc kubenswrapper[4962]: I0220 10:16:54.570307 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Feb 20 10:16:56 crc kubenswrapper[4962]: I0220 10:16:56.675987 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 20 10:16:56 crc kubenswrapper[4962]: I0220 10:16:56.676479 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 20 10:16:56 crc kubenswrapper[4962]: I0220 10:16:56.812477 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 20 10:16:56 crc kubenswrapper[4962]: I0220 10:16:56.865795 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 20 10:16:57 crc kubenswrapper[4962]: I0220 10:16:57.482401 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 20 10:16:57 crc kubenswrapper[4962]: I0220 10:16:57.758859 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ebe24b8d-7968-4806-a924-d932f167185f" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.195:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 20 10:16:57 crc kubenswrapper[4962]: I0220 10:16:57.758878 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ebe24b8d-7968-4806-a924-d932f167185f" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.195:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 20 10:17:00 crc kubenswrapper[4962]: I0220 10:17:00.503486 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 20 10:17:00 crc kubenswrapper[4962]: I0220 10:17:00.507199 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 20 10:17:00 crc kubenswrapper[4962]: I0220 10:17:00.512986 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 20 10:17:01 crc kubenswrapper[4962]: I0220 10:17:01.517104 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 20 10:17:03 crc kubenswrapper[4962]: I0220 10:17:03.432704 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 20 10:17:03 crc kubenswrapper[4962]: I0220 10:17:03.532664 4962 generic.go:334] "Generic (PLEG): container finished" podID="1d60578e-e3d0-4ae9-8539-9dfd84ebf836" containerID="a895a025c8e5f05c893f35ce1699e5268fa4ccdd49969d23e922563dcf7606a8" exitCode=137 Feb 20 10:17:03 crc kubenswrapper[4962]: I0220 10:17:03.533678 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 20 10:17:03 crc kubenswrapper[4962]: I0220 10:17:03.534243 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"1d60578e-e3d0-4ae9-8539-9dfd84ebf836","Type":"ContainerDied","Data":"a895a025c8e5f05c893f35ce1699e5268fa4ccdd49969d23e922563dcf7606a8"} Feb 20 10:17:03 crc kubenswrapper[4962]: I0220 10:17:03.534338 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"1d60578e-e3d0-4ae9-8539-9dfd84ebf836","Type":"ContainerDied","Data":"bc1cbee44de6a5e482a759f896e04c3479b2c9a764a90e45d0331d2a12691504"} Feb 20 10:17:03 crc kubenswrapper[4962]: I0220 10:17:03.534378 4962 scope.go:117] "RemoveContainer" containerID="a895a025c8e5f05c893f35ce1699e5268fa4ccdd49969d23e922563dcf7606a8" Feb 20 10:17:03 crc kubenswrapper[4962]: I0220 10:17:03.576552 4962 scope.go:117] "RemoveContainer" containerID="a895a025c8e5f05c893f35ce1699e5268fa4ccdd49969d23e922563dcf7606a8" Feb 20 10:17:03 crc kubenswrapper[4962]: E0220 10:17:03.577171 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a895a025c8e5f05c893f35ce1699e5268fa4ccdd49969d23e922563dcf7606a8\": container with ID starting with a895a025c8e5f05c893f35ce1699e5268fa4ccdd49969d23e922563dcf7606a8 not found: ID does not exist" containerID="a895a025c8e5f05c893f35ce1699e5268fa4ccdd49969d23e922563dcf7606a8" Feb 20 10:17:03 crc kubenswrapper[4962]: I0220 10:17:03.577272 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a895a025c8e5f05c893f35ce1699e5268fa4ccdd49969d23e922563dcf7606a8"} err="failed to get container status \"a895a025c8e5f05c893f35ce1699e5268fa4ccdd49969d23e922563dcf7606a8\": rpc error: code = NotFound desc = could not find container \"a895a025c8e5f05c893f35ce1699e5268fa4ccdd49969d23e922563dcf7606a8\": container with ID starting with a895a025c8e5f05c893f35ce1699e5268fa4ccdd49969d23e922563dcf7606a8 not found: ID does not exist" Feb 20 10:17:03 crc kubenswrapper[4962]: I0220 10:17:03.595670 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d60578e-e3d0-4ae9-8539-9dfd84ebf836-config-data\") pod \"1d60578e-e3d0-4ae9-8539-9dfd84ebf836\" (UID: \"1d60578e-e3d0-4ae9-8539-9dfd84ebf836\") " Feb 20 10:17:03 crc kubenswrapper[4962]: I0220 10:17:03.595891 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8chs7\" (UniqueName: \"kubernetes.io/projected/1d60578e-e3d0-4ae9-8539-9dfd84ebf836-kube-api-access-8chs7\") pod \"1d60578e-e3d0-4ae9-8539-9dfd84ebf836\" (UID: \"1d60578e-e3d0-4ae9-8539-9dfd84ebf836\") " Feb 20 10:17:03 crc kubenswrapper[4962]: I0220 10:17:03.595993 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d60578e-e3d0-4ae9-8539-9dfd84ebf836-combined-ca-bundle\") pod \"1d60578e-e3d0-4ae9-8539-9dfd84ebf836\" (UID: \"1d60578e-e3d0-4ae9-8539-9dfd84ebf836\") " Feb 20 10:17:03 crc kubenswrapper[4962]: I0220 10:17:03.606510 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d60578e-e3d0-4ae9-8539-9dfd84ebf836-kube-api-access-8chs7" (OuterVolumeSpecName: "kube-api-access-8chs7") pod "1d60578e-e3d0-4ae9-8539-9dfd84ebf836" (UID: "1d60578e-e3d0-4ae9-8539-9dfd84ebf836"). 
InnerVolumeSpecName "kube-api-access-8chs7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:17:03 crc kubenswrapper[4962]: I0220 10:17:03.634395 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d60578e-e3d0-4ae9-8539-9dfd84ebf836-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1d60578e-e3d0-4ae9-8539-9dfd84ebf836" (UID: "1d60578e-e3d0-4ae9-8539-9dfd84ebf836"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:17:03 crc kubenswrapper[4962]: I0220 10:17:03.646016 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d60578e-e3d0-4ae9-8539-9dfd84ebf836-config-data" (OuterVolumeSpecName: "config-data") pod "1d60578e-e3d0-4ae9-8539-9dfd84ebf836" (UID: "1d60578e-e3d0-4ae9-8539-9dfd84ebf836"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:17:03 crc kubenswrapper[4962]: I0220 10:17:03.698993 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d60578e-e3d0-4ae9-8539-9dfd84ebf836-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:17:03 crc kubenswrapper[4962]: I0220 10:17:03.699114 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d60578e-e3d0-4ae9-8539-9dfd84ebf836-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:17:03 crc kubenswrapper[4962]: I0220 10:17:03.699127 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8chs7\" (UniqueName: \"kubernetes.io/projected/1d60578e-e3d0-4ae9-8539-9dfd84ebf836-kube-api-access-8chs7\") on node \"crc\" DevicePath \"\"" Feb 20 10:17:03 crc kubenswrapper[4962]: I0220 10:17:03.899084 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 20 10:17:03 crc kubenswrapper[4962]: I0220 10:17:03.918410 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 20 10:17:03 crc kubenswrapper[4962]: I0220 10:17:03.928310 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 20 10:17:03 crc kubenswrapper[4962]: E0220 10:17:03.929110 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d60578e-e3d0-4ae9-8539-9dfd84ebf836" containerName="nova-cell1-novncproxy-novncproxy" Feb 20 10:17:03 crc kubenswrapper[4962]: I0220 10:17:03.929138 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d60578e-e3d0-4ae9-8539-9dfd84ebf836" containerName="nova-cell1-novncproxy-novncproxy" Feb 20 10:17:03 crc kubenswrapper[4962]: I0220 10:17:03.929463 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d60578e-e3d0-4ae9-8539-9dfd84ebf836" containerName="nova-cell1-novncproxy-novncproxy" Feb 20 10:17:03 crc kubenswrapper[4962]: I0220 10:17:03.930430 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 20 10:17:03 crc kubenswrapper[4962]: I0220 10:17:03.932508 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 20 10:17:03 crc kubenswrapper[4962]: I0220 10:17:03.932716 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Feb 20 10:17:03 crc kubenswrapper[4962]: I0220 10:17:03.933573 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Feb 20 10:17:03 crc kubenswrapper[4962]: I0220 10:17:03.941259 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 20 10:17:04 crc kubenswrapper[4962]: I0220 10:17:04.106661 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 10:17:04 crc kubenswrapper[4962]: I0220 10:17:04.106869 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 10:17:04 crc kubenswrapper[4962]: I0220 10:17:04.107165 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bznfn\" (UniqueName: \"kubernetes.io/projected/fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00-kube-api-access-bznfn\") pod \"nova-cell1-novncproxy-0\" (UID: \"fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 10:17:04 crc kubenswrapper[4962]: I0220 10:17:04.107309 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 10:17:04 crc kubenswrapper[4962]: I0220 10:17:04.107390 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 10:17:04 crc kubenswrapper[4962]: I0220 10:17:04.209532 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bznfn\" (UniqueName: \"kubernetes.io/projected/fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00-kube-api-access-bznfn\") pod \"nova-cell1-novncproxy-0\" (UID: \"fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 10:17:04 crc kubenswrapper[4962]: I0220 10:17:04.209586 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00\") " 
pod="openstack/nova-cell1-novncproxy-0" Feb 20 10:17:04 crc kubenswrapper[4962]: I0220 10:17:04.209637 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 10:17:04 crc kubenswrapper[4962]: I0220 10:17:04.209686 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 10:17:04 crc kubenswrapper[4962]: I0220 10:17:04.209760 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 10:17:04 crc kubenswrapper[4962]: I0220 10:17:04.215194 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 10:17:04 crc kubenswrapper[4962]: I0220 10:17:04.225211 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 10:17:04 crc kubenswrapper[4962]: I0220 10:17:04.231937 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 10:17:04 crc kubenswrapper[4962]: I0220 10:17:04.232854 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 10:17:04 crc kubenswrapper[4962]: I0220 10:17:04.248065 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bznfn\" (UniqueName: \"kubernetes.io/projected/fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00-kube-api-access-bznfn\") pod \"nova-cell1-novncproxy-0\" (UID: \"fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 10:17:04 crc kubenswrapper[4962]: I0220 10:17:04.255669 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 20 10:17:04 crc kubenswrapper[4962]: I0220 10:17:04.806749 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 20 10:17:04 crc kubenswrapper[4962]: W0220 10:17:04.811467 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfdf8f82d_76e8_4d49_ab1f_bc75cec4dc00.slice/crio-33d56931989951f09c69fe90d6e65d85c8e97ea86a78f0d42f65def6270a08a7 WatchSource:0}: Error finding container 33d56931989951f09c69fe90d6e65d85c8e97ea86a78f0d42f65def6270a08a7: Status 404 returned error can't find the container with id 33d56931989951f09c69fe90d6e65d85c8e97ea86a78f0d42f65def6270a08a7 Feb 20 10:17:05 crc kubenswrapper[4962]: I0220 10:17:05.164888 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d60578e-e3d0-4ae9-8539-9dfd84ebf836" path="/var/lib/kubelet/pods/1d60578e-e3d0-4ae9-8539-9dfd84ebf836/volumes" Feb 20 10:17:05 crc kubenswrapper[4962]: I0220 10:17:05.568262 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00","Type":"ContainerStarted","Data":"aa9b7812a81805d3c1d048e75378c2e89e7f075bbe36af5665b4416075da7b83"} Feb 20 10:17:05 crc kubenswrapper[4962]: I0220 10:17:05.568342 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00","Type":"ContainerStarted","Data":"33d56931989951f09c69fe90d6e65d85c8e97ea86a78f0d42f65def6270a08a7"} Feb 20 10:17:05 crc kubenswrapper[4962]: I0220 10:17:05.604348 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.604317628 podStartE2EDuration="2.604317628s" podCreationTimestamp="2026-02-20 10:17:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:17:05.59722178 +0000 UTC m=+1317.179693676" watchObservedRunningTime="2026-02-20 10:17:05.604317628 +0000 UTC m=+1317.186789514" Feb 20 10:17:06 crc kubenswrapper[4962]: I0220 10:17:06.686253 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 20 10:17:06 crc kubenswrapper[4962]: I0220 10:17:06.687445 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 20 10:17:06 crc kubenswrapper[4962]: I0220 10:17:06.688494 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 20 10:17:06 crc kubenswrapper[4962]: I0220 10:17:06.693870 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 20 10:17:07 crc kubenswrapper[4962]: I0220 10:17:07.599167 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 20 10:17:07 crc kubenswrapper[4962]: I0220 10:17:07.604226 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 20 10:17:07 crc kubenswrapper[4962]: I0220 10:17:07.837655 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-58f6456c9f-hl7mw"] Feb 20 10:17:07 crc kubenswrapper[4962]: I0220 10:17:07.839470 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58f6456c9f-hl7mw" Feb 20 10:17:07 crc kubenswrapper[4962]: I0220 10:17:07.859138 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58f6456c9f-hl7mw"] Feb 20 10:17:08 crc kubenswrapper[4962]: I0220 10:17:08.014260 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e4f70a2-b8ae-48cc-a098-5642fad8b040-config\") pod \"dnsmasq-dns-58f6456c9f-hl7mw\" (UID: \"2e4f70a2-b8ae-48cc-a098-5642fad8b040\") " pod="openstack/dnsmasq-dns-58f6456c9f-hl7mw" Feb 20 10:17:08 crc kubenswrapper[4962]: I0220 10:17:08.014330 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2e4f70a2-b8ae-48cc-a098-5642fad8b040-dns-svc\") pod \"dnsmasq-dns-58f6456c9f-hl7mw\" (UID: \"2e4f70a2-b8ae-48cc-a098-5642fad8b040\") " pod="openstack/dnsmasq-dns-58f6456c9f-hl7mw" Feb 20 10:17:08 crc kubenswrapper[4962]: I0220 10:17:08.014442 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2e4f70a2-b8ae-48cc-a098-5642fad8b040-ovsdbserver-sb\") pod \"dnsmasq-dns-58f6456c9f-hl7mw\" (UID: \"2e4f70a2-b8ae-48cc-a098-5642fad8b040\") " pod="openstack/dnsmasq-dns-58f6456c9f-hl7mw" Feb 20 10:17:08 crc kubenswrapper[4962]: I0220 10:17:08.014465 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2e4f70a2-b8ae-48cc-a098-5642fad8b040-dns-swift-storage-0\") pod \"dnsmasq-dns-58f6456c9f-hl7mw\" (UID: \"2e4f70a2-b8ae-48cc-a098-5642fad8b040\") " pod="openstack/dnsmasq-dns-58f6456c9f-hl7mw" Feb 20 10:17:08 crc kubenswrapper[4962]: I0220 10:17:08.014491 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2e4f70a2-b8ae-48cc-a098-5642fad8b040-ovsdbserver-nb\") pod \"dnsmasq-dns-58f6456c9f-hl7mw\" (UID: \"2e4f70a2-b8ae-48cc-a098-5642fad8b040\") " pod="openstack/dnsmasq-dns-58f6456c9f-hl7mw" Feb 20 10:17:08 crc kubenswrapper[4962]: I0220 10:17:08.014517 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nss8l\" (UniqueName: \"kubernetes.io/projected/2e4f70a2-b8ae-48cc-a098-5642fad8b040-kube-api-access-nss8l\") pod \"dnsmasq-dns-58f6456c9f-hl7mw\" (UID: \"2e4f70a2-b8ae-48cc-a098-5642fad8b040\") " pod="openstack/dnsmasq-dns-58f6456c9f-hl7mw" Feb 20 10:17:08 crc kubenswrapper[4962]: I0220 10:17:08.116789 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e4f70a2-b8ae-48cc-a098-5642fad8b040-config\") pod \"dnsmasq-dns-58f6456c9f-hl7mw\" (UID: \"2e4f70a2-b8ae-48cc-a098-5642fad8b040\") " pod="openstack/dnsmasq-dns-58f6456c9f-hl7mw" Feb 20 10:17:08 crc kubenswrapper[4962]: I0220 10:17:08.116850 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2e4f70a2-b8ae-48cc-a098-5642fad8b040-dns-svc\") pod \"dnsmasq-dns-58f6456c9f-hl7mw\" (UID: \"2e4f70a2-b8ae-48cc-a098-5642fad8b040\") " pod="openstack/dnsmasq-dns-58f6456c9f-hl7mw" Feb 20 10:17:08 crc kubenswrapper[4962]: I0220 10:17:08.116968 4962 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2e4f70a2-b8ae-48cc-a098-5642fad8b040-ovsdbserver-sb\") pod \"dnsmasq-dns-58f6456c9f-hl7mw\" (UID: \"2e4f70a2-b8ae-48cc-a098-5642fad8b040\") " pod="openstack/dnsmasq-dns-58f6456c9f-hl7mw" Feb 20 10:17:08 crc kubenswrapper[4962]: I0220 10:17:08.116989 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2e4f70a2-b8ae-48cc-a098-5642fad8b040-dns-swift-storage-0\") pod \"dnsmasq-dns-58f6456c9f-hl7mw\" (UID: \"2e4f70a2-b8ae-48cc-a098-5642fad8b040\") " pod="openstack/dnsmasq-dns-58f6456c9f-hl7mw" Feb 20 10:17:08 crc kubenswrapper[4962]: I0220 10:17:08.117014 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2e4f70a2-b8ae-48cc-a098-5642fad8b040-ovsdbserver-nb\") pod \"dnsmasq-dns-58f6456c9f-hl7mw\" (UID: \"2e4f70a2-b8ae-48cc-a098-5642fad8b040\") " pod="openstack/dnsmasq-dns-58f6456c9f-hl7mw" Feb 20 10:17:08 crc kubenswrapper[4962]: I0220 10:17:08.117044 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nss8l\" (UniqueName: \"kubernetes.io/projected/2e4f70a2-b8ae-48cc-a098-5642fad8b040-kube-api-access-nss8l\") pod \"dnsmasq-dns-58f6456c9f-hl7mw\" (UID: \"2e4f70a2-b8ae-48cc-a098-5642fad8b040\") " pod="openstack/dnsmasq-dns-58f6456c9f-hl7mw" Feb 20 10:17:08 crc kubenswrapper[4962]: I0220 10:17:08.118268 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2e4f70a2-b8ae-48cc-a098-5642fad8b040-ovsdbserver-nb\") pod \"dnsmasq-dns-58f6456c9f-hl7mw\" (UID: \"2e4f70a2-b8ae-48cc-a098-5642fad8b040\") " pod="openstack/dnsmasq-dns-58f6456c9f-hl7mw" Feb 20 10:17:08 crc kubenswrapper[4962]: I0220 10:17:08.118331 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e4f70a2-b8ae-48cc-a098-5642fad8b040-config\") pod \"dnsmasq-dns-58f6456c9f-hl7mw\" (UID: \"2e4f70a2-b8ae-48cc-a098-5642fad8b040\") " pod="openstack/dnsmasq-dns-58f6456c9f-hl7mw" Feb 20 10:17:08 crc kubenswrapper[4962]: I0220 10:17:08.118344 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2e4f70a2-b8ae-48cc-a098-5642fad8b040-ovsdbserver-sb\") pod \"dnsmasq-dns-58f6456c9f-hl7mw\" (UID: \"2e4f70a2-b8ae-48cc-a098-5642fad8b040\") " pod="openstack/dnsmasq-dns-58f6456c9f-hl7mw" Feb 20 10:17:08 crc kubenswrapper[4962]: I0220 10:17:08.118704 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2e4f70a2-b8ae-48cc-a098-5642fad8b040-dns-svc\") pod \"dnsmasq-dns-58f6456c9f-hl7mw\" (UID: \"2e4f70a2-b8ae-48cc-a098-5642fad8b040\") " pod="openstack/dnsmasq-dns-58f6456c9f-hl7mw" Feb 20 10:17:08 crc kubenswrapper[4962]: I0220 10:17:08.118931 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2e4f70a2-b8ae-48cc-a098-5642fad8b040-dns-swift-storage-0\") pod \"dnsmasq-dns-58f6456c9f-hl7mw\" (UID: \"2e4f70a2-b8ae-48cc-a098-5642fad8b040\") " pod="openstack/dnsmasq-dns-58f6456c9f-hl7mw" Feb 20 10:17:08 crc kubenswrapper[4962]: I0220 10:17:08.137283 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nss8l\" (UniqueName: 
\"kubernetes.io/projected/2e4f70a2-b8ae-48cc-a098-5642fad8b040-kube-api-access-nss8l\") pod \"dnsmasq-dns-58f6456c9f-hl7mw\" (UID: \"2e4f70a2-b8ae-48cc-a098-5642fad8b040\") " pod="openstack/dnsmasq-dns-58f6456c9f-hl7mw" Feb 20 10:17:08 crc kubenswrapper[4962]: I0220 10:17:08.188156 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58f6456c9f-hl7mw" Feb 20 10:17:08 crc kubenswrapper[4962]: I0220 10:17:08.684790 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58f6456c9f-hl7mw"] Feb 20 10:17:08 crc kubenswrapper[4962]: W0220 10:17:08.688398 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2e4f70a2_b8ae_48cc_a098_5642fad8b040.slice/crio-43990fb41b7e9e93f7abc5c81e14ddd0fcd4df0bd08b0a99fd55dd59749b0c05 WatchSource:0}: Error finding container 43990fb41b7e9e93f7abc5c81e14ddd0fcd4df0bd08b0a99fd55dd59749b0c05: Status 404 returned error can't find the container with id 43990fb41b7e9e93f7abc5c81e14ddd0fcd4df0bd08b0a99fd55dd59749b0c05 Feb 20 10:17:09 crc kubenswrapper[4962]: I0220 10:17:09.255786 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 20 10:17:09 crc kubenswrapper[4962]: I0220 10:17:09.617887 4962 generic.go:334] "Generic (PLEG): container finished" podID="2e4f70a2-b8ae-48cc-a098-5642fad8b040" containerID="b6772b9162a6a32cfbe3b48349f45c3e39e34e153494f4b09b124b0a0f86db0c" exitCode=0 Feb 20 10:17:09 crc kubenswrapper[4962]: I0220 10:17:09.618048 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58f6456c9f-hl7mw" event={"ID":"2e4f70a2-b8ae-48cc-a098-5642fad8b040","Type":"ContainerDied","Data":"b6772b9162a6a32cfbe3b48349f45c3e39e34e153494f4b09b124b0a0f86db0c"} Feb 20 10:17:09 crc kubenswrapper[4962]: I0220 10:17:09.618157 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58f6456c9f-hl7mw" event={"ID":"2e4f70a2-b8ae-48cc-a098-5642fad8b040","Type":"ContainerStarted","Data":"43990fb41b7e9e93f7abc5c81e14ddd0fcd4df0bd08b0a99fd55dd59749b0c05"} Feb 20 10:17:09 crc kubenswrapper[4962]: I0220 10:17:09.744638 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 20 10:17:09 crc kubenswrapper[4962]: I0220 10:17:09.745279 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9c35c04f-5ec6-44c4-99d5-38a896dcae17" containerName="ceilometer-central-agent" containerID="cri-o://99c6b0cf82c82c0b8a8f1fc52a81d6bcfa256ef193c37dfdf2b76832d2104fcc" gracePeriod=30 Feb 20 10:17:09 crc kubenswrapper[4962]: I0220 10:17:09.746213 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9c35c04f-5ec6-44c4-99d5-38a896dcae17" containerName="proxy-httpd" containerID="cri-o://caaa74f92b5d0e328873252c56fead45ce48335008d190f32b745ac289e38766" gracePeriod=30 Feb 20 10:17:09 crc kubenswrapper[4962]: I0220 10:17:09.746354 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9c35c04f-5ec6-44c4-99d5-38a896dcae17" containerName="sg-core" containerID="cri-o://0b888a786161fde438d48858a9d1b62d5a26425c6466484506b8de22270a664a" gracePeriod=30 Feb 20 10:17:09 crc kubenswrapper[4962]: I0220 10:17:09.746462 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="9c35c04f-5ec6-44c4-99d5-38a896dcae17" containerName="ceilometer-notification-agent" containerID="cri-o://22e835daa3a58e115e60f7fd625eca37d4503a19c9fbbde6d0ac69f24df79613" gracePeriod=30 Feb 20 10:17:09 crc kubenswrapper[4962]: I0220 10:17:09.765946 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="9c35c04f-5ec6-44c4-99d5-38a896dcae17" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.197:3000/\": read tcp 10.217.0.2:33678->10.217.0.197:3000: read: connection reset by peer" Feb 20 10:17:10 crc kubenswrapper[4962]: I0220 10:17:10.259224 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 20 10:17:10 crc kubenswrapper[4962]: I0220 10:17:10.637478 4962 generic.go:334] "Generic (PLEG): container finished" podID="9c35c04f-5ec6-44c4-99d5-38a896dcae17" containerID="caaa74f92b5d0e328873252c56fead45ce48335008d190f32b745ac289e38766" exitCode=0 Feb 20 10:17:10 crc kubenswrapper[4962]: I0220 10:17:10.637516 4962 generic.go:334] "Generic (PLEG): container finished" podID="9c35c04f-5ec6-44c4-99d5-38a896dcae17" containerID="0b888a786161fde438d48858a9d1b62d5a26425c6466484506b8de22270a664a" exitCode=2 Feb 20 10:17:10 crc kubenswrapper[4962]: I0220 10:17:10.637524 4962 generic.go:334] "Generic (PLEG): container finished" podID="9c35c04f-5ec6-44c4-99d5-38a896dcae17" containerID="99c6b0cf82c82c0b8a8f1fc52a81d6bcfa256ef193c37dfdf2b76832d2104fcc" exitCode=0 Feb 20 10:17:10 crc kubenswrapper[4962]: I0220 10:17:10.637748 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9c35c04f-5ec6-44c4-99d5-38a896dcae17","Type":"ContainerDied","Data":"caaa74f92b5d0e328873252c56fead45ce48335008d190f32b745ac289e38766"} Feb 20 10:17:10 crc kubenswrapper[4962]: I0220 10:17:10.637784 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9c35c04f-5ec6-44c4-99d5-38a896dcae17","Type":"ContainerDied","Data":"0b888a786161fde438d48858a9d1b62d5a26425c6466484506b8de22270a664a"} Feb 20 10:17:10 crc kubenswrapper[4962]: I0220 10:17:10.637798 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9c35c04f-5ec6-44c4-99d5-38a896dcae17","Type":"ContainerDied","Data":"99c6b0cf82c82c0b8a8f1fc52a81d6bcfa256ef193c37dfdf2b76832d2104fcc"} Feb 20 10:17:10 crc kubenswrapper[4962]: I0220 10:17:10.648375 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ebe24b8d-7968-4806-a924-d932f167185f" containerName="nova-api-log" containerID="cri-o://16ac13db628828db750794374323308f29b229d310a915dc23b1545983fc0046" gracePeriod=30 Feb 20 10:17:10 crc kubenswrapper[4962]: I0220 10:17:10.649972 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58f6456c9f-hl7mw" event={"ID":"2e4f70a2-b8ae-48cc-a098-5642fad8b040","Type":"ContainerStarted","Data":"337cc3322a86ac4051b60ce8c7418dd0f1ccf4eafea40f3e9c75cc1f12e67b28"} Feb 20 10:17:10 crc kubenswrapper[4962]: I0220 10:17:10.650017 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-58f6456c9f-hl7mw" Feb 20 10:17:10 crc kubenswrapper[4962]: I0220 10:17:10.650479 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ebe24b8d-7968-4806-a924-d932f167185f" containerName="nova-api-api" containerID="cri-o://a4781260e567499e59c61c3395733568454a97751d695160f009970b7852f29d" gracePeriod=30 Feb 20 10:17:10 
crc kubenswrapper[4962]: I0220 10:17:10.689209 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-58f6456c9f-hl7mw" podStartSLOduration=3.689186894 podStartE2EDuration="3.689186894s" podCreationTimestamp="2026-02-20 10:17:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:17:10.682697574 +0000 UTC m=+1322.265169440" watchObservedRunningTime="2026-02-20 10:17:10.689186894 +0000 UTC m=+1322.271658740" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.216025 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.401684 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9c35c04f-5ec6-44c4-99d5-38a896dcae17-sg-core-conf-yaml\") pod \"9c35c04f-5ec6-44c4-99d5-38a896dcae17\" (UID: \"9c35c04f-5ec6-44c4-99d5-38a896dcae17\") " Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.402158 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9c35c04f-5ec6-44c4-99d5-38a896dcae17-log-httpd\") pod \"9c35c04f-5ec6-44c4-99d5-38a896dcae17\" (UID: \"9c35c04f-5ec6-44c4-99d5-38a896dcae17\") " Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.402262 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c35c04f-5ec6-44c4-99d5-38a896dcae17-combined-ca-bundle\") pod \"9c35c04f-5ec6-44c4-99d5-38a896dcae17\" (UID: \"9c35c04f-5ec6-44c4-99d5-38a896dcae17\") " Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.402366 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9c35c04f-5ec6-44c4-99d5-38a896dcae17-run-httpd\") pod \"9c35c04f-5ec6-44c4-99d5-38a896dcae17\" (UID: \"9c35c04f-5ec6-44c4-99d5-38a896dcae17\") " Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.402386 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c35c04f-5ec6-44c4-99d5-38a896dcae17-ceilometer-tls-certs\") pod \"9c35c04f-5ec6-44c4-99d5-38a896dcae17\" (UID: \"9c35c04f-5ec6-44c4-99d5-38a896dcae17\") " Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.402435 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c35c04f-5ec6-44c4-99d5-38a896dcae17-scripts\") pod \"9c35c04f-5ec6-44c4-99d5-38a896dcae17\" (UID: \"9c35c04f-5ec6-44c4-99d5-38a896dcae17\") " Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.402520 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c35c04f-5ec6-44c4-99d5-38a896dcae17-config-data\") pod \"9c35c04f-5ec6-44c4-99d5-38a896dcae17\" (UID: \"9c35c04f-5ec6-44c4-99d5-38a896dcae17\") " Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.402604 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmccz\" (UniqueName: \"kubernetes.io/projected/9c35c04f-5ec6-44c4-99d5-38a896dcae17-kube-api-access-nmccz\") pod \"9c35c04f-5ec6-44c4-99d5-38a896dcae17\" (UID: \"9c35c04f-5ec6-44c4-99d5-38a896dcae17\") " Feb 20 
10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.403237 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c35c04f-5ec6-44c4-99d5-38a896dcae17-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "9c35c04f-5ec6-44c4-99d5-38a896dcae17" (UID: "9c35c04f-5ec6-44c4-99d5-38a896dcae17"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.403705 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c35c04f-5ec6-44c4-99d5-38a896dcae17-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "9c35c04f-5ec6-44c4-99d5-38a896dcae17" (UID: "9c35c04f-5ec6-44c4-99d5-38a896dcae17"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.410235 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c35c04f-5ec6-44c4-99d5-38a896dcae17-kube-api-access-nmccz" (OuterVolumeSpecName: "kube-api-access-nmccz") pod "9c35c04f-5ec6-44c4-99d5-38a896dcae17" (UID: "9c35c04f-5ec6-44c4-99d5-38a896dcae17"). InnerVolumeSpecName "kube-api-access-nmccz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.412724 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c35c04f-5ec6-44c4-99d5-38a896dcae17-scripts" (OuterVolumeSpecName: "scripts") pod "9c35c04f-5ec6-44c4-99d5-38a896dcae17" (UID: "9c35c04f-5ec6-44c4-99d5-38a896dcae17"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.444680 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c35c04f-5ec6-44c4-99d5-38a896dcae17-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "9c35c04f-5ec6-44c4-99d5-38a896dcae17" (UID: "9c35c04f-5ec6-44c4-99d5-38a896dcae17"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.463875 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c35c04f-5ec6-44c4-99d5-38a896dcae17-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "9c35c04f-5ec6-44c4-99d5-38a896dcae17" (UID: "9c35c04f-5ec6-44c4-99d5-38a896dcae17"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.505279 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nmccz\" (UniqueName: \"kubernetes.io/projected/9c35c04f-5ec6-44c4-99d5-38a896dcae17-kube-api-access-nmccz\") on node \"crc\" DevicePath \"\"" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.505315 4962 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9c35c04f-5ec6-44c4-99d5-38a896dcae17-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.505328 4962 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9c35c04f-5ec6-44c4-99d5-38a896dcae17-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.505338 4962 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9c35c04f-5ec6-44c4-99d5-38a896dcae17-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.505347 4962 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c35c04f-5ec6-44c4-99d5-38a896dcae17-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.505356 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c35c04f-5ec6-44c4-99d5-38a896dcae17-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.506735 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c35c04f-5ec6-44c4-99d5-38a896dcae17-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9c35c04f-5ec6-44c4-99d5-38a896dcae17" (UID: "9c35c04f-5ec6-44c4-99d5-38a896dcae17"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.508138 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.508195 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.508247 4962 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.509196 4962 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"90048224d02357c3a2b79884d1830677ace1a55bff8576575bc2ae41bdccb716"} pod="openshift-machine-config-operator/machine-config-daemon-m9d46" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.509263 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" containerID="cri-o://90048224d02357c3a2b79884d1830677ace1a55bff8576575bc2ae41bdccb716" gracePeriod=600 Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.544946 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c35c04f-5ec6-44c4-99d5-38a896dcae17-config-data" (OuterVolumeSpecName: "config-data") pod "9c35c04f-5ec6-44c4-99d5-38a896dcae17" (UID: "9c35c04f-5ec6-44c4-99d5-38a896dcae17"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.607673 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c35c04f-5ec6-44c4-99d5-38a896dcae17-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.607712 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c35c04f-5ec6-44c4-99d5-38a896dcae17-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.671107 4962 generic.go:334] "Generic (PLEG): container finished" podID="9c35c04f-5ec6-44c4-99d5-38a896dcae17" containerID="22e835daa3a58e115e60f7fd625eca37d4503a19c9fbbde6d0ac69f24df79613" exitCode=0 Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.671213 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9c35c04f-5ec6-44c4-99d5-38a896dcae17","Type":"ContainerDied","Data":"22e835daa3a58e115e60f7fd625eca37d4503a19c9fbbde6d0ac69f24df79613"} Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.671254 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9c35c04f-5ec6-44c4-99d5-38a896dcae17","Type":"ContainerDied","Data":"9fccdbfe279f8ca88785ee163c57e992e8a04cb5c4252ba831418d3d1e9460d1"} Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.671281 4962 scope.go:117] "RemoveContainer" containerID="caaa74f92b5d0e328873252c56fead45ce48335008d190f32b745ac289e38766" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.671484 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.704127 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" event={"ID":"751d5e0b-919c-4777-8475-ed7214f7647f","Type":"ContainerDied","Data":"90048224d02357c3a2b79884d1830677ace1a55bff8576575bc2ae41bdccb716"} Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.704078 4962 generic.go:334] "Generic (PLEG): container finished" podID="751d5e0b-919c-4777-8475-ed7214f7647f" containerID="90048224d02357c3a2b79884d1830677ace1a55bff8576575bc2ae41bdccb716" exitCode=0 Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.706754 4962 generic.go:334] "Generic (PLEG): container finished" podID="ebe24b8d-7968-4806-a924-d932f167185f" containerID="16ac13db628828db750794374323308f29b229d310a915dc23b1545983fc0046" exitCode=143 Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.708653 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ebe24b8d-7968-4806-a924-d932f167185f","Type":"ContainerDied","Data":"16ac13db628828db750794374323308f29b229d310a915dc23b1545983fc0046"} Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.733842 4962 scope.go:117] "RemoveContainer" containerID="0b888a786161fde438d48858a9d1b62d5a26425c6466484506b8de22270a664a" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.741413 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.759267 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.770112 4962 scope.go:117] "RemoveContainer" 
containerID="22e835daa3a58e115e60f7fd625eca37d4503a19c9fbbde6d0ac69f24df79613" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.777855 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 20 10:17:11 crc kubenswrapper[4962]: E0220 10:17:11.778429 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c35c04f-5ec6-44c4-99d5-38a896dcae17" containerName="ceilometer-notification-agent" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.778450 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c35c04f-5ec6-44c4-99d5-38a896dcae17" containerName="ceilometer-notification-agent" Feb 20 10:17:11 crc kubenswrapper[4962]: E0220 10:17:11.778474 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c35c04f-5ec6-44c4-99d5-38a896dcae17" containerName="sg-core" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.778482 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c35c04f-5ec6-44c4-99d5-38a896dcae17" containerName="sg-core" Feb 20 10:17:11 crc kubenswrapper[4962]: E0220 10:17:11.778507 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c35c04f-5ec6-44c4-99d5-38a896dcae17" containerName="proxy-httpd" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.778513 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c35c04f-5ec6-44c4-99d5-38a896dcae17" containerName="proxy-httpd" Feb 20 10:17:11 crc kubenswrapper[4962]: E0220 10:17:11.778533 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c35c04f-5ec6-44c4-99d5-38a896dcae17" containerName="ceilometer-central-agent" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.778539 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c35c04f-5ec6-44c4-99d5-38a896dcae17" containerName="ceilometer-central-agent" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.778800 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c35c04f-5ec6-44c4-99d5-38a896dcae17" containerName="ceilometer-notification-agent" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.778813 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c35c04f-5ec6-44c4-99d5-38a896dcae17" containerName="proxy-httpd" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.778833 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c35c04f-5ec6-44c4-99d5-38a896dcae17" containerName="ceilometer-central-agent" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.778842 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c35c04f-5ec6-44c4-99d5-38a896dcae17" containerName="sg-core" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.781074 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.782878 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.784976 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.785136 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.801462 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.813553 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3122f194-31cc-4b80-93ce-20c0ab55f4dd-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"3122f194-31cc-4b80-93ce-20c0ab55f4dd\") " pod="openstack/ceilometer-0" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.814287 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3122f194-31cc-4b80-93ce-20c0ab55f4dd-run-httpd\") pod \"ceilometer-0\" (UID: \"3122f194-31cc-4b80-93ce-20c0ab55f4dd\") " pod="openstack/ceilometer-0" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.814405 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3122f194-31cc-4b80-93ce-20c0ab55f4dd-log-httpd\") pod \"ceilometer-0\" (UID: \"3122f194-31cc-4b80-93ce-20c0ab55f4dd\") " pod="openstack/ceilometer-0" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.814427 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3122f194-31cc-4b80-93ce-20c0ab55f4dd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3122f194-31cc-4b80-93ce-20c0ab55f4dd\") " pod="openstack/ceilometer-0" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.814536 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zb8hn\" (UniqueName: \"kubernetes.io/projected/3122f194-31cc-4b80-93ce-20c0ab55f4dd-kube-api-access-zb8hn\") pod \"ceilometer-0\" (UID: \"3122f194-31cc-4b80-93ce-20c0ab55f4dd\") " pod="openstack/ceilometer-0" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.814563 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3122f194-31cc-4b80-93ce-20c0ab55f4dd-scripts\") pod \"ceilometer-0\" (UID: \"3122f194-31cc-4b80-93ce-20c0ab55f4dd\") " pod="openstack/ceilometer-0" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.814671 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3122f194-31cc-4b80-93ce-20c0ab55f4dd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3122f194-31cc-4b80-93ce-20c0ab55f4dd\") " pod="openstack/ceilometer-0" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.814698 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/3122f194-31cc-4b80-93ce-20c0ab55f4dd-config-data\") pod \"ceilometer-0\" (UID: \"3122f194-31cc-4b80-93ce-20c0ab55f4dd\") " pod="openstack/ceilometer-0" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.816805 4962 scope.go:117] "RemoveContainer" containerID="99c6b0cf82c82c0b8a8f1fc52a81d6bcfa256ef193c37dfdf2b76832d2104fcc" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.842911 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 20 10:17:11 crc kubenswrapper[4962]: E0220 10:17:11.846780 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[ceilometer-tls-certs combined-ca-bundle config-data kube-api-access-zb8hn log-httpd run-httpd scripts sg-core-conf-yaml], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/ceilometer-0" podUID="3122f194-31cc-4b80-93ce-20c0ab55f4dd" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.855233 4962 scope.go:117] "RemoveContainer" containerID="caaa74f92b5d0e328873252c56fead45ce48335008d190f32b745ac289e38766" Feb 20 10:17:11 crc kubenswrapper[4962]: E0220 10:17:11.855734 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"caaa74f92b5d0e328873252c56fead45ce48335008d190f32b745ac289e38766\": container with ID starting with caaa74f92b5d0e328873252c56fead45ce48335008d190f32b745ac289e38766 not found: ID does not exist" containerID="caaa74f92b5d0e328873252c56fead45ce48335008d190f32b745ac289e38766" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.857304 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"caaa74f92b5d0e328873252c56fead45ce48335008d190f32b745ac289e38766"} err="failed to get container status \"caaa74f92b5d0e328873252c56fead45ce48335008d190f32b745ac289e38766\": rpc error: code = NotFound desc = could not find container \"caaa74f92b5d0e328873252c56fead45ce48335008d190f32b745ac289e38766\": container with ID starting with caaa74f92b5d0e328873252c56fead45ce48335008d190f32b745ac289e38766 not found: ID does not exist" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.857394 4962 scope.go:117] "RemoveContainer" containerID="0b888a786161fde438d48858a9d1b62d5a26425c6466484506b8de22270a664a" Feb 20 10:17:11 crc kubenswrapper[4962]: E0220 10:17:11.860060 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b888a786161fde438d48858a9d1b62d5a26425c6466484506b8de22270a664a\": container with ID starting with 0b888a786161fde438d48858a9d1b62d5a26425c6466484506b8de22270a664a not found: ID does not exist" containerID="0b888a786161fde438d48858a9d1b62d5a26425c6466484506b8de22270a664a" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.860121 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b888a786161fde438d48858a9d1b62d5a26425c6466484506b8de22270a664a"} err="failed to get container status \"0b888a786161fde438d48858a9d1b62d5a26425c6466484506b8de22270a664a\": rpc error: code = NotFound desc = could not find container \"0b888a786161fde438d48858a9d1b62d5a26425c6466484506b8de22270a664a\": container with ID starting with 0b888a786161fde438d48858a9d1b62d5a26425c6466484506b8de22270a664a not found: ID does not exist" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.860219 4962 scope.go:117] "RemoveContainer" 
containerID="22e835daa3a58e115e60f7fd625eca37d4503a19c9fbbde6d0ac69f24df79613" Feb 20 10:17:11 crc kubenswrapper[4962]: E0220 10:17:11.861308 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22e835daa3a58e115e60f7fd625eca37d4503a19c9fbbde6d0ac69f24df79613\": container with ID starting with 22e835daa3a58e115e60f7fd625eca37d4503a19c9fbbde6d0ac69f24df79613 not found: ID does not exist" containerID="22e835daa3a58e115e60f7fd625eca37d4503a19c9fbbde6d0ac69f24df79613" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.861372 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22e835daa3a58e115e60f7fd625eca37d4503a19c9fbbde6d0ac69f24df79613"} err="failed to get container status \"22e835daa3a58e115e60f7fd625eca37d4503a19c9fbbde6d0ac69f24df79613\": rpc error: code = NotFound desc = could not find container \"22e835daa3a58e115e60f7fd625eca37d4503a19c9fbbde6d0ac69f24df79613\": container with ID starting with 22e835daa3a58e115e60f7fd625eca37d4503a19c9fbbde6d0ac69f24df79613 not found: ID does not exist" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.861406 4962 scope.go:117] "RemoveContainer" containerID="99c6b0cf82c82c0b8a8f1fc52a81d6bcfa256ef193c37dfdf2b76832d2104fcc" Feb 20 10:17:11 crc kubenswrapper[4962]: E0220 10:17:11.862517 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99c6b0cf82c82c0b8a8f1fc52a81d6bcfa256ef193c37dfdf2b76832d2104fcc\": container with ID starting with 99c6b0cf82c82c0b8a8f1fc52a81d6bcfa256ef193c37dfdf2b76832d2104fcc not found: ID does not exist" containerID="99c6b0cf82c82c0b8a8f1fc52a81d6bcfa256ef193c37dfdf2b76832d2104fcc" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.862546 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99c6b0cf82c82c0b8a8f1fc52a81d6bcfa256ef193c37dfdf2b76832d2104fcc"} err="failed to get container status \"99c6b0cf82c82c0b8a8f1fc52a81d6bcfa256ef193c37dfdf2b76832d2104fcc\": rpc error: code = NotFound desc = could not find container \"99c6b0cf82c82c0b8a8f1fc52a81d6bcfa256ef193c37dfdf2b76832d2104fcc\": container with ID starting with 99c6b0cf82c82c0b8a8f1fc52a81d6bcfa256ef193c37dfdf2b76832d2104fcc not found: ID does not exist" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.862567 4962 scope.go:117] "RemoveContainer" containerID="d1c3b246abfce789c57c63406e0ffd34b8624c7398251d713e463cbaf4c363e1" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.915483 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3122f194-31cc-4b80-93ce-20c0ab55f4dd-config-data\") pod \"ceilometer-0\" (UID: \"3122f194-31cc-4b80-93ce-20c0ab55f4dd\") " pod="openstack/ceilometer-0" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.915641 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3122f194-31cc-4b80-93ce-20c0ab55f4dd-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"3122f194-31cc-4b80-93ce-20c0ab55f4dd\") " pod="openstack/ceilometer-0" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.915679 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3122f194-31cc-4b80-93ce-20c0ab55f4dd-run-httpd\") pod \"ceilometer-0\" (UID: 
\"3122f194-31cc-4b80-93ce-20c0ab55f4dd\") " pod="openstack/ceilometer-0" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.915718 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3122f194-31cc-4b80-93ce-20c0ab55f4dd-log-httpd\") pod \"ceilometer-0\" (UID: \"3122f194-31cc-4b80-93ce-20c0ab55f4dd\") " pod="openstack/ceilometer-0" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.915742 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3122f194-31cc-4b80-93ce-20c0ab55f4dd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3122f194-31cc-4b80-93ce-20c0ab55f4dd\") " pod="openstack/ceilometer-0" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.915772 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zb8hn\" (UniqueName: \"kubernetes.io/projected/3122f194-31cc-4b80-93ce-20c0ab55f4dd-kube-api-access-zb8hn\") pod \"ceilometer-0\" (UID: \"3122f194-31cc-4b80-93ce-20c0ab55f4dd\") " pod="openstack/ceilometer-0" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.915803 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3122f194-31cc-4b80-93ce-20c0ab55f4dd-scripts\") pod \"ceilometer-0\" (UID: \"3122f194-31cc-4b80-93ce-20c0ab55f4dd\") " pod="openstack/ceilometer-0" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.915824 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3122f194-31cc-4b80-93ce-20c0ab55f4dd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3122f194-31cc-4b80-93ce-20c0ab55f4dd\") " pod="openstack/ceilometer-0" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.917178 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3122f194-31cc-4b80-93ce-20c0ab55f4dd-log-httpd\") pod \"ceilometer-0\" (UID: \"3122f194-31cc-4b80-93ce-20c0ab55f4dd\") " pod="openstack/ceilometer-0" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.917320 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3122f194-31cc-4b80-93ce-20c0ab55f4dd-run-httpd\") pod \"ceilometer-0\" (UID: \"3122f194-31cc-4b80-93ce-20c0ab55f4dd\") " pod="openstack/ceilometer-0" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.922959 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3122f194-31cc-4b80-93ce-20c0ab55f4dd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3122f194-31cc-4b80-93ce-20c0ab55f4dd\") " pod="openstack/ceilometer-0" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.926392 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3122f194-31cc-4b80-93ce-20c0ab55f4dd-scripts\") pod \"ceilometer-0\" (UID: \"3122f194-31cc-4b80-93ce-20c0ab55f4dd\") " pod="openstack/ceilometer-0" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.926748 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3122f194-31cc-4b80-93ce-20c0ab55f4dd-config-data\") pod \"ceilometer-0\" (UID: \"3122f194-31cc-4b80-93ce-20c0ab55f4dd\") " pod="openstack/ceilometer-0" 
Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.930613 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3122f194-31cc-4b80-93ce-20c0ab55f4dd-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"3122f194-31cc-4b80-93ce-20c0ab55f4dd\") " pod="openstack/ceilometer-0" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.938004 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3122f194-31cc-4b80-93ce-20c0ab55f4dd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3122f194-31cc-4b80-93ce-20c0ab55f4dd\") " pod="openstack/ceilometer-0" Feb 20 10:17:11 crc kubenswrapper[4962]: I0220 10:17:11.941712 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zb8hn\" (UniqueName: \"kubernetes.io/projected/3122f194-31cc-4b80-93ce-20c0ab55f4dd-kube-api-access-zb8hn\") pod \"ceilometer-0\" (UID: \"3122f194-31cc-4b80-93ce-20c0ab55f4dd\") " pod="openstack/ceilometer-0" Feb 20 10:17:12 crc kubenswrapper[4962]: I0220 10:17:12.723561 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 20 10:17:12 crc kubenswrapper[4962]: I0220 10:17:12.723543 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" event={"ID":"751d5e0b-919c-4777-8475-ed7214f7647f","Type":"ContainerStarted","Data":"571edd66d03ca20f4984454288fe0f239b478c2bf695a245264962e04c56853e"} Feb 20 10:17:12 crc kubenswrapper[4962]: I0220 10:17:12.734477 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 20 10:17:12 crc kubenswrapper[4962]: I0220 10:17:12.834442 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3122f194-31cc-4b80-93ce-20c0ab55f4dd-scripts\") pod \"3122f194-31cc-4b80-93ce-20c0ab55f4dd\" (UID: \"3122f194-31cc-4b80-93ce-20c0ab55f4dd\") " Feb 20 10:17:12 crc kubenswrapper[4962]: I0220 10:17:12.836033 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3122f194-31cc-4b80-93ce-20c0ab55f4dd-sg-core-conf-yaml\") pod \"3122f194-31cc-4b80-93ce-20c0ab55f4dd\" (UID: \"3122f194-31cc-4b80-93ce-20c0ab55f4dd\") " Feb 20 10:17:12 crc kubenswrapper[4962]: I0220 10:17:12.836222 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3122f194-31cc-4b80-93ce-20c0ab55f4dd-config-data\") pod \"3122f194-31cc-4b80-93ce-20c0ab55f4dd\" (UID: \"3122f194-31cc-4b80-93ce-20c0ab55f4dd\") " Feb 20 10:17:12 crc kubenswrapper[4962]: I0220 10:17:12.836386 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3122f194-31cc-4b80-93ce-20c0ab55f4dd-combined-ca-bundle\") pod \"3122f194-31cc-4b80-93ce-20c0ab55f4dd\" (UID: \"3122f194-31cc-4b80-93ce-20c0ab55f4dd\") " Feb 20 10:17:12 crc kubenswrapper[4962]: I0220 10:17:12.836507 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zb8hn\" (UniqueName: \"kubernetes.io/projected/3122f194-31cc-4b80-93ce-20c0ab55f4dd-kube-api-access-zb8hn\") pod \"3122f194-31cc-4b80-93ce-20c0ab55f4dd\" (UID: \"3122f194-31cc-4b80-93ce-20c0ab55f4dd\") " Feb 20 10:17:12 crc 
kubenswrapper[4962]: I0220 10:17:12.836621 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3122f194-31cc-4b80-93ce-20c0ab55f4dd-ceilometer-tls-certs\") pod \"3122f194-31cc-4b80-93ce-20c0ab55f4dd\" (UID: \"3122f194-31cc-4b80-93ce-20c0ab55f4dd\") " Feb 20 10:17:12 crc kubenswrapper[4962]: I0220 10:17:12.836719 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3122f194-31cc-4b80-93ce-20c0ab55f4dd-log-httpd\") pod \"3122f194-31cc-4b80-93ce-20c0ab55f4dd\" (UID: \"3122f194-31cc-4b80-93ce-20c0ab55f4dd\") " Feb 20 10:17:12 crc kubenswrapper[4962]: I0220 10:17:12.836873 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3122f194-31cc-4b80-93ce-20c0ab55f4dd-run-httpd\") pod \"3122f194-31cc-4b80-93ce-20c0ab55f4dd\" (UID: \"3122f194-31cc-4b80-93ce-20c0ab55f4dd\") " Feb 20 10:17:12 crc kubenswrapper[4962]: I0220 10:17:12.839101 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3122f194-31cc-4b80-93ce-20c0ab55f4dd-scripts" (OuterVolumeSpecName: "scripts") pod "3122f194-31cc-4b80-93ce-20c0ab55f4dd" (UID: "3122f194-31cc-4b80-93ce-20c0ab55f4dd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:17:12 crc kubenswrapper[4962]: I0220 10:17:12.839351 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3122f194-31cc-4b80-93ce-20c0ab55f4dd-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "3122f194-31cc-4b80-93ce-20c0ab55f4dd" (UID: "3122f194-31cc-4b80-93ce-20c0ab55f4dd"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:17:12 crc kubenswrapper[4962]: I0220 10:17:12.841883 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3122f194-31cc-4b80-93ce-20c0ab55f4dd-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "3122f194-31cc-4b80-93ce-20c0ab55f4dd" (UID: "3122f194-31cc-4b80-93ce-20c0ab55f4dd"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:17:12 crc kubenswrapper[4962]: I0220 10:17:12.842409 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3122f194-31cc-4b80-93ce-20c0ab55f4dd-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "3122f194-31cc-4b80-93ce-20c0ab55f4dd" (UID: "3122f194-31cc-4b80-93ce-20c0ab55f4dd"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:17:12 crc kubenswrapper[4962]: I0220 10:17:12.845956 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3122f194-31cc-4b80-93ce-20c0ab55f4dd-kube-api-access-zb8hn" (OuterVolumeSpecName: "kube-api-access-zb8hn") pod "3122f194-31cc-4b80-93ce-20c0ab55f4dd" (UID: "3122f194-31cc-4b80-93ce-20c0ab55f4dd"). InnerVolumeSpecName "kube-api-access-zb8hn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:17:12 crc kubenswrapper[4962]: I0220 10:17:12.846651 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3122f194-31cc-4b80-93ce-20c0ab55f4dd-config-data" (OuterVolumeSpecName: "config-data") pod "3122f194-31cc-4b80-93ce-20c0ab55f4dd" (UID: "3122f194-31cc-4b80-93ce-20c0ab55f4dd"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:17:12 crc kubenswrapper[4962]: I0220 10:17:12.847904 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3122f194-31cc-4b80-93ce-20c0ab55f4dd-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "3122f194-31cc-4b80-93ce-20c0ab55f4dd" (UID: "3122f194-31cc-4b80-93ce-20c0ab55f4dd"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:17:12 crc kubenswrapper[4962]: I0220 10:17:12.848350 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3122f194-31cc-4b80-93ce-20c0ab55f4dd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3122f194-31cc-4b80-93ce-20c0ab55f4dd" (UID: "3122f194-31cc-4b80-93ce-20c0ab55f4dd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:17:12 crc kubenswrapper[4962]: I0220 10:17:12.939634 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3122f194-31cc-4b80-93ce-20c0ab55f4dd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:17:12 crc kubenswrapper[4962]: I0220 10:17:12.939686 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zb8hn\" (UniqueName: \"kubernetes.io/projected/3122f194-31cc-4b80-93ce-20c0ab55f4dd-kube-api-access-zb8hn\") on node \"crc\" DevicePath \"\"" Feb 20 10:17:12 crc kubenswrapper[4962]: I0220 10:17:12.939705 4962 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3122f194-31cc-4b80-93ce-20c0ab55f4dd-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:17:12 crc kubenswrapper[4962]: I0220 10:17:12.939717 4962 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3122f194-31cc-4b80-93ce-20c0ab55f4dd-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 20 10:17:12 crc kubenswrapper[4962]: I0220 10:17:12.939728 4962 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3122f194-31cc-4b80-93ce-20c0ab55f4dd-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 20 10:17:12 crc kubenswrapper[4962]: I0220 10:17:12.939738 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3122f194-31cc-4b80-93ce-20c0ab55f4dd-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:17:12 crc kubenswrapper[4962]: I0220 10:17:12.939752 4962 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3122f194-31cc-4b80-93ce-20c0ab55f4dd-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 20 10:17:12 crc kubenswrapper[4962]: I0220 10:17:12.939766 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3122f194-31cc-4b80-93ce-20c0ab55f4dd-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:17:13 crc kubenswrapper[4962]: I0220 10:17:13.154378 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c35c04f-5ec6-44c4-99d5-38a896dcae17" path="/var/lib/kubelet/pods/9c35c04f-5ec6-44c4-99d5-38a896dcae17/volumes" Feb 20 10:17:13 crc kubenswrapper[4962]: I0220 10:17:13.736861 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 20 10:17:13 crc kubenswrapper[4962]: I0220 10:17:13.831278 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 20 10:17:13 crc kubenswrapper[4962]: I0220 10:17:13.845930 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 20 10:17:13 crc kubenswrapper[4962]: I0220 10:17:13.856561 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 20 10:17:13 crc kubenswrapper[4962]: I0220 10:17:13.865585 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 20 10:17:13 crc kubenswrapper[4962]: I0220 10:17:13.868671 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 20 10:17:13 crc kubenswrapper[4962]: I0220 10:17:13.869217 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 20 10:17:13 crc kubenswrapper[4962]: I0220 10:17:13.869355 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 20 10:17:13 crc kubenswrapper[4962]: I0220 10:17:13.907786 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 20 10:17:13 crc kubenswrapper[4962]: I0220 10:17:13.969645 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fae69c76-754d-4125-a405-23a3938e90a9-scripts\") pod \"ceilometer-0\" (UID: \"fae69c76-754d-4125-a405-23a3938e90a9\") " pod="openstack/ceilometer-0" Feb 20 10:17:13 crc kubenswrapper[4962]: I0220 10:17:13.969699 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fae69c76-754d-4125-a405-23a3938e90a9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fae69c76-754d-4125-a405-23a3938e90a9\") " pod="openstack/ceilometer-0" Feb 20 10:17:13 crc kubenswrapper[4962]: I0220 10:17:13.969832 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fae69c76-754d-4125-a405-23a3938e90a9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fae69c76-754d-4125-a405-23a3938e90a9\") " pod="openstack/ceilometer-0" Feb 20 10:17:13 crc kubenswrapper[4962]: I0220 10:17:13.969882 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fae69c76-754d-4125-a405-23a3938e90a9-config-data\") pod \"ceilometer-0\" (UID: \"fae69c76-754d-4125-a405-23a3938e90a9\") " pod="openstack/ceilometer-0" Feb 20 10:17:13 crc kubenswrapper[4962]: I0220 10:17:13.969957 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/fae69c76-754d-4125-a405-23a3938e90a9-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"fae69c76-754d-4125-a405-23a3938e90a9\") " pod="openstack/ceilometer-0" Feb 20 10:17:13 crc kubenswrapper[4962]: I0220 10:17:13.970023 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fae69c76-754d-4125-a405-23a3938e90a9-run-httpd\") pod \"ceilometer-0\" (UID: \"fae69c76-754d-4125-a405-23a3938e90a9\") " pod="openstack/ceilometer-0" Feb 
20 10:17:13 crc kubenswrapper[4962]: I0220 10:17:13.970152 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fae69c76-754d-4125-a405-23a3938e90a9-log-httpd\") pod \"ceilometer-0\" (UID: \"fae69c76-754d-4125-a405-23a3938e90a9\") " pod="openstack/ceilometer-0" Feb 20 10:17:13 crc kubenswrapper[4962]: I0220 10:17:13.970242 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvmhh\" (UniqueName: \"kubernetes.io/projected/fae69c76-754d-4125-a405-23a3938e90a9-kube-api-access-mvmhh\") pod \"ceilometer-0\" (UID: \"fae69c76-754d-4125-a405-23a3938e90a9\") " pod="openstack/ceilometer-0" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.071709 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fae69c76-754d-4125-a405-23a3938e90a9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fae69c76-754d-4125-a405-23a3938e90a9\") " pod="openstack/ceilometer-0" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.071808 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fae69c76-754d-4125-a405-23a3938e90a9-config-data\") pod \"ceilometer-0\" (UID: \"fae69c76-754d-4125-a405-23a3938e90a9\") " pod="openstack/ceilometer-0" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.071887 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/fae69c76-754d-4125-a405-23a3938e90a9-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"fae69c76-754d-4125-a405-23a3938e90a9\") " pod="openstack/ceilometer-0" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.071992 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fae69c76-754d-4125-a405-23a3938e90a9-run-httpd\") pod \"ceilometer-0\" (UID: \"fae69c76-754d-4125-a405-23a3938e90a9\") " pod="openstack/ceilometer-0" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.072099 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fae69c76-754d-4125-a405-23a3938e90a9-log-httpd\") pod \"ceilometer-0\" (UID: \"fae69c76-754d-4125-a405-23a3938e90a9\") " pod="openstack/ceilometer-0" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.072190 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvmhh\" (UniqueName: \"kubernetes.io/projected/fae69c76-754d-4125-a405-23a3938e90a9-kube-api-access-mvmhh\") pod \"ceilometer-0\" (UID: \"fae69c76-754d-4125-a405-23a3938e90a9\") " pod="openstack/ceilometer-0" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.072261 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fae69c76-754d-4125-a405-23a3938e90a9-scripts\") pod \"ceilometer-0\" (UID: \"fae69c76-754d-4125-a405-23a3938e90a9\") " pod="openstack/ceilometer-0" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.072329 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fae69c76-754d-4125-a405-23a3938e90a9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"fae69c76-754d-4125-a405-23a3938e90a9\") " pod="openstack/ceilometer-0" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.073252 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fae69c76-754d-4125-a405-23a3938e90a9-log-httpd\") pod \"ceilometer-0\" (UID: \"fae69c76-754d-4125-a405-23a3938e90a9\") " pod="openstack/ceilometer-0" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.077133 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fae69c76-754d-4125-a405-23a3938e90a9-run-httpd\") pod \"ceilometer-0\" (UID: \"fae69c76-754d-4125-a405-23a3938e90a9\") " pod="openstack/ceilometer-0" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.079631 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fae69c76-754d-4125-a405-23a3938e90a9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fae69c76-754d-4125-a405-23a3938e90a9\") " pod="openstack/ceilometer-0" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.080185 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fae69c76-754d-4125-a405-23a3938e90a9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fae69c76-754d-4125-a405-23a3938e90a9\") " pod="openstack/ceilometer-0" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.080729 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fae69c76-754d-4125-a405-23a3938e90a9-scripts\") pod \"ceilometer-0\" (UID: \"fae69c76-754d-4125-a405-23a3938e90a9\") " pod="openstack/ceilometer-0" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.100727 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fae69c76-754d-4125-a405-23a3938e90a9-config-data\") pod \"ceilometer-0\" (UID: \"fae69c76-754d-4125-a405-23a3938e90a9\") " pod="openstack/ceilometer-0" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.100840 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/fae69c76-754d-4125-a405-23a3938e90a9-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"fae69c76-754d-4125-a405-23a3938e90a9\") " pod="openstack/ceilometer-0" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.107464 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvmhh\" (UniqueName: \"kubernetes.io/projected/fae69c76-754d-4125-a405-23a3938e90a9-kube-api-access-mvmhh\") pod \"ceilometer-0\" (UID: \"fae69c76-754d-4125-a405-23a3938e90a9\") " pod="openstack/ceilometer-0" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.249136 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.256612 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.286037 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.325884 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.379768 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebe24b8d-7968-4806-a924-d932f167185f-combined-ca-bundle\") pod \"ebe24b8d-7968-4806-a924-d932f167185f\" (UID: \"ebe24b8d-7968-4806-a924-d932f167185f\") " Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.379894 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ebe24b8d-7968-4806-a924-d932f167185f-logs\") pod \"ebe24b8d-7968-4806-a924-d932f167185f\" (UID: \"ebe24b8d-7968-4806-a924-d932f167185f\") " Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.380019 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzrbw\" (UniqueName: \"kubernetes.io/projected/ebe24b8d-7968-4806-a924-d932f167185f-kube-api-access-nzrbw\") pod \"ebe24b8d-7968-4806-a924-d932f167185f\" (UID: \"ebe24b8d-7968-4806-a924-d932f167185f\") " Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.380148 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebe24b8d-7968-4806-a924-d932f167185f-config-data\") pod \"ebe24b8d-7968-4806-a924-d932f167185f\" (UID: \"ebe24b8d-7968-4806-a924-d932f167185f\") " Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.383253 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebe24b8d-7968-4806-a924-d932f167185f-logs" (OuterVolumeSpecName: "logs") pod "ebe24b8d-7968-4806-a924-d932f167185f" (UID: "ebe24b8d-7968-4806-a924-d932f167185f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.388777 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebe24b8d-7968-4806-a924-d932f167185f-kube-api-access-nzrbw" (OuterVolumeSpecName: "kube-api-access-nzrbw") pod "ebe24b8d-7968-4806-a924-d932f167185f" (UID: "ebe24b8d-7968-4806-a924-d932f167185f"). InnerVolumeSpecName "kube-api-access-nzrbw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.436792 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebe24b8d-7968-4806-a924-d932f167185f-config-data" (OuterVolumeSpecName: "config-data") pod "ebe24b8d-7968-4806-a924-d932f167185f" (UID: "ebe24b8d-7968-4806-a924-d932f167185f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.447694 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebe24b8d-7968-4806-a924-d932f167185f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ebe24b8d-7968-4806-a924-d932f167185f" (UID: "ebe24b8d-7968-4806-a924-d932f167185f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.487398 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebe24b8d-7968-4806-a924-d932f167185f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.487439 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ebe24b8d-7968-4806-a924-d932f167185f-logs\") on node \"crc\" DevicePath \"\"" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.487452 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzrbw\" (UniqueName: \"kubernetes.io/projected/ebe24b8d-7968-4806-a924-d932f167185f-kube-api-access-nzrbw\") on node \"crc\" DevicePath \"\"" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.487463 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebe24b8d-7968-4806-a924-d932f167185f-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.756480 4962 generic.go:334] "Generic (PLEG): container finished" podID="ebe24b8d-7968-4806-a924-d932f167185f" containerID="a4781260e567499e59c61c3395733568454a97751d695160f009970b7852f29d" exitCode=0 Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.757998 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.759016 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ebe24b8d-7968-4806-a924-d932f167185f","Type":"ContainerDied","Data":"a4781260e567499e59c61c3395733568454a97751d695160f009970b7852f29d"} Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.759121 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ebe24b8d-7968-4806-a924-d932f167185f","Type":"ContainerDied","Data":"e64c9159ecdad906d9c9962019b5392b9e58a7c0e3d51ef95e7459a8b2298a1e"} Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.759252 4962 scope.go:117] "RemoveContainer" containerID="a4781260e567499e59c61c3395733568454a97751d695160f009970b7852f29d" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.800391 4962 scope.go:117] "RemoveContainer" containerID="16ac13db628828db750794374323308f29b229d310a915dc23b1545983fc0046" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.800649 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.810181 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 20 10:17:14 crc kubenswrapper[4962]: W0220 10:17:14.811808 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfae69c76_754d_4125_a405_23a3938e90a9.slice/crio-498a8615ffc0d02ef23136be3c7f8346a8aa655c4297248e31b8a2413028fcd9 WatchSource:0}: Error finding container 498a8615ffc0d02ef23136be3c7f8346a8aa655c4297248e31b8a2413028fcd9: Status 404 returned error can't find the container with id 498a8615ffc0d02ef23136be3c7f8346a8aa655c4297248e31b8a2413028fcd9 Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.841227 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 
10:17:14.853104 4962 scope.go:117] "RemoveContainer" containerID="a4781260e567499e59c61c3395733568454a97751d695160f009970b7852f29d" Feb 20 10:17:14 crc kubenswrapper[4962]: E0220 10:17:14.856691 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4781260e567499e59c61c3395733568454a97751d695160f009970b7852f29d\": container with ID starting with a4781260e567499e59c61c3395733568454a97751d695160f009970b7852f29d not found: ID does not exist" containerID="a4781260e567499e59c61c3395733568454a97751d695160f009970b7852f29d" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.857144 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4781260e567499e59c61c3395733568454a97751d695160f009970b7852f29d"} err="failed to get container status \"a4781260e567499e59c61c3395733568454a97751d695160f009970b7852f29d\": rpc error: code = NotFound desc = could not find container \"a4781260e567499e59c61c3395733568454a97751d695160f009970b7852f29d\": container with ID starting with a4781260e567499e59c61c3395733568454a97751d695160f009970b7852f29d not found: ID does not exist" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.857180 4962 scope.go:117] "RemoveContainer" containerID="16ac13db628828db750794374323308f29b229d310a915dc23b1545983fc0046" Feb 20 10:17:14 crc kubenswrapper[4962]: E0220 10:17:14.858884 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16ac13db628828db750794374323308f29b229d310a915dc23b1545983fc0046\": container with ID starting with 16ac13db628828db750794374323308f29b229d310a915dc23b1545983fc0046 not found: ID does not exist" containerID="16ac13db628828db750794374323308f29b229d310a915dc23b1545983fc0046" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.858980 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16ac13db628828db750794374323308f29b229d310a915dc23b1545983fc0046"} err="failed to get container status \"16ac13db628828db750794374323308f29b229d310a915dc23b1545983fc0046\": rpc error: code = NotFound desc = could not find container \"16ac13db628828db750794374323308f29b229d310a915dc23b1545983fc0046\": container with ID starting with 16ac13db628828db750794374323308f29b229d310a915dc23b1545983fc0046 not found: ID does not exist" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.867506 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.879666 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 20 10:17:14 crc kubenswrapper[4962]: E0220 10:17:14.880241 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebe24b8d-7968-4806-a924-d932f167185f" containerName="nova-api-api" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.880268 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebe24b8d-7968-4806-a924-d932f167185f" containerName="nova-api-api" Feb 20 10:17:14 crc kubenswrapper[4962]: E0220 10:17:14.880316 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebe24b8d-7968-4806-a924-d932f167185f" containerName="nova-api-log" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.880323 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebe24b8d-7968-4806-a924-d932f167185f" containerName="nova-api-log" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 
10:17:14.880527 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebe24b8d-7968-4806-a924-d932f167185f" containerName="nova-api-log" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.880559 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebe24b8d-7968-4806-a924-d932f167185f" containerName="nova-api-api" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.882227 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.887078 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.887408 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.887795 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.896086 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ede4992-1b80-4f08-a232-84f283cfedde-logs\") pod \"nova-api-0\" (UID: \"8ede4992-1b80-4f08-a232-84f283cfedde\") " pod="openstack/nova-api-0" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.896142 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtxfs\" (UniqueName: \"kubernetes.io/projected/8ede4992-1b80-4f08-a232-84f283cfedde-kube-api-access-xtxfs\") pod \"nova-api-0\" (UID: \"8ede4992-1b80-4f08-a232-84f283cfedde\") " pod="openstack/nova-api-0" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.896176 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ede4992-1b80-4f08-a232-84f283cfedde-config-data\") pod \"nova-api-0\" (UID: \"8ede4992-1b80-4f08-a232-84f283cfedde\") " pod="openstack/nova-api-0" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.896235 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ede4992-1b80-4f08-a232-84f283cfedde-internal-tls-certs\") pod \"nova-api-0\" (UID: \"8ede4992-1b80-4f08-a232-84f283cfedde\") " pod="openstack/nova-api-0" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.896284 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ede4992-1b80-4f08-a232-84f283cfedde-public-tls-certs\") pod \"nova-api-0\" (UID: \"8ede4992-1b80-4f08-a232-84f283cfedde\") " pod="openstack/nova-api-0" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.896320 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ede4992-1b80-4f08-a232-84f283cfedde-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8ede4992-1b80-4f08-a232-84f283cfedde\") " pod="openstack/nova-api-0" Feb 20 10:17:14 crc kubenswrapper[4962]: I0220 10:17:14.958370 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 20 10:17:15 crc kubenswrapper[4962]: I0220 10:17:14.999218 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ede4992-1b80-4f08-a232-84f283cfedde-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8ede4992-1b80-4f08-a232-84f283cfedde\") " pod="openstack/nova-api-0" Feb 20 10:17:15 crc kubenswrapper[4962]: I0220 10:17:14.999322 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ede4992-1b80-4f08-a232-84f283cfedde-logs\") pod \"nova-api-0\" (UID: \"8ede4992-1b80-4f08-a232-84f283cfedde\") " pod="openstack/nova-api-0" Feb 20 10:17:15 crc kubenswrapper[4962]: I0220 10:17:14.999350 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtxfs\" (UniqueName: \"kubernetes.io/projected/8ede4992-1b80-4f08-a232-84f283cfedde-kube-api-access-xtxfs\") pod \"nova-api-0\" (UID: \"8ede4992-1b80-4f08-a232-84f283cfedde\") " pod="openstack/nova-api-0" Feb 20 10:17:15 crc kubenswrapper[4962]: I0220 10:17:14.999376 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ede4992-1b80-4f08-a232-84f283cfedde-config-data\") pod \"nova-api-0\" (UID: \"8ede4992-1b80-4f08-a232-84f283cfedde\") " pod="openstack/nova-api-0" Feb 20 10:17:15 crc kubenswrapper[4962]: I0220 10:17:14.999431 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ede4992-1b80-4f08-a232-84f283cfedde-internal-tls-certs\") pod \"nova-api-0\" (UID: \"8ede4992-1b80-4f08-a232-84f283cfedde\") " pod="openstack/nova-api-0" Feb 20 10:17:15 crc kubenswrapper[4962]: I0220 10:17:14.999483 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ede4992-1b80-4f08-a232-84f283cfedde-public-tls-certs\") pod \"nova-api-0\" (UID: \"8ede4992-1b80-4f08-a232-84f283cfedde\") " pod="openstack/nova-api-0" Feb 20 10:17:15 crc kubenswrapper[4962]: I0220 10:17:15.004658 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ede4992-1b80-4f08-a232-84f283cfedde-logs\") pod \"nova-api-0\" (UID: \"8ede4992-1b80-4f08-a232-84f283cfedde\") " pod="openstack/nova-api-0" Feb 20 10:17:15 crc kubenswrapper[4962]: I0220 10:17:15.009407 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ede4992-1b80-4f08-a232-84f283cfedde-public-tls-certs\") pod \"nova-api-0\" (UID: \"8ede4992-1b80-4f08-a232-84f283cfedde\") " pod="openstack/nova-api-0" Feb 20 10:17:15 crc kubenswrapper[4962]: I0220 10:17:15.016387 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ede4992-1b80-4f08-a232-84f283cfedde-config-data\") pod \"nova-api-0\" (UID: \"8ede4992-1b80-4f08-a232-84f283cfedde\") " pod="openstack/nova-api-0" Feb 20 10:17:15 crc kubenswrapper[4962]: I0220 10:17:15.017172 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ede4992-1b80-4f08-a232-84f283cfedde-internal-tls-certs\") pod \"nova-api-0\" (UID: \"8ede4992-1b80-4f08-a232-84f283cfedde\") " pod="openstack/nova-api-0" Feb 20 10:17:15 crc kubenswrapper[4962]: I0220 10:17:15.038304 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8ede4992-1b80-4f08-a232-84f283cfedde-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8ede4992-1b80-4f08-a232-84f283cfedde\") " pod="openstack/nova-api-0" Feb 20 10:17:15 crc kubenswrapper[4962]: I0220 10:17:15.042206 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtxfs\" (UniqueName: \"kubernetes.io/projected/8ede4992-1b80-4f08-a232-84f283cfedde-kube-api-access-xtxfs\") pod \"nova-api-0\" (UID: \"8ede4992-1b80-4f08-a232-84f283cfedde\") " pod="openstack/nova-api-0" Feb 20 10:17:15 crc kubenswrapper[4962]: I0220 10:17:15.118670 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-mm68z"] Feb 20 10:17:15 crc kubenswrapper[4962]: I0220 10:17:15.120226 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-mm68z" Feb 20 10:17:15 crc kubenswrapper[4962]: I0220 10:17:15.126037 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Feb 20 10:17:15 crc kubenswrapper[4962]: I0220 10:17:15.126273 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Feb 20 10:17:15 crc kubenswrapper[4962]: I0220 10:17:15.128814 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-mm68z"] Feb 20 10:17:15 crc kubenswrapper[4962]: I0220 10:17:15.174705 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3122f194-31cc-4b80-93ce-20c0ab55f4dd" path="/var/lib/kubelet/pods/3122f194-31cc-4b80-93ce-20c0ab55f4dd/volumes" Feb 20 10:17:15 crc kubenswrapper[4962]: I0220 10:17:15.175474 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebe24b8d-7968-4806-a924-d932f167185f" path="/var/lib/kubelet/pods/ebe24b8d-7968-4806-a924-d932f167185f/volumes" Feb 20 10:17:15 crc kubenswrapper[4962]: I0220 10:17:15.205553 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39a7b81e-d4af-478f-b2c3-d21f117ad7ec-config-data\") pod \"nova-cell1-cell-mapping-mm68z\" (UID: \"39a7b81e-d4af-478f-b2c3-d21f117ad7ec\") " pod="openstack/nova-cell1-cell-mapping-mm68z" Feb 20 10:17:15 crc kubenswrapper[4962]: I0220 10:17:15.205818 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39a7b81e-d4af-478f-b2c3-d21f117ad7ec-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-mm68z\" (UID: \"39a7b81e-d4af-478f-b2c3-d21f117ad7ec\") " pod="openstack/nova-cell1-cell-mapping-mm68z" Feb 20 10:17:15 crc kubenswrapper[4962]: I0220 10:17:15.205925 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrtxr\" (UniqueName: \"kubernetes.io/projected/39a7b81e-d4af-478f-b2c3-d21f117ad7ec-kube-api-access-xrtxr\") pod \"nova-cell1-cell-mapping-mm68z\" (UID: \"39a7b81e-d4af-478f-b2c3-d21f117ad7ec\") " pod="openstack/nova-cell1-cell-mapping-mm68z" Feb 20 10:17:15 crc kubenswrapper[4962]: I0220 10:17:15.205955 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39a7b81e-d4af-478f-b2c3-d21f117ad7ec-scripts\") pod \"nova-cell1-cell-mapping-mm68z\" (UID: \"39a7b81e-d4af-478f-b2c3-d21f117ad7ec\") " pod="openstack/nova-cell1-cell-mapping-mm68z" Feb 20 10:17:15 crc 
kubenswrapper[4962]: I0220 10:17:15.251626 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 20 10:17:15 crc kubenswrapper[4962]: I0220 10:17:15.306745 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39a7b81e-d4af-478f-b2c3-d21f117ad7ec-config-data\") pod \"nova-cell1-cell-mapping-mm68z\" (UID: \"39a7b81e-d4af-478f-b2c3-d21f117ad7ec\") " pod="openstack/nova-cell1-cell-mapping-mm68z" Feb 20 10:17:15 crc kubenswrapper[4962]: I0220 10:17:15.306850 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39a7b81e-d4af-478f-b2c3-d21f117ad7ec-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-mm68z\" (UID: \"39a7b81e-d4af-478f-b2c3-d21f117ad7ec\") " pod="openstack/nova-cell1-cell-mapping-mm68z" Feb 20 10:17:15 crc kubenswrapper[4962]: I0220 10:17:15.306906 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrtxr\" (UniqueName: \"kubernetes.io/projected/39a7b81e-d4af-478f-b2c3-d21f117ad7ec-kube-api-access-xrtxr\") pod \"nova-cell1-cell-mapping-mm68z\" (UID: \"39a7b81e-d4af-478f-b2c3-d21f117ad7ec\") " pod="openstack/nova-cell1-cell-mapping-mm68z" Feb 20 10:17:15 crc kubenswrapper[4962]: I0220 10:17:15.306934 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39a7b81e-d4af-478f-b2c3-d21f117ad7ec-scripts\") pod \"nova-cell1-cell-mapping-mm68z\" (UID: \"39a7b81e-d4af-478f-b2c3-d21f117ad7ec\") " pod="openstack/nova-cell1-cell-mapping-mm68z" Feb 20 10:17:15 crc kubenswrapper[4962]: I0220 10:17:15.311571 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39a7b81e-d4af-478f-b2c3-d21f117ad7ec-scripts\") pod \"nova-cell1-cell-mapping-mm68z\" (UID: \"39a7b81e-d4af-478f-b2c3-d21f117ad7ec\") " pod="openstack/nova-cell1-cell-mapping-mm68z" Feb 20 10:17:15 crc kubenswrapper[4962]: I0220 10:17:15.312157 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39a7b81e-d4af-478f-b2c3-d21f117ad7ec-config-data\") pod \"nova-cell1-cell-mapping-mm68z\" (UID: \"39a7b81e-d4af-478f-b2c3-d21f117ad7ec\") " pod="openstack/nova-cell1-cell-mapping-mm68z" Feb 20 10:17:15 crc kubenswrapper[4962]: I0220 10:17:15.318051 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39a7b81e-d4af-478f-b2c3-d21f117ad7ec-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-mm68z\" (UID: \"39a7b81e-d4af-478f-b2c3-d21f117ad7ec\") " pod="openstack/nova-cell1-cell-mapping-mm68z" Feb 20 10:17:15 crc kubenswrapper[4962]: I0220 10:17:15.328964 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrtxr\" (UniqueName: \"kubernetes.io/projected/39a7b81e-d4af-478f-b2c3-d21f117ad7ec-kube-api-access-xrtxr\") pod \"nova-cell1-cell-mapping-mm68z\" (UID: \"39a7b81e-d4af-478f-b2c3-d21f117ad7ec\") " pod="openstack/nova-cell1-cell-mapping-mm68z" Feb 20 10:17:15 crc kubenswrapper[4962]: I0220 10:17:15.459213 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-mm68z" Feb 20 10:17:15 crc kubenswrapper[4962]: I0220 10:17:15.742695 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 20 10:17:15 crc kubenswrapper[4962]: I0220 10:17:15.774568 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fae69c76-754d-4125-a405-23a3938e90a9","Type":"ContainerStarted","Data":"cdb5b15ea05e323a5f856da44e27bde808d02494e9d53ffb4bc777be963ee11a"} Feb 20 10:17:15 crc kubenswrapper[4962]: I0220 10:17:15.774635 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fae69c76-754d-4125-a405-23a3938e90a9","Type":"ContainerStarted","Data":"498a8615ffc0d02ef23136be3c7f8346a8aa655c4297248e31b8a2413028fcd9"} Feb 20 10:17:15 crc kubenswrapper[4962]: I0220 10:17:15.777753 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8ede4992-1b80-4f08-a232-84f283cfedde","Type":"ContainerStarted","Data":"125fe601dfdf6769c35ec31a4db3fb414e225c1f0afbec478eb5d8be4fdc6a86"} Feb 20 10:17:15 crc kubenswrapper[4962]: I0220 10:17:15.947871 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-mm68z"] Feb 20 10:17:15 crc kubenswrapper[4962]: W0220 10:17:15.959962 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod39a7b81e_d4af_478f_b2c3_d21f117ad7ec.slice/crio-2de6d4630699b7dec16308da621d16bd7f9b38bab3938ab494efec264fd7f364 WatchSource:0}: Error finding container 2de6d4630699b7dec16308da621d16bd7f9b38bab3938ab494efec264fd7f364: Status 404 returned error can't find the container with id 2de6d4630699b7dec16308da621d16bd7f9b38bab3938ab494efec264fd7f364 Feb 20 10:17:16 crc kubenswrapper[4962]: I0220 10:17:16.789477 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fae69c76-754d-4125-a405-23a3938e90a9","Type":"ContainerStarted","Data":"ecce7d5cc120360c76c90c0a94a6162a452698a47d63fb49fa2ed866e4ad8917"} Feb 20 10:17:16 crc kubenswrapper[4962]: I0220 10:17:16.792922 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-mm68z" event={"ID":"39a7b81e-d4af-478f-b2c3-d21f117ad7ec","Type":"ContainerStarted","Data":"f9e4860c3043e0b48490e065c36e81c4bc365aa4ac0725e20676491c7054e577"} Feb 20 10:17:16 crc kubenswrapper[4962]: I0220 10:17:16.792980 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-mm68z" event={"ID":"39a7b81e-d4af-478f-b2c3-d21f117ad7ec","Type":"ContainerStarted","Data":"2de6d4630699b7dec16308da621d16bd7f9b38bab3938ab494efec264fd7f364"} Feb 20 10:17:16 crc kubenswrapper[4962]: I0220 10:17:16.795109 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8ede4992-1b80-4f08-a232-84f283cfedde","Type":"ContainerStarted","Data":"c7c51a5ddb5719d1788010d50cf855f08122f4f46a5c8fa9448ae3b2b6b851a4"} Feb 20 10:17:16 crc kubenswrapper[4962]: I0220 10:17:16.795164 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8ede4992-1b80-4f08-a232-84f283cfedde","Type":"ContainerStarted","Data":"2d77c58adaf530047a1185fa5b97807506294dea924b607ee985f272be6c4f06"} Feb 20 10:17:16 crc kubenswrapper[4962]: I0220 10:17:16.812275 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-mm68z" 
podStartSLOduration=1.81225799 podStartE2EDuration="1.81225799s" podCreationTimestamp="2026-02-20 10:17:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:17:16.81097792 +0000 UTC m=+1328.393449766" watchObservedRunningTime="2026-02-20 10:17:16.81225799 +0000 UTC m=+1328.394729836" Feb 20 10:17:16 crc kubenswrapper[4962]: I0220 10:17:16.838350 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.838328473 podStartE2EDuration="2.838328473s" podCreationTimestamp="2026-02-20 10:17:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:17:16.832669029 +0000 UTC m=+1328.415140875" watchObservedRunningTime="2026-02-20 10:17:16.838328473 +0000 UTC m=+1328.420800319" Feb 20 10:17:17 crc kubenswrapper[4962]: I0220 10:17:17.807653 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fae69c76-754d-4125-a405-23a3938e90a9","Type":"ContainerStarted","Data":"6f8330d1d14a32a3610f17948811d4a9c71b61fcf7b72a4769e4f03066b35b1e"} Feb 20 10:17:18 crc kubenswrapper[4962]: I0220 10:17:18.189792 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-58f6456c9f-hl7mw" Feb 20 10:17:18 crc kubenswrapper[4962]: I0220 10:17:18.277393 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-849fff7679-6w4jk"] Feb 20 10:17:18 crc kubenswrapper[4962]: I0220 10:17:18.278343 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-849fff7679-6w4jk" podUID="619a1578-177c-476f-a471-e39ec43ebf20" containerName="dnsmasq-dns" containerID="cri-o://3630a9744e496f0e77b1e7fe2b46b1296585e37a881d13f49c24da70a4548858" gracePeriod=10 Feb 20 10:17:18 crc kubenswrapper[4962]: I0220 10:17:18.789058 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-849fff7679-6w4jk" Feb 20 10:17:18 crc kubenswrapper[4962]: I0220 10:17:18.821484 4962 generic.go:334] "Generic (PLEG): container finished" podID="619a1578-177c-476f-a471-e39ec43ebf20" containerID="3630a9744e496f0e77b1e7fe2b46b1296585e37a881d13f49c24da70a4548858" exitCode=0 Feb 20 10:17:18 crc kubenswrapper[4962]: I0220 10:17:18.821606 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-849fff7679-6w4jk" event={"ID":"619a1578-177c-476f-a471-e39ec43ebf20","Type":"ContainerDied","Data":"3630a9744e496f0e77b1e7fe2b46b1296585e37a881d13f49c24da70a4548858"} Feb 20 10:17:18 crc kubenswrapper[4962]: I0220 10:17:18.821648 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-849fff7679-6w4jk" event={"ID":"619a1578-177c-476f-a471-e39ec43ebf20","Type":"ContainerDied","Data":"323c0906415aaaf20c526ccc0a5760d7fccfd56336d524a538bc100ce5c3c6b2"} Feb 20 10:17:18 crc kubenswrapper[4962]: I0220 10:17:18.821671 4962 scope.go:117] "RemoveContainer" containerID="3630a9744e496f0e77b1e7fe2b46b1296585e37a881d13f49c24da70a4548858" Feb 20 10:17:18 crc kubenswrapper[4962]: I0220 10:17:18.821813 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-849fff7679-6w4jk" Feb 20 10:17:18 crc kubenswrapper[4962]: I0220 10:17:18.828748 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fae69c76-754d-4125-a405-23a3938e90a9","Type":"ContainerStarted","Data":"ee4780834b45dd3df9c5478d7f70a5b55b25c67044bc5c70a1699c36ee7a04a5"} Feb 20 10:17:18 crc kubenswrapper[4962]: I0220 10:17:18.828960 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 20 10:17:18 crc kubenswrapper[4962]: I0220 10:17:18.853161 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/619a1578-177c-476f-a471-e39ec43ebf20-config\") pod \"619a1578-177c-476f-a471-e39ec43ebf20\" (UID: \"619a1578-177c-476f-a471-e39ec43ebf20\") " Feb 20 10:17:18 crc kubenswrapper[4962]: I0220 10:17:18.853242 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/619a1578-177c-476f-a471-e39ec43ebf20-dns-svc\") pod \"619a1578-177c-476f-a471-e39ec43ebf20\" (UID: \"619a1578-177c-476f-a471-e39ec43ebf20\") " Feb 20 10:17:18 crc kubenswrapper[4962]: I0220 10:17:18.853409 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/619a1578-177c-476f-a471-e39ec43ebf20-dns-swift-storage-0\") pod \"619a1578-177c-476f-a471-e39ec43ebf20\" (UID: \"619a1578-177c-476f-a471-e39ec43ebf20\") " Feb 20 10:17:18 crc kubenswrapper[4962]: I0220 10:17:18.853447 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/619a1578-177c-476f-a471-e39ec43ebf20-ovsdbserver-sb\") pod \"619a1578-177c-476f-a471-e39ec43ebf20\" (UID: \"619a1578-177c-476f-a471-e39ec43ebf20\") " Feb 20 10:17:18 crc kubenswrapper[4962]: I0220 10:17:18.853517 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/619a1578-177c-476f-a471-e39ec43ebf20-ovsdbserver-nb\") pod \"619a1578-177c-476f-a471-e39ec43ebf20\" (UID: \"619a1578-177c-476f-a471-e39ec43ebf20\") " Feb 20 10:17:18 crc kubenswrapper[4962]: I0220 10:17:18.853616 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fnccj\" (UniqueName: \"kubernetes.io/projected/619a1578-177c-476f-a471-e39ec43ebf20-kube-api-access-fnccj\") pod \"619a1578-177c-476f-a471-e39ec43ebf20\" (UID: \"619a1578-177c-476f-a471-e39ec43ebf20\") " Feb 20 10:17:18 crc kubenswrapper[4962]: I0220 10:17:18.865060 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/619a1578-177c-476f-a471-e39ec43ebf20-kube-api-access-fnccj" (OuterVolumeSpecName: "kube-api-access-fnccj") pod "619a1578-177c-476f-a471-e39ec43ebf20" (UID: "619a1578-177c-476f-a471-e39ec43ebf20"). InnerVolumeSpecName "kube-api-access-fnccj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:17:18 crc kubenswrapper[4962]: I0220 10:17:18.867151 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.5195883439999998 podStartE2EDuration="5.867121649s" podCreationTimestamp="2026-02-20 10:17:13 +0000 UTC" firstStartedPulling="2026-02-20 10:17:14.856246466 +0000 UTC m=+1326.438718312" lastFinishedPulling="2026-02-20 10:17:18.203779771 +0000 UTC m=+1329.786251617" observedRunningTime="2026-02-20 10:17:18.856020827 +0000 UTC m=+1330.438492673" watchObservedRunningTime="2026-02-20 10:17:18.867121649 +0000 UTC m=+1330.449593495" Feb 20 10:17:18 crc kubenswrapper[4962]: I0220 10:17:18.869644 4962 scope.go:117] "RemoveContainer" containerID="6fd2452641fc166c659c0aca31c5f68dd1c22702b25d2cc7444e617e2b88482a" Feb 20 10:17:18 crc kubenswrapper[4962]: I0220 10:17:18.890465 4962 scope.go:117] "RemoveContainer" containerID="3630a9744e496f0e77b1e7fe2b46b1296585e37a881d13f49c24da70a4548858" Feb 20 10:17:18 crc kubenswrapper[4962]: E0220 10:17:18.891063 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3630a9744e496f0e77b1e7fe2b46b1296585e37a881d13f49c24da70a4548858\": container with ID starting with 3630a9744e496f0e77b1e7fe2b46b1296585e37a881d13f49c24da70a4548858 not found: ID does not exist" containerID="3630a9744e496f0e77b1e7fe2b46b1296585e37a881d13f49c24da70a4548858" Feb 20 10:17:18 crc kubenswrapper[4962]: I0220 10:17:18.891119 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3630a9744e496f0e77b1e7fe2b46b1296585e37a881d13f49c24da70a4548858"} err="failed to get container status \"3630a9744e496f0e77b1e7fe2b46b1296585e37a881d13f49c24da70a4548858\": rpc error: code = NotFound desc = could not find container \"3630a9744e496f0e77b1e7fe2b46b1296585e37a881d13f49c24da70a4548858\": container with ID starting with 3630a9744e496f0e77b1e7fe2b46b1296585e37a881d13f49c24da70a4548858 not found: ID does not exist" Feb 20 10:17:18 crc kubenswrapper[4962]: I0220 10:17:18.891154 4962 scope.go:117] "RemoveContainer" containerID="6fd2452641fc166c659c0aca31c5f68dd1c22702b25d2cc7444e617e2b88482a" Feb 20 10:17:18 crc kubenswrapper[4962]: E0220 10:17:18.891570 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6fd2452641fc166c659c0aca31c5f68dd1c22702b25d2cc7444e617e2b88482a\": container with ID starting with 6fd2452641fc166c659c0aca31c5f68dd1c22702b25d2cc7444e617e2b88482a not found: ID does not exist" containerID="6fd2452641fc166c659c0aca31c5f68dd1c22702b25d2cc7444e617e2b88482a" Feb 20 10:17:18 crc kubenswrapper[4962]: I0220 10:17:18.891606 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fd2452641fc166c659c0aca31c5f68dd1c22702b25d2cc7444e617e2b88482a"} err="failed to get container status \"6fd2452641fc166c659c0aca31c5f68dd1c22702b25d2cc7444e617e2b88482a\": rpc error: code = NotFound desc = could not find container \"6fd2452641fc166c659c0aca31c5f68dd1c22702b25d2cc7444e617e2b88482a\": container with ID starting with 6fd2452641fc166c659c0aca31c5f68dd1c22702b25d2cc7444e617e2b88482a not found: ID does not exist" Feb 20 10:17:18 crc kubenswrapper[4962]: I0220 10:17:18.923738 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/619a1578-177c-476f-a471-e39ec43ebf20-config" 
(OuterVolumeSpecName: "config") pod "619a1578-177c-476f-a471-e39ec43ebf20" (UID: "619a1578-177c-476f-a471-e39ec43ebf20"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:17:18 crc kubenswrapper[4962]: I0220 10:17:18.923748 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/619a1578-177c-476f-a471-e39ec43ebf20-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "619a1578-177c-476f-a471-e39ec43ebf20" (UID: "619a1578-177c-476f-a471-e39ec43ebf20"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:17:18 crc kubenswrapper[4962]: I0220 10:17:18.927725 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/619a1578-177c-476f-a471-e39ec43ebf20-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "619a1578-177c-476f-a471-e39ec43ebf20" (UID: "619a1578-177c-476f-a471-e39ec43ebf20"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:17:18 crc kubenswrapper[4962]: I0220 10:17:18.929006 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/619a1578-177c-476f-a471-e39ec43ebf20-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "619a1578-177c-476f-a471-e39ec43ebf20" (UID: "619a1578-177c-476f-a471-e39ec43ebf20"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:17:18 crc kubenswrapper[4962]: I0220 10:17:18.937615 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/619a1578-177c-476f-a471-e39ec43ebf20-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "619a1578-177c-476f-a471-e39ec43ebf20" (UID: "619a1578-177c-476f-a471-e39ec43ebf20"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:17:18 crc kubenswrapper[4962]: I0220 10:17:18.956102 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/619a1578-177c-476f-a471-e39ec43ebf20-config\") on node \"crc\" DevicePath \"\"" Feb 20 10:17:18 crc kubenswrapper[4962]: I0220 10:17:18.956156 4962 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/619a1578-177c-476f-a471-e39ec43ebf20-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 20 10:17:18 crc kubenswrapper[4962]: I0220 10:17:18.956170 4962 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/619a1578-177c-476f-a471-e39ec43ebf20-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 20 10:17:18 crc kubenswrapper[4962]: I0220 10:17:18.956187 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/619a1578-177c-476f-a471-e39ec43ebf20-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 20 10:17:18 crc kubenswrapper[4962]: I0220 10:17:18.956199 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/619a1578-177c-476f-a471-e39ec43ebf20-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 20 10:17:18 crc kubenswrapper[4962]: I0220 10:17:18.956208 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fnccj\" (UniqueName: \"kubernetes.io/projected/619a1578-177c-476f-a471-e39ec43ebf20-kube-api-access-fnccj\") on node \"crc\" DevicePath \"\"" Feb 20 10:17:19 crc kubenswrapper[4962]: I0220 10:17:19.159455 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-849fff7679-6w4jk"] Feb 20 10:17:19 crc kubenswrapper[4962]: I0220 10:17:19.170774 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-849fff7679-6w4jk"] Feb 20 10:17:21 crc kubenswrapper[4962]: I0220 10:17:21.167454 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="619a1578-177c-476f-a471-e39ec43ebf20" path="/var/lib/kubelet/pods/619a1578-177c-476f-a471-e39ec43ebf20/volumes" Feb 20 10:17:21 crc kubenswrapper[4962]: I0220 10:17:21.892307 4962 generic.go:334] "Generic (PLEG): container finished" podID="39a7b81e-d4af-478f-b2c3-d21f117ad7ec" containerID="f9e4860c3043e0b48490e065c36e81c4bc365aa4ac0725e20676491c7054e577" exitCode=0 Feb 20 10:17:21 crc kubenswrapper[4962]: I0220 10:17:21.892413 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-mm68z" event={"ID":"39a7b81e-d4af-478f-b2c3-d21f117ad7ec","Type":"ContainerDied","Data":"f9e4860c3043e0b48490e065c36e81c4bc365aa4ac0725e20676491c7054e577"} Feb 20 10:17:23 crc kubenswrapper[4962]: I0220 10:17:23.344971 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-mm68z" Feb 20 10:17:23 crc kubenswrapper[4962]: I0220 10:17:23.382853 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xrtxr\" (UniqueName: \"kubernetes.io/projected/39a7b81e-d4af-478f-b2c3-d21f117ad7ec-kube-api-access-xrtxr\") pod \"39a7b81e-d4af-478f-b2c3-d21f117ad7ec\" (UID: \"39a7b81e-d4af-478f-b2c3-d21f117ad7ec\") " Feb 20 10:17:23 crc kubenswrapper[4962]: I0220 10:17:23.383109 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39a7b81e-d4af-478f-b2c3-d21f117ad7ec-config-data\") pod \"39a7b81e-d4af-478f-b2c3-d21f117ad7ec\" (UID: \"39a7b81e-d4af-478f-b2c3-d21f117ad7ec\") " Feb 20 10:17:23 crc kubenswrapper[4962]: I0220 10:17:23.383368 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39a7b81e-d4af-478f-b2c3-d21f117ad7ec-combined-ca-bundle\") pod \"39a7b81e-d4af-478f-b2c3-d21f117ad7ec\" (UID: \"39a7b81e-d4af-478f-b2c3-d21f117ad7ec\") " Feb 20 10:17:23 crc kubenswrapper[4962]: I0220 10:17:23.383555 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39a7b81e-d4af-478f-b2c3-d21f117ad7ec-scripts\") pod \"39a7b81e-d4af-478f-b2c3-d21f117ad7ec\" (UID: \"39a7b81e-d4af-478f-b2c3-d21f117ad7ec\") " Feb 20 10:17:23 crc kubenswrapper[4962]: I0220 10:17:23.416270 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39a7b81e-d4af-478f-b2c3-d21f117ad7ec-scripts" (OuterVolumeSpecName: "scripts") pod "39a7b81e-d4af-478f-b2c3-d21f117ad7ec" (UID: "39a7b81e-d4af-478f-b2c3-d21f117ad7ec"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:17:23 crc kubenswrapper[4962]: I0220 10:17:23.416748 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39a7b81e-d4af-478f-b2c3-d21f117ad7ec-kube-api-access-xrtxr" (OuterVolumeSpecName: "kube-api-access-xrtxr") pod "39a7b81e-d4af-478f-b2c3-d21f117ad7ec" (UID: "39a7b81e-d4af-478f-b2c3-d21f117ad7ec"). InnerVolumeSpecName "kube-api-access-xrtxr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:17:23 crc kubenswrapper[4962]: I0220 10:17:23.423251 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39a7b81e-d4af-478f-b2c3-d21f117ad7ec-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "39a7b81e-d4af-478f-b2c3-d21f117ad7ec" (UID: "39a7b81e-d4af-478f-b2c3-d21f117ad7ec"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:17:23 crc kubenswrapper[4962]: I0220 10:17:23.443284 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39a7b81e-d4af-478f-b2c3-d21f117ad7ec-config-data" (OuterVolumeSpecName: "config-data") pod "39a7b81e-d4af-478f-b2c3-d21f117ad7ec" (UID: "39a7b81e-d4af-478f-b2c3-d21f117ad7ec"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:17:23 crc kubenswrapper[4962]: I0220 10:17:23.485834 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39a7b81e-d4af-478f-b2c3-d21f117ad7ec-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:17:23 crc kubenswrapper[4962]: I0220 10:17:23.485873 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xrtxr\" (UniqueName: \"kubernetes.io/projected/39a7b81e-d4af-478f-b2c3-d21f117ad7ec-kube-api-access-xrtxr\") on node \"crc\" DevicePath \"\"" Feb 20 10:17:23 crc kubenswrapper[4962]: I0220 10:17:23.485889 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39a7b81e-d4af-478f-b2c3-d21f117ad7ec-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:17:23 crc kubenswrapper[4962]: I0220 10:17:23.485900 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39a7b81e-d4af-478f-b2c3-d21f117ad7ec-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:17:23 crc kubenswrapper[4962]: I0220 10:17:23.922515 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-mm68z" event={"ID":"39a7b81e-d4af-478f-b2c3-d21f117ad7ec","Type":"ContainerDied","Data":"2de6d4630699b7dec16308da621d16bd7f9b38bab3938ab494efec264fd7f364"} Feb 20 10:17:23 crc kubenswrapper[4962]: I0220 10:17:23.922565 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2de6d4630699b7dec16308da621d16bd7f9b38bab3938ab494efec264fd7f364" Feb 20 10:17:23 crc kubenswrapper[4962]: I0220 10:17:23.922663 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-mm68z" Feb 20 10:17:24 crc kubenswrapper[4962]: I0220 10:17:24.159893 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 20 10:17:24 crc kubenswrapper[4962]: I0220 10:17:24.160476 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="8ede4992-1b80-4f08-a232-84f283cfedde" containerName="nova-api-log" containerID="cri-o://2d77c58adaf530047a1185fa5b97807506294dea924b607ee985f272be6c4f06" gracePeriod=30 Feb 20 10:17:24 crc kubenswrapper[4962]: I0220 10:17:24.160655 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="8ede4992-1b80-4f08-a232-84f283cfedde" containerName="nova-api-api" containerID="cri-o://c7c51a5ddb5719d1788010d50cf855f08122f4f46a5c8fa9448ae3b2b6b851a4" gracePeriod=30 Feb 20 10:17:24 crc kubenswrapper[4962]: I0220 10:17:24.174263 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 10:17:24 crc kubenswrapper[4962]: I0220 10:17:24.174505 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="2a762e59-b6ef-4cdd-81f5-7f49dd78f810" containerName="nova-scheduler-scheduler" containerID="cri-o://8a4e849a67446e74bb7d851638980bf9f47f764932e6e5dbd43efc378ccb51b6" gracePeriod=30 Feb 20 10:17:24 crc kubenswrapper[4962]: I0220 10:17:24.216108 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 10:17:24 crc kubenswrapper[4962]: I0220 10:17:24.216875 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="bf680b24-e6dc-40a4-9ee4-521343fd9a28" 
containerName="nova-metadata-log" containerID="cri-o://0fc17421016c0a51f07f032e01424e9a427398b88ed8a196a9b8eaf3af4e366e" gracePeriod=30 Feb 20 10:17:24 crc kubenswrapper[4962]: I0220 10:17:24.217429 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="bf680b24-e6dc-40a4-9ee4-521343fd9a28" containerName="nova-metadata-metadata" containerID="cri-o://a2a1cf05c6cde763cbf0e416c61a88274821349634d663e3dc31de4c5f75317e" gracePeriod=30 Feb 20 10:17:24 crc kubenswrapper[4962]: I0220 10:17:24.818668 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 20 10:17:24 crc kubenswrapper[4962]: I0220 10:17:24.921109 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xtxfs\" (UniqueName: \"kubernetes.io/projected/8ede4992-1b80-4f08-a232-84f283cfedde-kube-api-access-xtxfs\") pod \"8ede4992-1b80-4f08-a232-84f283cfedde\" (UID: \"8ede4992-1b80-4f08-a232-84f283cfedde\") " Feb 20 10:17:24 crc kubenswrapper[4962]: I0220 10:17:24.921238 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ede4992-1b80-4f08-a232-84f283cfedde-config-data\") pod \"8ede4992-1b80-4f08-a232-84f283cfedde\" (UID: \"8ede4992-1b80-4f08-a232-84f283cfedde\") " Feb 20 10:17:24 crc kubenswrapper[4962]: I0220 10:17:24.922342 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ede4992-1b80-4f08-a232-84f283cfedde-combined-ca-bundle\") pod \"8ede4992-1b80-4f08-a232-84f283cfedde\" (UID: \"8ede4992-1b80-4f08-a232-84f283cfedde\") " Feb 20 10:17:24 crc kubenswrapper[4962]: I0220 10:17:24.922435 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ede4992-1b80-4f08-a232-84f283cfedde-internal-tls-certs\") pod \"8ede4992-1b80-4f08-a232-84f283cfedde\" (UID: \"8ede4992-1b80-4f08-a232-84f283cfedde\") " Feb 20 10:17:24 crc kubenswrapper[4962]: I0220 10:17:24.922543 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ede4992-1b80-4f08-a232-84f283cfedde-public-tls-certs\") pod \"8ede4992-1b80-4f08-a232-84f283cfedde\" (UID: \"8ede4992-1b80-4f08-a232-84f283cfedde\") " Feb 20 10:17:24 crc kubenswrapper[4962]: I0220 10:17:24.922776 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ede4992-1b80-4f08-a232-84f283cfedde-logs\") pod \"8ede4992-1b80-4f08-a232-84f283cfedde\" (UID: \"8ede4992-1b80-4f08-a232-84f283cfedde\") " Feb 20 10:17:24 crc kubenswrapper[4962]: I0220 10:17:24.923182 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ede4992-1b80-4f08-a232-84f283cfedde-logs" (OuterVolumeSpecName: "logs") pod "8ede4992-1b80-4f08-a232-84f283cfedde" (UID: "8ede4992-1b80-4f08-a232-84f283cfedde"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:17:24 crc kubenswrapper[4962]: I0220 10:17:24.923494 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ede4992-1b80-4f08-a232-84f283cfedde-logs\") on node \"crc\" DevicePath \"\"" Feb 20 10:17:24 crc kubenswrapper[4962]: I0220 10:17:24.930883 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ede4992-1b80-4f08-a232-84f283cfedde-kube-api-access-xtxfs" (OuterVolumeSpecName: "kube-api-access-xtxfs") pod "8ede4992-1b80-4f08-a232-84f283cfedde" (UID: "8ede4992-1b80-4f08-a232-84f283cfedde"). InnerVolumeSpecName "kube-api-access-xtxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:17:24 crc kubenswrapper[4962]: I0220 10:17:24.937557 4962 generic.go:334] "Generic (PLEG): container finished" podID="bf680b24-e6dc-40a4-9ee4-521343fd9a28" containerID="0fc17421016c0a51f07f032e01424e9a427398b88ed8a196a9b8eaf3af4e366e" exitCode=143 Feb 20 10:17:24 crc kubenswrapper[4962]: I0220 10:17:24.937636 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bf680b24-e6dc-40a4-9ee4-521343fd9a28","Type":"ContainerDied","Data":"0fc17421016c0a51f07f032e01424e9a427398b88ed8a196a9b8eaf3af4e366e"} Feb 20 10:17:24 crc kubenswrapper[4962]: I0220 10:17:24.940661 4962 generic.go:334] "Generic (PLEG): container finished" podID="8ede4992-1b80-4f08-a232-84f283cfedde" containerID="c7c51a5ddb5719d1788010d50cf855f08122f4f46a5c8fa9448ae3b2b6b851a4" exitCode=0 Feb 20 10:17:24 crc kubenswrapper[4962]: I0220 10:17:24.940692 4962 generic.go:334] "Generic (PLEG): container finished" podID="8ede4992-1b80-4f08-a232-84f283cfedde" containerID="2d77c58adaf530047a1185fa5b97807506294dea924b607ee985f272be6c4f06" exitCode=143 Feb 20 10:17:24 crc kubenswrapper[4962]: I0220 10:17:24.940718 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8ede4992-1b80-4f08-a232-84f283cfedde","Type":"ContainerDied","Data":"c7c51a5ddb5719d1788010d50cf855f08122f4f46a5c8fa9448ae3b2b6b851a4"} Feb 20 10:17:24 crc kubenswrapper[4962]: I0220 10:17:24.940750 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8ede4992-1b80-4f08-a232-84f283cfedde","Type":"ContainerDied","Data":"2d77c58adaf530047a1185fa5b97807506294dea924b607ee985f272be6c4f06"} Feb 20 10:17:24 crc kubenswrapper[4962]: I0220 10:17:24.940761 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8ede4992-1b80-4f08-a232-84f283cfedde","Type":"ContainerDied","Data":"125fe601dfdf6769c35ec31a4db3fb414e225c1f0afbec478eb5d8be4fdc6a86"} Feb 20 10:17:24 crc kubenswrapper[4962]: I0220 10:17:24.940778 4962 scope.go:117] "RemoveContainer" containerID="c7c51a5ddb5719d1788010d50cf855f08122f4f46a5c8fa9448ae3b2b6b851a4" Feb 20 10:17:24 crc kubenswrapper[4962]: I0220 10:17:24.940819 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 20 10:17:24 crc kubenswrapper[4962]: I0220 10:17:24.953988 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ede4992-1b80-4f08-a232-84f283cfedde-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8ede4992-1b80-4f08-a232-84f283cfedde" (UID: "8ede4992-1b80-4f08-a232-84f283cfedde"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:17:24 crc kubenswrapper[4962]: I0220 10:17:24.954737 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ede4992-1b80-4f08-a232-84f283cfedde-config-data" (OuterVolumeSpecName: "config-data") pod "8ede4992-1b80-4f08-a232-84f283cfedde" (UID: "8ede4992-1b80-4f08-a232-84f283cfedde"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:17:24 crc kubenswrapper[4962]: I0220 10:17:24.980514 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ede4992-1b80-4f08-a232-84f283cfedde-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "8ede4992-1b80-4f08-a232-84f283cfedde" (UID: "8ede4992-1b80-4f08-a232-84f283cfedde"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:17:24 crc kubenswrapper[4962]: I0220 10:17:24.995782 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ede4992-1b80-4f08-a232-84f283cfedde-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "8ede4992-1b80-4f08-a232-84f283cfedde" (UID: "8ede4992-1b80-4f08-a232-84f283cfedde"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.006930 4962 scope.go:117] "RemoveContainer" containerID="2d77c58adaf530047a1185fa5b97807506294dea924b607ee985f272be6c4f06" Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.025524 4962 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ede4992-1b80-4f08-a232-84f283cfedde-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.025564 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xtxfs\" (UniqueName: \"kubernetes.io/projected/8ede4992-1b80-4f08-a232-84f283cfedde-kube-api-access-xtxfs\") on node \"crc\" DevicePath \"\"" Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.025579 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ede4992-1b80-4f08-a232-84f283cfedde-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.025609 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ede4992-1b80-4f08-a232-84f283cfedde-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.025622 4962 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ede4992-1b80-4f08-a232-84f283cfedde-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.028366 4962 scope.go:117] "RemoveContainer" containerID="c7c51a5ddb5719d1788010d50cf855f08122f4f46a5c8fa9448ae3b2b6b851a4" Feb 20 10:17:25 crc kubenswrapper[4962]: E0220 10:17:25.028767 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7c51a5ddb5719d1788010d50cf855f08122f4f46a5c8fa9448ae3b2b6b851a4\": container with ID starting with c7c51a5ddb5719d1788010d50cf855f08122f4f46a5c8fa9448ae3b2b6b851a4 not found: ID does not exist" containerID="c7c51a5ddb5719d1788010d50cf855f08122f4f46a5c8fa9448ae3b2b6b851a4" Feb 
20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.028810 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7c51a5ddb5719d1788010d50cf855f08122f4f46a5c8fa9448ae3b2b6b851a4"} err="failed to get container status \"c7c51a5ddb5719d1788010d50cf855f08122f4f46a5c8fa9448ae3b2b6b851a4\": rpc error: code = NotFound desc = could not find container \"c7c51a5ddb5719d1788010d50cf855f08122f4f46a5c8fa9448ae3b2b6b851a4\": container with ID starting with c7c51a5ddb5719d1788010d50cf855f08122f4f46a5c8fa9448ae3b2b6b851a4 not found: ID does not exist" Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.028839 4962 scope.go:117] "RemoveContainer" containerID="2d77c58adaf530047a1185fa5b97807506294dea924b607ee985f272be6c4f06" Feb 20 10:17:25 crc kubenswrapper[4962]: E0220 10:17:25.029362 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d77c58adaf530047a1185fa5b97807506294dea924b607ee985f272be6c4f06\": container with ID starting with 2d77c58adaf530047a1185fa5b97807506294dea924b607ee985f272be6c4f06 not found: ID does not exist" containerID="2d77c58adaf530047a1185fa5b97807506294dea924b607ee985f272be6c4f06" Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.029393 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d77c58adaf530047a1185fa5b97807506294dea924b607ee985f272be6c4f06"} err="failed to get container status \"2d77c58adaf530047a1185fa5b97807506294dea924b607ee985f272be6c4f06\": rpc error: code = NotFound desc = could not find container \"2d77c58adaf530047a1185fa5b97807506294dea924b607ee985f272be6c4f06\": container with ID starting with 2d77c58adaf530047a1185fa5b97807506294dea924b607ee985f272be6c4f06 not found: ID does not exist" Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.029411 4962 scope.go:117] "RemoveContainer" containerID="c7c51a5ddb5719d1788010d50cf855f08122f4f46a5c8fa9448ae3b2b6b851a4" Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.029720 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7c51a5ddb5719d1788010d50cf855f08122f4f46a5c8fa9448ae3b2b6b851a4"} err="failed to get container status \"c7c51a5ddb5719d1788010d50cf855f08122f4f46a5c8fa9448ae3b2b6b851a4\": rpc error: code = NotFound desc = could not find container \"c7c51a5ddb5719d1788010d50cf855f08122f4f46a5c8fa9448ae3b2b6b851a4\": container with ID starting with c7c51a5ddb5719d1788010d50cf855f08122f4f46a5c8fa9448ae3b2b6b851a4 not found: ID does not exist" Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.029743 4962 scope.go:117] "RemoveContainer" containerID="2d77c58adaf530047a1185fa5b97807506294dea924b607ee985f272be6c4f06" Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.030013 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d77c58adaf530047a1185fa5b97807506294dea924b607ee985f272be6c4f06"} err="failed to get container status \"2d77c58adaf530047a1185fa5b97807506294dea924b607ee985f272be6c4f06\": rpc error: code = NotFound desc = could not find container \"2d77c58adaf530047a1185fa5b97807506294dea924b607ee985f272be6c4f06\": container with ID starting with 2d77c58adaf530047a1185fa5b97807506294dea924b607ee985f272be6c4f06 not found: ID does not exist" Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.275411 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 20 10:17:25 crc 
kubenswrapper[4962]: I0220 10:17:25.294229 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.306175 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 20 10:17:25 crc kubenswrapper[4962]: E0220 10:17:25.306909 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39a7b81e-d4af-478f-b2c3-d21f117ad7ec" containerName="nova-manage" Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.306979 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="39a7b81e-d4af-478f-b2c3-d21f117ad7ec" containerName="nova-manage" Feb 20 10:17:25 crc kubenswrapper[4962]: E0220 10:17:25.307051 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ede4992-1b80-4f08-a232-84f283cfedde" containerName="nova-api-api" Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.307104 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ede4992-1b80-4f08-a232-84f283cfedde" containerName="nova-api-api" Feb 20 10:17:25 crc kubenswrapper[4962]: E0220 10:17:25.307165 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="619a1578-177c-476f-a471-e39ec43ebf20" containerName="init" Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.307216 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="619a1578-177c-476f-a471-e39ec43ebf20" containerName="init" Feb 20 10:17:25 crc kubenswrapper[4962]: E0220 10:17:25.307274 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ede4992-1b80-4f08-a232-84f283cfedde" containerName="nova-api-log" Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.307332 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ede4992-1b80-4f08-a232-84f283cfedde" containerName="nova-api-log" Feb 20 10:17:25 crc kubenswrapper[4962]: E0220 10:17:25.307412 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="619a1578-177c-476f-a471-e39ec43ebf20" containerName="dnsmasq-dns" Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.307465 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="619a1578-177c-476f-a471-e39ec43ebf20" containerName="dnsmasq-dns" Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.307732 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="39a7b81e-d4af-478f-b2c3-d21f117ad7ec" containerName="nova-manage" Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.307807 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ede4992-1b80-4f08-a232-84f283cfedde" containerName="nova-api-log" Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.307889 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ede4992-1b80-4f08-a232-84f283cfedde" containerName="nova-api-api" Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.307953 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="619a1578-177c-476f-a471-e39ec43ebf20" containerName="dnsmasq-dns" Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.309012 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.311948 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.312795 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.313031 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.340183 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.436739 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/241dc417-3176-4051-ad4e-d98f4f66ddc2-config-data\") pod \"nova-api-0\" (UID: \"241dc417-3176-4051-ad4e-d98f4f66ddc2\") " pod="openstack/nova-api-0" Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.436961 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/241dc417-3176-4051-ad4e-d98f4f66ddc2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"241dc417-3176-4051-ad4e-d98f4f66ddc2\") " pod="openstack/nova-api-0" Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.437016 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/241dc417-3176-4051-ad4e-d98f4f66ddc2-public-tls-certs\") pod \"nova-api-0\" (UID: \"241dc417-3176-4051-ad4e-d98f4f66ddc2\") " pod="openstack/nova-api-0" Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.437393 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/241dc417-3176-4051-ad4e-d98f4f66ddc2-logs\") pod \"nova-api-0\" (UID: \"241dc417-3176-4051-ad4e-d98f4f66ddc2\") " pod="openstack/nova-api-0" Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.437488 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/241dc417-3176-4051-ad4e-d98f4f66ddc2-internal-tls-certs\") pod \"nova-api-0\" (UID: \"241dc417-3176-4051-ad4e-d98f4f66ddc2\") " pod="openstack/nova-api-0" Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.437636 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qmjz\" (UniqueName: \"kubernetes.io/projected/241dc417-3176-4051-ad4e-d98f4f66ddc2-kube-api-access-2qmjz\") pod \"nova-api-0\" (UID: \"241dc417-3176-4051-ad4e-d98f4f66ddc2\") " pod="openstack/nova-api-0" Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.541137 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/241dc417-3176-4051-ad4e-d98f4f66ddc2-logs\") pod \"nova-api-0\" (UID: \"241dc417-3176-4051-ad4e-d98f4f66ddc2\") " pod="openstack/nova-api-0" Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.541758 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/241dc417-3176-4051-ad4e-d98f4f66ddc2-internal-tls-certs\") pod \"nova-api-0\" (UID: 
\"241dc417-3176-4051-ad4e-d98f4f66ddc2\") " pod="openstack/nova-api-0" Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.542669 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qmjz\" (UniqueName: \"kubernetes.io/projected/241dc417-3176-4051-ad4e-d98f4f66ddc2-kube-api-access-2qmjz\") pod \"nova-api-0\" (UID: \"241dc417-3176-4051-ad4e-d98f4f66ddc2\") " pod="openstack/nova-api-0" Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.542792 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/241dc417-3176-4051-ad4e-d98f4f66ddc2-config-data\") pod \"nova-api-0\" (UID: \"241dc417-3176-4051-ad4e-d98f4f66ddc2\") " pod="openstack/nova-api-0" Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.543002 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/241dc417-3176-4051-ad4e-d98f4f66ddc2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"241dc417-3176-4051-ad4e-d98f4f66ddc2\") " pod="openstack/nova-api-0" Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.543049 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/241dc417-3176-4051-ad4e-d98f4f66ddc2-public-tls-certs\") pod \"nova-api-0\" (UID: \"241dc417-3176-4051-ad4e-d98f4f66ddc2\") " pod="openstack/nova-api-0" Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.541699 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/241dc417-3176-4051-ad4e-d98f4f66ddc2-logs\") pod \"nova-api-0\" (UID: \"241dc417-3176-4051-ad4e-d98f4f66ddc2\") " pod="openstack/nova-api-0" Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.547454 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/241dc417-3176-4051-ad4e-d98f4f66ddc2-config-data\") pod \"nova-api-0\" (UID: \"241dc417-3176-4051-ad4e-d98f4f66ddc2\") " pod="openstack/nova-api-0" Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.549196 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/241dc417-3176-4051-ad4e-d98f4f66ddc2-internal-tls-certs\") pod \"nova-api-0\" (UID: \"241dc417-3176-4051-ad4e-d98f4f66ddc2\") " pod="openstack/nova-api-0" Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.549906 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/241dc417-3176-4051-ad4e-d98f4f66ddc2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"241dc417-3176-4051-ad4e-d98f4f66ddc2\") " pod="openstack/nova-api-0" Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.555413 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/241dc417-3176-4051-ad4e-d98f4f66ddc2-public-tls-certs\") pod \"nova-api-0\" (UID: \"241dc417-3176-4051-ad4e-d98f4f66ddc2\") " pod="openstack/nova-api-0" Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.586111 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qmjz\" (UniqueName: \"kubernetes.io/projected/241dc417-3176-4051-ad4e-d98f4f66ddc2-kube-api-access-2qmjz\") pod \"nova-api-0\" (UID: \"241dc417-3176-4051-ad4e-d98f4f66ddc2\") " 
pod="openstack/nova-api-0" Feb 20 10:17:25 crc kubenswrapper[4962]: I0220 10:17:25.655675 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 20 10:17:26 crc kubenswrapper[4962]: I0220 10:17:26.238173 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 20 10:17:26 crc kubenswrapper[4962]: W0220 10:17:26.245358 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod241dc417_3176_4051_ad4e_d98f4f66ddc2.slice/crio-6da4185901fde0f4a19c0acbd71ec3f025dcc7b8a21d60e14e9ba4dfdfa09bbe WatchSource:0}: Error finding container 6da4185901fde0f4a19c0acbd71ec3f025dcc7b8a21d60e14e9ba4dfdfa09bbe: Status 404 returned error can't find the container with id 6da4185901fde0f4a19c0acbd71ec3f025dcc7b8a21d60e14e9ba4dfdfa09bbe Feb 20 10:17:26 crc kubenswrapper[4962]: I0220 10:17:26.433317 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 20 10:17:26 crc kubenswrapper[4962]: I0220 10:17:26.572066 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a762e59-b6ef-4cdd-81f5-7f49dd78f810-combined-ca-bundle\") pod \"2a762e59-b6ef-4cdd-81f5-7f49dd78f810\" (UID: \"2a762e59-b6ef-4cdd-81f5-7f49dd78f810\") " Feb 20 10:17:26 crc kubenswrapper[4962]: I0220 10:17:26.572715 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a762e59-b6ef-4cdd-81f5-7f49dd78f810-config-data\") pod \"2a762e59-b6ef-4cdd-81f5-7f49dd78f810\" (UID: \"2a762e59-b6ef-4cdd-81f5-7f49dd78f810\") " Feb 20 10:17:26 crc kubenswrapper[4962]: I0220 10:17:26.572848 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v2xsh\" (UniqueName: \"kubernetes.io/projected/2a762e59-b6ef-4cdd-81f5-7f49dd78f810-kube-api-access-v2xsh\") pod \"2a762e59-b6ef-4cdd-81f5-7f49dd78f810\" (UID: \"2a762e59-b6ef-4cdd-81f5-7f49dd78f810\") " Feb 20 10:17:26 crc kubenswrapper[4962]: I0220 10:17:26.576510 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a762e59-b6ef-4cdd-81f5-7f49dd78f810-kube-api-access-v2xsh" (OuterVolumeSpecName: "kube-api-access-v2xsh") pod "2a762e59-b6ef-4cdd-81f5-7f49dd78f810" (UID: "2a762e59-b6ef-4cdd-81f5-7f49dd78f810"). InnerVolumeSpecName "kube-api-access-v2xsh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:17:26 crc kubenswrapper[4962]: I0220 10:17:26.606069 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a762e59-b6ef-4cdd-81f5-7f49dd78f810-config-data" (OuterVolumeSpecName: "config-data") pod "2a762e59-b6ef-4cdd-81f5-7f49dd78f810" (UID: "2a762e59-b6ef-4cdd-81f5-7f49dd78f810"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:17:26 crc kubenswrapper[4962]: I0220 10:17:26.609443 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a762e59-b6ef-4cdd-81f5-7f49dd78f810-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2a762e59-b6ef-4cdd-81f5-7f49dd78f810" (UID: "2a762e59-b6ef-4cdd-81f5-7f49dd78f810"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:17:26 crc kubenswrapper[4962]: I0220 10:17:26.675617 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a762e59-b6ef-4cdd-81f5-7f49dd78f810-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:17:26 crc kubenswrapper[4962]: I0220 10:17:26.675673 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a762e59-b6ef-4cdd-81f5-7f49dd78f810-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:17:26 crc kubenswrapper[4962]: I0220 10:17:26.675691 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v2xsh\" (UniqueName: \"kubernetes.io/projected/2a762e59-b6ef-4cdd-81f5-7f49dd78f810-kube-api-access-v2xsh\") on node \"crc\" DevicePath \"\"" Feb 20 10:17:26 crc kubenswrapper[4962]: I0220 10:17:26.968624 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"241dc417-3176-4051-ad4e-d98f4f66ddc2","Type":"ContainerStarted","Data":"42f33c3ac4e84257c4f38d060186abe1300d7dfb20f8894c1b519bb38d1529c9"} Feb 20 10:17:26 crc kubenswrapper[4962]: I0220 10:17:26.968671 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"241dc417-3176-4051-ad4e-d98f4f66ddc2","Type":"ContainerStarted","Data":"d6bf8640027e8b75225f36e2b4a5d790818a0e4259c4c5012d627c79a493efb3"} Feb 20 10:17:26 crc kubenswrapper[4962]: I0220 10:17:26.968688 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"241dc417-3176-4051-ad4e-d98f4f66ddc2","Type":"ContainerStarted","Data":"6da4185901fde0f4a19c0acbd71ec3f025dcc7b8a21d60e14e9ba4dfdfa09bbe"} Feb 20 10:17:26 crc kubenswrapper[4962]: I0220 10:17:26.971328 4962 generic.go:334] "Generic (PLEG): container finished" podID="2a762e59-b6ef-4cdd-81f5-7f49dd78f810" containerID="8a4e849a67446e74bb7d851638980bf9f47f764932e6e5dbd43efc378ccb51b6" exitCode=0 Feb 20 10:17:26 crc kubenswrapper[4962]: I0220 10:17:26.971370 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2a762e59-b6ef-4cdd-81f5-7f49dd78f810","Type":"ContainerDied","Data":"8a4e849a67446e74bb7d851638980bf9f47f764932e6e5dbd43efc378ccb51b6"} Feb 20 10:17:26 crc kubenswrapper[4962]: I0220 10:17:26.971393 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2a762e59-b6ef-4cdd-81f5-7f49dd78f810","Type":"ContainerDied","Data":"483d238cd5d776f09a0edf06552294d8201c2f1a4a094bae70c03d01ceee2bb1"} Feb 20 10:17:26 crc kubenswrapper[4962]: I0220 10:17:26.971411 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 20 10:17:26 crc kubenswrapper[4962]: I0220 10:17:26.971429 4962 scope.go:117] "RemoveContainer" containerID="8a4e849a67446e74bb7d851638980bf9f47f764932e6e5dbd43efc378ccb51b6" Feb 20 10:17:27 crc kubenswrapper[4962]: I0220 10:17:27.001959 4962 scope.go:117] "RemoveContainer" containerID="8a4e849a67446e74bb7d851638980bf9f47f764932e6e5dbd43efc378ccb51b6" Feb 20 10:17:27 crc kubenswrapper[4962]: E0220 10:17:27.006863 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a4e849a67446e74bb7d851638980bf9f47f764932e6e5dbd43efc378ccb51b6\": container with ID starting with 8a4e849a67446e74bb7d851638980bf9f47f764932e6e5dbd43efc378ccb51b6 not found: ID does not exist" containerID="8a4e849a67446e74bb7d851638980bf9f47f764932e6e5dbd43efc378ccb51b6" Feb 20 10:17:27 crc kubenswrapper[4962]: I0220 10:17:27.006915 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a4e849a67446e74bb7d851638980bf9f47f764932e6e5dbd43efc378ccb51b6"} err="failed to get container status \"8a4e849a67446e74bb7d851638980bf9f47f764932e6e5dbd43efc378ccb51b6\": rpc error: code = NotFound desc = could not find container \"8a4e849a67446e74bb7d851638980bf9f47f764932e6e5dbd43efc378ccb51b6\": container with ID starting with 8a4e849a67446e74bb7d851638980bf9f47f764932e6e5dbd43efc378ccb51b6 not found: ID does not exist" Feb 20 10:17:27 crc kubenswrapper[4962]: I0220 10:17:27.049238 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.049212832 podStartE2EDuration="2.049212832s" podCreationTimestamp="2026-02-20 10:17:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:17:27.007948362 +0000 UTC m=+1338.590420238" watchObservedRunningTime="2026-02-20 10:17:27.049212832 +0000 UTC m=+1338.631684688" Feb 20 10:17:27 crc kubenswrapper[4962]: I0220 10:17:27.052491 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 10:17:27 crc kubenswrapper[4962]: I0220 10:17:27.092646 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 10:17:27 crc kubenswrapper[4962]: I0220 10:17:27.107354 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 10:17:27 crc kubenswrapper[4962]: E0220 10:17:27.107899 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a762e59-b6ef-4cdd-81f5-7f49dd78f810" containerName="nova-scheduler-scheduler" Feb 20 10:17:27 crc kubenswrapper[4962]: I0220 10:17:27.107918 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a762e59-b6ef-4cdd-81f5-7f49dd78f810" containerName="nova-scheduler-scheduler" Feb 20 10:17:27 crc kubenswrapper[4962]: I0220 10:17:27.108104 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a762e59-b6ef-4cdd-81f5-7f49dd78f810" containerName="nova-scheduler-scheduler" Feb 20 10:17:27 crc kubenswrapper[4962]: I0220 10:17:27.108908 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 20 10:17:27 crc kubenswrapper[4962]: I0220 10:17:27.116293 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 20 10:17:27 crc kubenswrapper[4962]: I0220 10:17:27.122375 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 10:17:27 crc kubenswrapper[4962]: I0220 10:17:27.199646 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvxk6\" (UniqueName: \"kubernetes.io/projected/dcd02115-2eb9-4090-8225-108c3a8cad20-kube-api-access-qvxk6\") pod \"nova-scheduler-0\" (UID: \"dcd02115-2eb9-4090-8225-108c3a8cad20\") " pod="openstack/nova-scheduler-0" Feb 20 10:17:27 crc kubenswrapper[4962]: I0220 10:17:27.200117 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcd02115-2eb9-4090-8225-108c3a8cad20-config-data\") pod \"nova-scheduler-0\" (UID: \"dcd02115-2eb9-4090-8225-108c3a8cad20\") " pod="openstack/nova-scheduler-0" Feb 20 10:17:27 crc kubenswrapper[4962]: I0220 10:17:27.200153 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcd02115-2eb9-4090-8225-108c3a8cad20-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"dcd02115-2eb9-4090-8225-108c3a8cad20\") " pod="openstack/nova-scheduler-0" Feb 20 10:17:27 crc kubenswrapper[4962]: I0220 10:17:27.206851 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a762e59-b6ef-4cdd-81f5-7f49dd78f810" path="/var/lib/kubelet/pods/2a762e59-b6ef-4cdd-81f5-7f49dd78f810/volumes" Feb 20 10:17:27 crc kubenswrapper[4962]: I0220 10:17:27.213286 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ede4992-1b80-4f08-a232-84f283cfedde" path="/var/lib/kubelet/pods/8ede4992-1b80-4f08-a232-84f283cfedde/volumes" Feb 20 10:17:27 crc kubenswrapper[4962]: I0220 10:17:27.301960 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcd02115-2eb9-4090-8225-108c3a8cad20-config-data\") pod \"nova-scheduler-0\" (UID: \"dcd02115-2eb9-4090-8225-108c3a8cad20\") " pod="openstack/nova-scheduler-0" Feb 20 10:17:27 crc kubenswrapper[4962]: I0220 10:17:27.302037 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcd02115-2eb9-4090-8225-108c3a8cad20-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"dcd02115-2eb9-4090-8225-108c3a8cad20\") " pod="openstack/nova-scheduler-0" Feb 20 10:17:27 crc kubenswrapper[4962]: I0220 10:17:27.302124 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvxk6\" (UniqueName: \"kubernetes.io/projected/dcd02115-2eb9-4090-8225-108c3a8cad20-kube-api-access-qvxk6\") pod \"nova-scheduler-0\" (UID: \"dcd02115-2eb9-4090-8225-108c3a8cad20\") " pod="openstack/nova-scheduler-0" Feb 20 10:17:27 crc kubenswrapper[4962]: I0220 10:17:27.312485 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcd02115-2eb9-4090-8225-108c3a8cad20-config-data\") pod \"nova-scheduler-0\" (UID: \"dcd02115-2eb9-4090-8225-108c3a8cad20\") " pod="openstack/nova-scheduler-0" Feb 20 10:17:27 crc kubenswrapper[4962]: I0220 
10:17:27.312854 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcd02115-2eb9-4090-8225-108c3a8cad20-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"dcd02115-2eb9-4090-8225-108c3a8cad20\") " pod="openstack/nova-scheduler-0" Feb 20 10:17:27 crc kubenswrapper[4962]: I0220 10:17:27.329836 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvxk6\" (UniqueName: \"kubernetes.io/projected/dcd02115-2eb9-4090-8225-108c3a8cad20-kube-api-access-qvxk6\") pod \"nova-scheduler-0\" (UID: \"dcd02115-2eb9-4090-8225-108c3a8cad20\") " pod="openstack/nova-scheduler-0" Feb 20 10:17:27 crc kubenswrapper[4962]: I0220 10:17:27.389511 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="bf680b24-e6dc-40a4-9ee4-521343fd9a28" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.192:8775/\": read tcp 10.217.0.2:41048->10.217.0.192:8775: read: connection reset by peer" Feb 20 10:17:27 crc kubenswrapper[4962]: I0220 10:17:27.389486 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="bf680b24-e6dc-40a4-9ee4-521343fd9a28" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.192:8775/\": read tcp 10.217.0.2:41060->10.217.0.192:8775: read: connection reset by peer" Feb 20 10:17:27 crc kubenswrapper[4962]: I0220 10:17:27.439850 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 20 10:17:27 crc kubenswrapper[4962]: I0220 10:17:27.800174 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 20 10:17:27 crc kubenswrapper[4962]: I0220 10:17:27.916512 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf680b24-e6dc-40a4-9ee4-521343fd9a28-combined-ca-bundle\") pod \"bf680b24-e6dc-40a4-9ee4-521343fd9a28\" (UID: \"bf680b24-e6dc-40a4-9ee4-521343fd9a28\") " Feb 20 10:17:27 crc kubenswrapper[4962]: I0220 10:17:27.916606 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-58msc\" (UniqueName: \"kubernetes.io/projected/bf680b24-e6dc-40a4-9ee4-521343fd9a28-kube-api-access-58msc\") pod \"bf680b24-e6dc-40a4-9ee4-521343fd9a28\" (UID: \"bf680b24-e6dc-40a4-9ee4-521343fd9a28\") " Feb 20 10:17:27 crc kubenswrapper[4962]: I0220 10:17:27.916806 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf680b24-e6dc-40a4-9ee4-521343fd9a28-config-data\") pod \"bf680b24-e6dc-40a4-9ee4-521343fd9a28\" (UID: \"bf680b24-e6dc-40a4-9ee4-521343fd9a28\") " Feb 20 10:17:27 crc kubenswrapper[4962]: I0220 10:17:27.916899 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf680b24-e6dc-40a4-9ee4-521343fd9a28-logs\") pod \"bf680b24-e6dc-40a4-9ee4-521343fd9a28\" (UID: \"bf680b24-e6dc-40a4-9ee4-521343fd9a28\") " Feb 20 10:17:27 crc kubenswrapper[4962]: I0220 10:17:27.916922 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf680b24-e6dc-40a4-9ee4-521343fd9a28-nova-metadata-tls-certs\") pod \"bf680b24-e6dc-40a4-9ee4-521343fd9a28\" (UID: 
\"bf680b24-e6dc-40a4-9ee4-521343fd9a28\") " Feb 20 10:17:27 crc kubenswrapper[4962]: I0220 10:17:27.918047 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf680b24-e6dc-40a4-9ee4-521343fd9a28-logs" (OuterVolumeSpecName: "logs") pod "bf680b24-e6dc-40a4-9ee4-521343fd9a28" (UID: "bf680b24-e6dc-40a4-9ee4-521343fd9a28"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:17:27 crc kubenswrapper[4962]: I0220 10:17:27.922583 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf680b24-e6dc-40a4-9ee4-521343fd9a28-kube-api-access-58msc" (OuterVolumeSpecName: "kube-api-access-58msc") pod "bf680b24-e6dc-40a4-9ee4-521343fd9a28" (UID: "bf680b24-e6dc-40a4-9ee4-521343fd9a28"). InnerVolumeSpecName "kube-api-access-58msc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:17:27 crc kubenswrapper[4962]: I0220 10:17:27.955425 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf680b24-e6dc-40a4-9ee4-521343fd9a28-config-data" (OuterVolumeSpecName: "config-data") pod "bf680b24-e6dc-40a4-9ee4-521343fd9a28" (UID: "bf680b24-e6dc-40a4-9ee4-521343fd9a28"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:17:27 crc kubenswrapper[4962]: I0220 10:17:27.961699 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf680b24-e6dc-40a4-9ee4-521343fd9a28-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bf680b24-e6dc-40a4-9ee4-521343fd9a28" (UID: "bf680b24-e6dc-40a4-9ee4-521343fd9a28"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:17:27 crc kubenswrapper[4962]: I0220 10:17:27.977417 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf680b24-e6dc-40a4-9ee4-521343fd9a28-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "bf680b24-e6dc-40a4-9ee4-521343fd9a28" (UID: "bf680b24-e6dc-40a4-9ee4-521343fd9a28"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:17:27 crc kubenswrapper[4962]: I0220 10:17:27.992097 4962 generic.go:334] "Generic (PLEG): container finished" podID="bf680b24-e6dc-40a4-9ee4-521343fd9a28" containerID="a2a1cf05c6cde763cbf0e416c61a88274821349634d663e3dc31de4c5f75317e" exitCode=0 Feb 20 10:17:27 crc kubenswrapper[4962]: I0220 10:17:27.992207 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bf680b24-e6dc-40a4-9ee4-521343fd9a28","Type":"ContainerDied","Data":"a2a1cf05c6cde763cbf0e416c61a88274821349634d663e3dc31de4c5f75317e"} Feb 20 10:17:27 crc kubenswrapper[4962]: I0220 10:17:27.992264 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bf680b24-e6dc-40a4-9ee4-521343fd9a28","Type":"ContainerDied","Data":"121ed408ee75be32f31a1f4dc7577730e020e69ee605e01bdd02274a3aab2f53"} Feb 20 10:17:27 crc kubenswrapper[4962]: I0220 10:17:27.992292 4962 scope.go:117] "RemoveContainer" containerID="a2a1cf05c6cde763cbf0e416c61a88274821349634d663e3dc31de4c5f75317e" Feb 20 10:17:27 crc kubenswrapper[4962]: I0220 10:17:27.992248 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 20 10:17:27 crc kubenswrapper[4962]: I0220 10:17:27.999776 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 10:17:28 crc kubenswrapper[4962]: W0220 10:17:28.009348 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddcd02115_2eb9_4090_8225_108c3a8cad20.slice/crio-a6be5e2b469a4dd84e09bc3f569eccb10479b9448520269901b4d42cca661dde WatchSource:0}: Error finding container a6be5e2b469a4dd84e09bc3f569eccb10479b9448520269901b4d42cca661dde: Status 404 returned error can't find the container with id a6be5e2b469a4dd84e09bc3f569eccb10479b9448520269901b4d42cca661dde Feb 20 10:17:28 crc kubenswrapper[4962]: I0220 10:17:28.019561 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf680b24-e6dc-40a4-9ee4-521343fd9a28-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:17:28 crc kubenswrapper[4962]: I0220 10:17:28.019614 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-58msc\" (UniqueName: \"kubernetes.io/projected/bf680b24-e6dc-40a4-9ee4-521343fd9a28-kube-api-access-58msc\") on node \"crc\" DevicePath \"\"" Feb 20 10:17:28 crc kubenswrapper[4962]: I0220 10:17:28.019628 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf680b24-e6dc-40a4-9ee4-521343fd9a28-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:17:28 crc kubenswrapper[4962]: I0220 10:17:28.019646 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf680b24-e6dc-40a4-9ee4-521343fd9a28-logs\") on node \"crc\" DevicePath \"\"" Feb 20 10:17:28 crc kubenswrapper[4962]: I0220 10:17:28.019666 4962 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf680b24-e6dc-40a4-9ee4-521343fd9a28-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:17:28 crc kubenswrapper[4962]: I0220 10:17:28.040345 4962 scope.go:117] "RemoveContainer" containerID="0fc17421016c0a51f07f032e01424e9a427398b88ed8a196a9b8eaf3af4e366e" Feb 20 10:17:28 crc kubenswrapper[4962]: I0220 10:17:28.071891 4962 scope.go:117] "RemoveContainer" containerID="a2a1cf05c6cde763cbf0e416c61a88274821349634d663e3dc31de4c5f75317e" Feb 20 10:17:28 crc kubenswrapper[4962]: E0220 10:17:28.076854 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2a1cf05c6cde763cbf0e416c61a88274821349634d663e3dc31de4c5f75317e\": container with ID starting with a2a1cf05c6cde763cbf0e416c61a88274821349634d663e3dc31de4c5f75317e not found: ID does not exist" containerID="a2a1cf05c6cde763cbf0e416c61a88274821349634d663e3dc31de4c5f75317e" Feb 20 10:17:28 crc kubenswrapper[4962]: I0220 10:17:28.076924 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2a1cf05c6cde763cbf0e416c61a88274821349634d663e3dc31de4c5f75317e"} err="failed to get container status \"a2a1cf05c6cde763cbf0e416c61a88274821349634d663e3dc31de4c5f75317e\": rpc error: code = NotFound desc = could not find container \"a2a1cf05c6cde763cbf0e416c61a88274821349634d663e3dc31de4c5f75317e\": container with ID starting with a2a1cf05c6cde763cbf0e416c61a88274821349634d663e3dc31de4c5f75317e not found: ID does not exist" Feb 20 10:17:28 crc kubenswrapper[4962]: 
I0220 10:17:28.076965 4962 scope.go:117] "RemoveContainer" containerID="0fc17421016c0a51f07f032e01424e9a427398b88ed8a196a9b8eaf3af4e366e" Feb 20 10:17:28 crc kubenswrapper[4962]: E0220 10:17:28.077345 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0fc17421016c0a51f07f032e01424e9a427398b88ed8a196a9b8eaf3af4e366e\": container with ID starting with 0fc17421016c0a51f07f032e01424e9a427398b88ed8a196a9b8eaf3af4e366e not found: ID does not exist" containerID="0fc17421016c0a51f07f032e01424e9a427398b88ed8a196a9b8eaf3af4e366e" Feb 20 10:17:28 crc kubenswrapper[4962]: I0220 10:17:28.077372 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0fc17421016c0a51f07f032e01424e9a427398b88ed8a196a9b8eaf3af4e366e"} err="failed to get container status \"0fc17421016c0a51f07f032e01424e9a427398b88ed8a196a9b8eaf3af4e366e\": rpc error: code = NotFound desc = could not find container \"0fc17421016c0a51f07f032e01424e9a427398b88ed8a196a9b8eaf3af4e366e\": container with ID starting with 0fc17421016c0a51f07f032e01424e9a427398b88ed8a196a9b8eaf3af4e366e not found: ID does not exist" Feb 20 10:17:28 crc kubenswrapper[4962]: I0220 10:17:28.080412 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 10:17:28 crc kubenswrapper[4962]: I0220 10:17:28.099827 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 10:17:28 crc kubenswrapper[4962]: I0220 10:17:28.113710 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 20 10:17:28 crc kubenswrapper[4962]: E0220 10:17:28.114114 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf680b24-e6dc-40a4-9ee4-521343fd9a28" containerName="nova-metadata-metadata" Feb 20 10:17:28 crc kubenswrapper[4962]: I0220 10:17:28.114131 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf680b24-e6dc-40a4-9ee4-521343fd9a28" containerName="nova-metadata-metadata" Feb 20 10:17:28 crc kubenswrapper[4962]: E0220 10:17:28.114157 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf680b24-e6dc-40a4-9ee4-521343fd9a28" containerName="nova-metadata-log" Feb 20 10:17:28 crc kubenswrapper[4962]: I0220 10:17:28.114165 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf680b24-e6dc-40a4-9ee4-521343fd9a28" containerName="nova-metadata-log" Feb 20 10:17:28 crc kubenswrapper[4962]: I0220 10:17:28.114347 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf680b24-e6dc-40a4-9ee4-521343fd9a28" containerName="nova-metadata-log" Feb 20 10:17:28 crc kubenswrapper[4962]: I0220 10:17:28.114380 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf680b24-e6dc-40a4-9ee4-521343fd9a28" containerName="nova-metadata-metadata" Feb 20 10:17:28 crc kubenswrapper[4962]: I0220 10:17:28.115357 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 20 10:17:28 crc kubenswrapper[4962]: I0220 10:17:28.118457 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 10:17:28 crc kubenswrapper[4962]: I0220 10:17:28.119561 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 20 10:17:28 crc kubenswrapper[4962]: I0220 10:17:28.121233 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 20 10:17:28 crc kubenswrapper[4962]: I0220 10:17:28.224766 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca793428-98ed-4f82-aa57-31d6671d546c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ca793428-98ed-4f82-aa57-31d6671d546c\") " pod="openstack/nova-metadata-0" Feb 20 10:17:28 crc kubenswrapper[4962]: I0220 10:17:28.224827 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlgvc\" (UniqueName: \"kubernetes.io/projected/ca793428-98ed-4f82-aa57-31d6671d546c-kube-api-access-hlgvc\") pod \"nova-metadata-0\" (UID: \"ca793428-98ed-4f82-aa57-31d6671d546c\") " pod="openstack/nova-metadata-0" Feb 20 10:17:28 crc kubenswrapper[4962]: I0220 10:17:28.224931 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca793428-98ed-4f82-aa57-31d6671d546c-config-data\") pod \"nova-metadata-0\" (UID: \"ca793428-98ed-4f82-aa57-31d6671d546c\") " pod="openstack/nova-metadata-0" Feb 20 10:17:28 crc kubenswrapper[4962]: I0220 10:17:28.224956 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca793428-98ed-4f82-aa57-31d6671d546c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ca793428-98ed-4f82-aa57-31d6671d546c\") " pod="openstack/nova-metadata-0" Feb 20 10:17:28 crc kubenswrapper[4962]: I0220 10:17:28.224986 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca793428-98ed-4f82-aa57-31d6671d546c-logs\") pod \"nova-metadata-0\" (UID: \"ca793428-98ed-4f82-aa57-31d6671d546c\") " pod="openstack/nova-metadata-0" Feb 20 10:17:28 crc kubenswrapper[4962]: I0220 10:17:28.327908 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca793428-98ed-4f82-aa57-31d6671d546c-config-data\") pod \"nova-metadata-0\" (UID: \"ca793428-98ed-4f82-aa57-31d6671d546c\") " pod="openstack/nova-metadata-0" Feb 20 10:17:28 crc kubenswrapper[4962]: I0220 10:17:28.329996 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca793428-98ed-4f82-aa57-31d6671d546c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ca793428-98ed-4f82-aa57-31d6671d546c\") " pod="openstack/nova-metadata-0" Feb 20 10:17:28 crc kubenswrapper[4962]: I0220 10:17:28.330070 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca793428-98ed-4f82-aa57-31d6671d546c-logs\") pod \"nova-metadata-0\" (UID: \"ca793428-98ed-4f82-aa57-31d6671d546c\") " pod="openstack/nova-metadata-0" Feb 20 10:17:28 crc 
kubenswrapper[4962]: I0220 10:17:28.330294 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca793428-98ed-4f82-aa57-31d6671d546c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ca793428-98ed-4f82-aa57-31d6671d546c\") " pod="openstack/nova-metadata-0" Feb 20 10:17:28 crc kubenswrapper[4962]: I0220 10:17:28.330371 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlgvc\" (UniqueName: \"kubernetes.io/projected/ca793428-98ed-4f82-aa57-31d6671d546c-kube-api-access-hlgvc\") pod \"nova-metadata-0\" (UID: \"ca793428-98ed-4f82-aa57-31d6671d546c\") " pod="openstack/nova-metadata-0" Feb 20 10:17:28 crc kubenswrapper[4962]: I0220 10:17:28.332362 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca793428-98ed-4f82-aa57-31d6671d546c-logs\") pod \"nova-metadata-0\" (UID: \"ca793428-98ed-4f82-aa57-31d6671d546c\") " pod="openstack/nova-metadata-0" Feb 20 10:17:28 crc kubenswrapper[4962]: I0220 10:17:28.334996 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca793428-98ed-4f82-aa57-31d6671d546c-config-data\") pod \"nova-metadata-0\" (UID: \"ca793428-98ed-4f82-aa57-31d6671d546c\") " pod="openstack/nova-metadata-0" Feb 20 10:17:28 crc kubenswrapper[4962]: I0220 10:17:28.336150 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca793428-98ed-4f82-aa57-31d6671d546c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ca793428-98ed-4f82-aa57-31d6671d546c\") " pod="openstack/nova-metadata-0" Feb 20 10:17:28 crc kubenswrapper[4962]: I0220 10:17:28.338735 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca793428-98ed-4f82-aa57-31d6671d546c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ca793428-98ed-4f82-aa57-31d6671d546c\") " pod="openstack/nova-metadata-0" Feb 20 10:17:28 crc kubenswrapper[4962]: I0220 10:17:28.352799 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlgvc\" (UniqueName: \"kubernetes.io/projected/ca793428-98ed-4f82-aa57-31d6671d546c-kube-api-access-hlgvc\") pod \"nova-metadata-0\" (UID: \"ca793428-98ed-4f82-aa57-31d6671d546c\") " pod="openstack/nova-metadata-0" Feb 20 10:17:28 crc kubenswrapper[4962]: I0220 10:17:28.437786 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 20 10:17:29 crc kubenswrapper[4962]: I0220 10:17:28.919829 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 10:17:29 crc kubenswrapper[4962]: W0220 10:17:28.927967 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca793428_98ed_4f82_aa57_31d6671d546c.slice/crio-814b8f37484da31723bf086a4604103ef52cd7ea4f8156d43acda95faab765f4 WatchSource:0}: Error finding container 814b8f37484da31723bf086a4604103ef52cd7ea4f8156d43acda95faab765f4: Status 404 returned error can't find the container with id 814b8f37484da31723bf086a4604103ef52cd7ea4f8156d43acda95faab765f4 Feb 20 10:17:29 crc kubenswrapper[4962]: I0220 10:17:29.011660 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"dcd02115-2eb9-4090-8225-108c3a8cad20","Type":"ContainerStarted","Data":"24e611c94f3db833be2f4d2218a68d358affbfa3d1fc3a15c508caceb7974666"} Feb 20 10:17:29 crc kubenswrapper[4962]: I0220 10:17:29.011731 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"dcd02115-2eb9-4090-8225-108c3a8cad20","Type":"ContainerStarted","Data":"a6be5e2b469a4dd84e09bc3f569eccb10479b9448520269901b4d42cca661dde"} Feb 20 10:17:29 crc kubenswrapper[4962]: I0220 10:17:29.014624 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ca793428-98ed-4f82-aa57-31d6671d546c","Type":"ContainerStarted","Data":"814b8f37484da31723bf086a4604103ef52cd7ea4f8156d43acda95faab765f4"} Feb 20 10:17:29 crc kubenswrapper[4962]: I0220 10:17:29.044488 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.044465055 podStartE2EDuration="2.044465055s" podCreationTimestamp="2026-02-20 10:17:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:17:29.037330455 +0000 UTC m=+1340.619802341" watchObservedRunningTime="2026-02-20 10:17:29.044465055 +0000 UTC m=+1340.626936901" Feb 20 10:17:29 crc kubenswrapper[4962]: I0220 10:17:29.166295 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf680b24-e6dc-40a4-9ee4-521343fd9a28" path="/var/lib/kubelet/pods/bf680b24-e6dc-40a4-9ee4-521343fd9a28/volumes" Feb 20 10:17:30 crc kubenswrapper[4962]: I0220 10:17:30.035546 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ca793428-98ed-4f82-aa57-31d6671d546c","Type":"ContainerStarted","Data":"2fdedd716304d48ca972e72c6c0a4e94560cd57ce8c5b0409e88600b50604c0b"} Feb 20 10:17:30 crc kubenswrapper[4962]: I0220 10:17:30.036145 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ca793428-98ed-4f82-aa57-31d6671d546c","Type":"ContainerStarted","Data":"fdc035dec22a8cb1cbe15ddbb643e583e6ad19e8deec930029ff3031763b1c89"} Feb 20 10:17:30 crc kubenswrapper[4962]: I0220 10:17:30.079288 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.07925126 podStartE2EDuration="2.07925126s" podCreationTimestamp="2026-02-20 10:17:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:17:30.075190226 +0000 UTC m=+1341.657662152" 
watchObservedRunningTime="2026-02-20 10:17:30.07925126 +0000 UTC m=+1341.661723167" Feb 20 10:17:32 crc kubenswrapper[4962]: I0220 10:17:32.440334 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 20 10:17:33 crc kubenswrapper[4962]: I0220 10:17:33.438366 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 20 10:17:33 crc kubenswrapper[4962]: I0220 10:17:33.438882 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 20 10:17:35 crc kubenswrapper[4962]: I0220 10:17:35.656401 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 20 10:17:35 crc kubenswrapper[4962]: I0220 10:17:35.656925 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 20 10:17:36 crc kubenswrapper[4962]: I0220 10:17:36.672835 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="241dc417-3176-4051-ad4e-d98f4f66ddc2" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.204:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 20 10:17:36 crc kubenswrapper[4962]: I0220 10:17:36.672907 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="241dc417-3176-4051-ad4e-d98f4f66ddc2" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.204:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 20 10:17:37 crc kubenswrapper[4962]: I0220 10:17:37.440889 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 20 10:17:37 crc kubenswrapper[4962]: I0220 10:17:37.475655 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 20 10:17:38 crc kubenswrapper[4962]: I0220 10:17:38.195165 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 20 10:17:38 crc kubenswrapper[4962]: I0220 10:17:38.438354 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 20 10:17:38 crc kubenswrapper[4962]: I0220 10:17:38.438434 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 20 10:17:39 crc kubenswrapper[4962]: I0220 10:17:39.450746 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="ca793428-98ed-4f82-aa57-31d6671d546c" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.206:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 20 10:17:39 crc kubenswrapper[4962]: I0220 10:17:39.450825 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="ca793428-98ed-4f82-aa57-31d6671d546c" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.206:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 20 10:17:44 crc kubenswrapper[4962]: I0220 10:17:44.266157 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 20 10:17:45 crc kubenswrapper[4962]: I0220 10:17:45.671893 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/nova-api-0" Feb 20 10:17:45 crc kubenswrapper[4962]: I0220 10:17:45.672880 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 20 10:17:45 crc kubenswrapper[4962]: I0220 10:17:45.674695 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 20 10:17:45 crc kubenswrapper[4962]: I0220 10:17:45.683881 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 20 10:17:46 crc kubenswrapper[4962]: I0220 10:17:46.271425 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 20 10:17:46 crc kubenswrapper[4962]: I0220 10:17:46.281410 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 20 10:17:48 crc kubenswrapper[4962]: I0220 10:17:48.448998 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 20 10:17:48 crc kubenswrapper[4962]: I0220 10:17:48.452533 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 20 10:17:48 crc kubenswrapper[4962]: I0220 10:17:48.459795 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 20 10:17:49 crc kubenswrapper[4962]: I0220 10:17:49.316088 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 20 10:18:08 crc kubenswrapper[4962]: I0220 10:18:08.847732 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Feb 20 10:18:08 crc kubenswrapper[4962]: I0220 10:18:08.849644 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="755ca463-8c62-402c-8a88-a066fb38b521" containerName="openstackclient" containerID="cri-o://58314faa8bcfe5f5f7afbcc99e392370d5f2737c5567814db10eda41512d6621" gracePeriod=2 Feb 20 10:18:08 crc kubenswrapper[4962]: I0220 10:18:08.888024 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Feb 20 10:18:09 crc kubenswrapper[4962]: I0220 10:18:09.000483 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-6f6vb"] Feb 20 10:18:09 crc kubenswrapper[4962]: E0220 10:18:09.000984 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="755ca463-8c62-402c-8a88-a066fb38b521" containerName="openstackclient" Feb 20 10:18:09 crc kubenswrapper[4962]: I0220 10:18:09.000997 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="755ca463-8c62-402c-8a88-a066fb38b521" containerName="openstackclient" Feb 20 10:18:09 crc kubenswrapper[4962]: I0220 10:18:09.001187 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="755ca463-8c62-402c-8a88-a066fb38b521" containerName="openstackclient" Feb 20 10:18:09 crc kubenswrapper[4962]: I0220 10:18:09.015044 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-6f6vb" Feb 20 10:18:09 crc kubenswrapper[4962]: I0220 10:18:09.025456 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Feb 20 10:18:09 crc kubenswrapper[4962]: I0220 10:18:09.041677 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-ptczd"] Feb 20 10:18:09 crc kubenswrapper[4962]: I0220 10:18:09.077231 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-6f6vb"] Feb 20 10:18:09 crc kubenswrapper[4962]: I0220 10:18:09.082343 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/812fea74-e4e5-4550-8a20-8fe04752a016-operator-scripts\") pod \"root-account-create-update-6f6vb\" (UID: \"812fea74-e4e5-4550-8a20-8fe04752a016\") " pod="openstack/root-account-create-update-6f6vb" Feb 20 10:18:09 crc kubenswrapper[4962]: I0220 10:18:09.082425 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qvff\" (UniqueName: \"kubernetes.io/projected/812fea74-e4e5-4550-8a20-8fe04752a016-kube-api-access-5qvff\") pod \"root-account-create-update-6f6vb\" (UID: \"812fea74-e4e5-4550-8a20-8fe04752a016\") " pod="openstack/root-account-create-update-6f6vb" Feb 20 10:18:09 crc kubenswrapper[4962]: I0220 10:18:09.119845 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-ptczd"] Feb 20 10:18:09 crc kubenswrapper[4962]: I0220 10:18:09.133916 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-c46d-account-create-update-44g6w"] Feb 20 10:18:09 crc kubenswrapper[4962]: I0220 10:18:09.183505 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/812fea74-e4e5-4550-8a20-8fe04752a016-operator-scripts\") pod \"root-account-create-update-6f6vb\" (UID: \"812fea74-e4e5-4550-8a20-8fe04752a016\") " pod="openstack/root-account-create-update-6f6vb" Feb 20 10:18:09 crc kubenswrapper[4962]: I0220 10:18:09.183922 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qvff\" (UniqueName: \"kubernetes.io/projected/812fea74-e4e5-4550-8a20-8fe04752a016-kube-api-access-5qvff\") pod \"root-account-create-update-6f6vb\" (UID: \"812fea74-e4e5-4550-8a20-8fe04752a016\") " pod="openstack/root-account-create-update-6f6vb" Feb 20 10:18:09 crc kubenswrapper[4962]: I0220 10:18:09.184824 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/812fea74-e4e5-4550-8a20-8fe04752a016-operator-scripts\") pod \"root-account-create-update-6f6vb\" (UID: \"812fea74-e4e5-4550-8a20-8fe04752a016\") " pod="openstack/root-account-create-update-6f6vb" Feb 20 10:18:09 crc kubenswrapper[4962]: I0220 10:18:09.193766 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="598e051e-58af-4a1a-aa46-7f88d635f34c" path="/var/lib/kubelet/pods/598e051e-58af-4a1a-aa46-7f88d635f34c/volumes" Feb 20 10:18:09 crc kubenswrapper[4962]: I0220 10:18:09.194532 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-c46d-account-create-update-44g6w"] Feb 20 10:18:09 crc kubenswrapper[4962]: I0220 10:18:09.271781 4962 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/barbican-c46d-account-create-update-gfqts"] Feb 20 10:18:09 crc kubenswrapper[4962]: I0220 10:18:09.273351 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-c46d-account-create-update-gfqts" Feb 20 10:18:09 crc kubenswrapper[4962]: I0220 10:18:09.303451 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cca18a27-31bc-440b-a4a9-517b3323bb91-operator-scripts\") pod \"barbican-c46d-account-create-update-gfqts\" (UID: \"cca18a27-31bc-440b-a4a9-517b3323bb91\") " pod="openstack/barbican-c46d-account-create-update-gfqts" Feb 20 10:18:09 crc kubenswrapper[4962]: I0220 10:18:09.303726 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cwxt\" (UniqueName: \"kubernetes.io/projected/cca18a27-31bc-440b-a4a9-517b3323bb91-kube-api-access-5cwxt\") pod \"barbican-c46d-account-create-update-gfqts\" (UID: \"cca18a27-31bc-440b-a4a9-517b3323bb91\") " pod="openstack/barbican-c46d-account-create-update-gfqts" Feb 20 10:18:09 crc kubenswrapper[4962]: I0220 10:18:09.309098 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Feb 20 10:18:09 crc kubenswrapper[4962]: I0220 10:18:09.322435 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qvff\" (UniqueName: \"kubernetes.io/projected/812fea74-e4e5-4550-8a20-8fe04752a016-kube-api-access-5qvff\") pod \"root-account-create-update-6f6vb\" (UID: \"812fea74-e4e5-4550-8a20-8fe04752a016\") " pod="openstack/root-account-create-update-6f6vb" Feb 20 10:18:09 crc kubenswrapper[4962]: I0220 10:18:09.323658 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-11b3-account-create-update-x5n92"] Feb 20 10:18:09 crc kubenswrapper[4962]: I0220 10:18:09.369613 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-6f6vb" Feb 20 10:18:09 crc kubenswrapper[4962]: I0220 10:18:09.412359 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cca18a27-31bc-440b-a4a9-517b3323bb91-operator-scripts\") pod \"barbican-c46d-account-create-update-gfqts\" (UID: \"cca18a27-31bc-440b-a4a9-517b3323bb91\") " pod="openstack/barbican-c46d-account-create-update-gfqts" Feb 20 10:18:09 crc kubenswrapper[4962]: I0220 10:18:09.412502 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cwxt\" (UniqueName: \"kubernetes.io/projected/cca18a27-31bc-440b-a4a9-517b3323bb91-kube-api-access-5cwxt\") pod \"barbican-c46d-account-create-update-gfqts\" (UID: \"cca18a27-31bc-440b-a4a9-517b3323bb91\") " pod="openstack/barbican-c46d-account-create-update-gfqts" Feb 20 10:18:09 crc kubenswrapper[4962]: I0220 10:18:09.413320 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-11b3-account-create-update-x5n92"] Feb 20 10:18:09 crc kubenswrapper[4962]: I0220 10:18:09.413632 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cca18a27-31bc-440b-a4a9-517b3323bb91-operator-scripts\") pod \"barbican-c46d-account-create-update-gfqts\" (UID: \"cca18a27-31bc-440b-a4a9-517b3323bb91\") " pod="openstack/barbican-c46d-account-create-update-gfqts" Feb 20 10:18:09 crc kubenswrapper[4962]: I0220 10:18:09.460128 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-c46d-account-create-update-gfqts"] Feb 20 10:18:09 crc kubenswrapper[4962]: I0220 10:18:09.525177 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-r7g9h"] Feb 20 10:18:09 crc kubenswrapper[4962]: I0220 10:18:09.528100 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cwxt\" (UniqueName: \"kubernetes.io/projected/cca18a27-31bc-440b-a4a9-517b3323bb91-kube-api-access-5cwxt\") pod \"barbican-c46d-account-create-update-gfqts\" (UID: \"cca18a27-31bc-440b-a4a9-517b3323bb91\") " pod="openstack/barbican-c46d-account-create-update-gfqts" Feb 20 10:18:09 crc kubenswrapper[4962]: I0220 10:18:09.582704 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 20 10:18:09 crc kubenswrapper[4962]: I0220 10:18:09.603402 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-k7csj"] Feb 20 10:18:09 crc kubenswrapper[4962]: I0220 10:18:09.603715 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-metrics-k7csj" podUID="88c21489-524e-4ee7-a340-5be2573af161" containerName="openstack-network-exporter" containerID="cri-o://c9e1c05611f8961e024087e0e04491e46e765acba8a5cc8a2a36a27876de28c3" gracePeriod=30 Feb 20 10:18:09 crc kubenswrapper[4962]: I0220 10:18:09.622670 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Feb 20 10:18:09 crc kubenswrapper[4962]: I0220 10:18:09.623228 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="33d73a04-08b2-4944-861f-749a63c2565d" containerName="ovn-northd" containerID="cri-o://095ea16654e1756b3ffb7fcf3eb9dc6ba35b4333c92bf90d3619d8cb9c0062fe" gracePeriod=30 Feb 20 10:18:09 crc kubenswrapper[4962]: I0220 10:18:09.623479 4962 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="33d73a04-08b2-4944-861f-749a63c2565d" containerName="openstack-network-exporter" containerID="cri-o://0053432ef3fdc770bbcfaedc758ae1d1941eb3f0d4d0ebcb6d983082d7938453" gracePeriod=30 Feb 20 10:18:09 crc kubenswrapper[4962]: I0220 10:18:09.695046 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-wj9f6"] Feb 20 10:18:09 crc kubenswrapper[4962]: E0220 10:18:09.735706 4962 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Feb 20 10:18:09 crc kubenswrapper[4962]: E0220 10:18:09.735790 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/56a77dd3-ef10-46a6-a00d-ab38af0d4338-config-data podName:56a77dd3-ef10-46a6-a00d-ab38af0d4338 nodeName:}" failed. No retries permitted until 2026-02-20 10:18:10.235764585 +0000 UTC m=+1381.818236431 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/56a77dd3-ef10-46a6-a00d-ab38af0d4338-config-data") pod "rabbitmq-cell1-server-0" (UID: "56a77dd3-ef10-46a6-a00d-ab38af0d4338") : configmap "rabbitmq-cell1-config-data" not found Feb 20 10:18:09 crc kubenswrapper[4962]: I0220 10:18:09.835355 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-c46d-account-create-update-gfqts" Feb 20 10:18:09 crc kubenswrapper[4962]: I0220 10:18:09.904520 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-7588-account-create-update-6ttfz"] Feb 20 10:18:09 crc kubenswrapper[4962]: I0220 10:18:09.960907 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-7588-account-create-update-6ttfz"] Feb 20 10:18:10 crc kubenswrapper[4962]: I0220 10:18:10.146898 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-mk67n"] Feb 20 10:18:10 crc kubenswrapper[4962]: I0220 10:18:10.177841 4962 generic.go:334] "Generic (PLEG): container finished" podID="33d73a04-08b2-4944-861f-749a63c2565d" containerID="0053432ef3fdc770bbcfaedc758ae1d1941eb3f0d4d0ebcb6d983082d7938453" exitCode=2 Feb 20 10:18:10 crc kubenswrapper[4962]: I0220 10:18:10.177986 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"33d73a04-08b2-4944-861f-749a63c2565d","Type":"ContainerDied","Data":"0053432ef3fdc770bbcfaedc758ae1d1941eb3f0d4d0ebcb6d983082d7938453"} Feb 20 10:18:10 crc kubenswrapper[4962]: I0220 10:18:10.186585 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-mk67n"] Feb 20 10:18:10 crc kubenswrapper[4962]: I0220 10:18:10.212450 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-k7csj_88c21489-524e-4ee7-a340-5be2573af161/openstack-network-exporter/0.log" Feb 20 10:18:10 crc kubenswrapper[4962]: I0220 10:18:10.212502 4962 generic.go:334] "Generic (PLEG): container finished" podID="88c21489-524e-4ee7-a340-5be2573af161" containerID="c9e1c05611f8961e024087e0e04491e46e765acba8a5cc8a2a36a27876de28c3" exitCode=2 Feb 20 10:18:10 crc kubenswrapper[4962]: I0220 10:18:10.212536 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-k7csj" event={"ID":"88c21489-524e-4ee7-a340-5be2573af161","Type":"ContainerDied","Data":"c9e1c05611f8961e024087e0e04491e46e765acba8a5cc8a2a36a27876de28c3"} Feb 20 10:18:10 crc kubenswrapper[4962]: I0220 10:18:10.221885 4962 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-e96c-account-create-update-zd8bf"] Feb 20 10:18:10 crc kubenswrapper[4962]: I0220 10:18:10.273721 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-s4qgr"] Feb 20 10:18:10 crc kubenswrapper[4962]: I0220 10:18:10.313761 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-e96c-account-create-update-zd8bf"] Feb 20 10:18:10 crc kubenswrapper[4962]: E0220 10:18:10.316773 4962 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Feb 20 10:18:10 crc kubenswrapper[4962]: E0220 10:18:10.316838 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/56a77dd3-ef10-46a6-a00d-ab38af0d4338-config-data podName:56a77dd3-ef10-46a6-a00d-ab38af0d4338 nodeName:}" failed. No retries permitted until 2026-02-20 10:18:11.316816615 +0000 UTC m=+1382.899288461 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/56a77dd3-ef10-46a6-a00d-ab38af0d4338-config-data") pod "rabbitmq-cell1-server-0" (UID: "56a77dd3-ef10-46a6-a00d-ab38af0d4338") : configmap "rabbitmq-cell1-config-data" not found Feb 20 10:18:10 crc kubenswrapper[4962]: I0220 10:18:10.361653 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-smcqr"] Feb 20 10:18:10 crc kubenswrapper[4962]: I0220 10:18:10.403778 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-s4qgr"] Feb 20 10:18:10 crc kubenswrapper[4962]: E0220 10:18:10.457885 4962 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err="command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: " execCommand=["/usr/share/ovn/scripts/ovn-ctl","stop_controller"] containerName="ovn-controller" pod="openstack/ovn-controller-wj9f6" message=< Feb 20 10:18:10 crc kubenswrapper[4962]: Exiting ovn-controller (1) [ OK ] Feb 20 10:18:10 crc kubenswrapper[4962]: > Feb 20 10:18:10 crc kubenswrapper[4962]: E0220 10:18:10.457933 4962 kuberuntime_container.go:691] "PreStop hook failed" err="command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: " pod="openstack/ovn-controller-wj9f6" podUID="383d4f1e-72b3-48ce-9427-0361c19e41fc" containerName="ovn-controller" containerID="cri-o://d6952143bea0c9abcddc4768b2bd10fcf02f0a555e5cd8d1c565a371744060b8" Feb 20 10:18:10 crc kubenswrapper[4962]: I0220 10:18:10.457976 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-wj9f6" podUID="383d4f1e-72b3-48ce-9427-0361c19e41fc" containerName="ovn-controller" containerID="cri-o://d6952143bea0c9abcddc4768b2bd10fcf02f0a555e5cd8d1c565a371744060b8" gracePeriod=30 Feb 20 10:18:10 crc kubenswrapper[4962]: I0220 10:18:10.458413 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-695f-account-create-update-22t44"] Feb 20 10:18:10 crc kubenswrapper[4962]: I0220 10:18:10.523695 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-smcqr"] Feb 20 10:18:10 crc kubenswrapper[4962]: I0220 10:18:10.598717 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-695f-account-create-update-22t44"] Feb 20 10:18:10 crc kubenswrapper[4962]: I0220 10:18:10.621683 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 20 10:18:10 crc kubenswrapper[4962]: I0220 
10:18:10.622521 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="719faf26-7700-4eff-9dca-0a4ec3c51344" containerName="openstack-network-exporter" containerID="cri-o://a2580fff2ba1ecc29418d1a47b14ce5d8459c470e24eee4d2ebced1a648dc3a8" gracePeriod=300 Feb 20 10:18:10 crc kubenswrapper[4962]: I0220 10:18:10.657152 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58f6456c9f-hl7mw"] Feb 20 10:18:10 crc kubenswrapper[4962]: I0220 10:18:10.657694 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-58f6456c9f-hl7mw" podUID="2e4f70a2-b8ae-48cc-a098-5642fad8b040" containerName="dnsmasq-dns" containerID="cri-o://337cc3322a86ac4051b60ce8c7418dd0f1ccf4eafea40f3e9c75cc1f12e67b28" gracePeriod=10 Feb 20 10:18:10 crc kubenswrapper[4962]: I0220 10:18:10.704942 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-7729-account-create-update-dttxs"] Feb 20 10:18:10 crc kubenswrapper[4962]: I0220 10:18:10.792558 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-7729-account-create-update-dttxs"] Feb 20 10:18:10 crc kubenswrapper[4962]: I0220 10:18:10.809917 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-9gcrq"] Feb 20 10:18:10 crc kubenswrapper[4962]: I0220 10:18:10.888567 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-9gcrq"] Feb 20 10:18:10 crc kubenswrapper[4962]: I0220 10:18:10.903569 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="719faf26-7700-4eff-9dca-0a4ec3c51344" containerName="ovsdbserver-sb" containerID="cri-o://9a823554a8f72450a8956f74b11a494798fb5f7fc99300ed38421760066cc712" gracePeriod=300 Feb 20 10:18:10 crc kubenswrapper[4962]: I0220 10:18:10.919581 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-v7sjh"] Feb 20 10:18:10 crc kubenswrapper[4962]: I0220 10:18:10.923412 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-v7sjh"] Feb 20 10:18:10 crc kubenswrapper[4962]: I0220 10:18:10.952619 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 20 10:18:10 crc kubenswrapper[4962]: I0220 10:18:10.953362 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="801fa82d-0f57-4af2-9eec-b6cddac658ab" containerName="openstack-network-exporter" containerID="cri-o://e9de55d709a0309b4fcbcb74a44dfc77cc45f95d7066591c4a40dc2b0ceb9eed" gracePeriod=300 Feb 20 10:18:10 crc kubenswrapper[4962]: I0220 10:18:10.978468 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-687f4cff74-gmh4w"] Feb 20 10:18:10 crc kubenswrapper[4962]: I0220 10:18:10.978761 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-687f4cff74-gmh4w" podUID="4a879cb3-19b4-4767-8640-993cc47dc7ed" containerName="placement-log" containerID="cri-o://4f72b0b6a24968d9eab4cdfe73c03770a2ac626aa75c9f5e5a526fe72f5eea53" gracePeriod=30 Feb 20 10:18:10 crc kubenswrapper[4962]: I0220 10:18:10.981259 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-687f4cff74-gmh4w" podUID="4a879cb3-19b4-4767-8640-993cc47dc7ed" containerName="placement-api" containerID="cri-o://eba325f8f1300c477bc396da76d9efd0fdd96072accf19c1140570ee31c7b548" 
gracePeriod=30 Feb 20 10:18:10 crc kubenswrapper[4962]: I0220 10:18:10.990784 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-9mznb"] Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.030327 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.030581 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="c90d5126-d89a-42e6-9b7d-bfc53475bc56" containerName="cinder-scheduler" containerID="cri-o://f20b981aacdf6de658de3f762f39158362f94f8752f0a75fc0ae9dfa445ad0b1" gracePeriod=30 Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.034029 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="c90d5126-d89a-42e6-9b7d-bfc53475bc56" containerName="probe" containerID="cri-o://c0eb68155798173ab5bc0e3d87fda35f3734305779104c33299016d17b9b3def" gracePeriod=30 Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.084685 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-9mznb"] Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.140102 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-mm68z"] Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.232905 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="801fa82d-0f57-4af2-9eec-b6cddac658ab" containerName="ovsdbserver-nb" containerID="cri-o://b7ff4938197d4ffeb1d0dead4cb76392b4c2fbfcd796b8766f3dbd1e8efbaf48" gracePeriod=300 Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.295480 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14c237ea-eb42-49d4-90db-ee57e3b560e3" path="/var/lib/kubelet/pods/14c237ea-eb42-49d4-90db-ee57e3b560e3/volumes" Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.296865 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21296df9-6e67-4427-959d-8d67bfd1393b" path="/var/lib/kubelet/pods/21296df9-6e67-4427-959d-8d67bfd1393b/volumes" Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.297839 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e7338a7-4012-439d-b961-6ca0c55dd6e6" path="/var/lib/kubelet/pods/2e7338a7-4012-439d-b961-6ca0c55dd6e6/volumes" Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.298425 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1" path="/var/lib/kubelet/pods/3d5b927e-e69b-4f2d-b8e6-de43bab2f6f1/volumes" Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.302001 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4feedd65-778f-471c-a2bf-23af2e459685" path="/var/lib/kubelet/pods/4feedd65-778f-471c-a2bf-23af2e459685/volumes" Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.302750 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b114dbd-1f72-42c9-97c1-43795d1cf1ea" path="/var/lib/kubelet/pods/6b114dbd-1f72-42c9-97c1-43795d1cf1ea/volumes" Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.305055 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e2005e0-31d4-408f-8c66-187a6dd37bcd" path="/var/lib/kubelet/pods/7e2005e0-31d4-408f-8c66-187a6dd37bcd/volumes" Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.351159 4962 reflector.go:368] Caches 
populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.382059 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84f50d98-6178-44d4-8ac4-43a8df4e3339" path="/var/lib/kubelet/pods/84f50d98-6178-44d4-8ac4-43a8df4e3339/volumes" Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.385179 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85565888-6622-4dfc-9198-8e9c5b05cc75" path="/var/lib/kubelet/pods/85565888-6622-4dfc-9198-8e9c5b05cc75/volumes" Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.386090 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97e25820-62eb-4ad9-92ad-471c2f0f7ed4" path="/var/lib/kubelet/pods/97e25820-62eb-4ad9-92ad-471c2f0f7ed4/volumes" Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.387415 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="afbf9dd3-3bb5-4908-aad0-d06f09946e17" path="/var/lib/kubelet/pods/afbf9dd3-3bb5-4908-aad0-d06f09946e17/volumes" Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.388428 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_719faf26-7700-4eff-9dca-0a4ec3c51344/ovsdbserver-sb/0.log" Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.388544 4962 generic.go:334] "Generic (PLEG): container finished" podID="719faf26-7700-4eff-9dca-0a4ec3c51344" containerID="a2580fff2ba1ecc29418d1a47b14ce5d8459c470e24eee4d2ebced1a648dc3a8" exitCode=2 Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.388701 4962 generic.go:334] "Generic (PLEG): container finished" podID="719faf26-7700-4eff-9dca-0a4ec3c51344" containerID="9a823554a8f72450a8956f74b11a494798fb5f7fc99300ed38421760066cc712" exitCode=143 Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.388895 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d970dac6-1948-42dd-b5d9-c5df1b04e30d" path="/var/lib/kubelet/pods/d970dac6-1948-42dd-b5d9-c5df1b04e30d/volumes" Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.392805 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-a33d-account-create-update-6q8g4"] Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.392857 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-mm68z"] Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.393010 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-6f6vb" event={"ID":"812fea74-e4e5-4550-8a20-8fe04752a016","Type":"ContainerStarted","Data":"b35105a6f1f09300973fb51f5cc2ceed7e4acc42cd81be4a5215ef08b873fcd8"} Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.393055 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-a33d-account-create-update-6q8g4"] Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.393096 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"719faf26-7700-4eff-9dca-0a4ec3c51344","Type":"ContainerDied","Data":"a2580fff2ba1ecc29418d1a47b14ce5d8459c470e24eee4d2ebced1a648dc3a8"} Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.393158 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"719faf26-7700-4eff-9dca-0a4ec3c51344","Type":"ContainerDied","Data":"9a823554a8f72450a8956f74b11a494798fb5f7fc99300ed38421760066cc712"} Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 
10:18:11.393179 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-kjq4f"] Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.393205 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-kjq4f"] Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.393222 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.393251 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.393277 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.393292 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-h5ptn"] Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.393307 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-h5ptn"] Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.393337 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5dfd6b5f7f-dkfsl"] Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.393356 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.393376 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-zfmzb"] Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.393391 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-d8b3-account-create-update-br2xj"] Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.393404 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-zfmzb"] Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.394038 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="ba9a9d46-9ba9-428c-8864-a8db8bca2b57" containerName="glance-log" containerID="cri-o://2063db6c0681c99c5af22bd280759565fe6f153460080e6e822a7af9e9e7ff12" gracePeriod=30 Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.394426 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="ba9a9d46-9ba9-428c-8864-a8db8bca2b57" containerName="glance-httpd" containerID="cri-o://fd4f315997ddf00a356a9ec5e5c2864b8fa25408200a3b8ba03172b2cebc87ed" gracePeriod=30 Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.395818 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="89dbdc4c-bf31-402e-b5bf-e8bbb8c16172" containerName="cinder-api" containerID="cri-o://7c19f6ab819e8b088592bd7831817812900bca1c0cc3649a9662bfcc1aa1ae48" gracePeriod=30 Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.395977 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5dfd6b5f7f-dkfsl" podUID="a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3" containerName="neutron-httpd" containerID="cri-o://731c2e1dae94781e12c80ac05ffd0b3634529739ec574c2b3459d53ff4dd175f" gracePeriod=30 Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.395581 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="89dbdc4c-bf31-402e-b5bf-e8bbb8c16172" containerName="cinder-api-log" 
containerID="cri-o://aa52f40e409ac825205d183f70f7cf56df81e106f777a2fe46a3166fb938361b" gracePeriod=30 Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.397973 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5dfd6b5f7f-dkfsl" podUID="a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3" containerName="neutron-api" containerID="cri-o://45e715a9f15469232fd9eda659480065c452b6d474e0d50459f16eb16fcf18e3" gracePeriod=30 Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.401652 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="object-server" containerID="cri-o://b6ead0e1bdda64a7399139dd6191cc696b570349bf204a2ab46ce0d182cc49a9" gracePeriod=30 Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.401855 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="swift-recon-cron" containerID="cri-o://63c4d35ae203bd5ac342fa6d490352730d135f847a680bbe15aae0fe53059141" gracePeriod=30 Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.401921 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="rsync" containerID="cri-o://3c297c5e3426f0b38076ba12a36de8e42599c1ec9b371d1d4ac3dc87d286fdac" gracePeriod=30 Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.402017 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="object-expirer" containerID="cri-o://6460038d74df47b4bd5e8f877737b675fdcc51257f17732080e42ee0a1e7dfa6" gracePeriod=30 Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.402348 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="object-auditor" containerID="cri-o://3f89270dd151567356dcd4569c268792d8ce043f1e81df07ebe5f55f65531bca" gracePeriod=30 Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.402418 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="object-replicator" containerID="cri-o://05aae6f36e27022f7b4fa526f1265b47aeb3c166ab95c682c5b8f4ac82205eff" gracePeriod=30 Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.402077 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="object-updater" containerID="cri-o://3108da3bf591571013cc25e1b8f1de0c827e10b04d9686bc5e1fb47bc9778731" gracePeriod=30 Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.402618 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="container-server" containerID="cri-o://6727a65f145335bf540a7898aeabecb549d8d22b6c9a1c79a91620a5e8e3e3f8" gracePeriod=30 Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.404247 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="account-auditor" 
containerID="cri-o://5d9d68ccd50ca26ce3191d56dc735011eb169a68e6eedc3144c97564be0ff601" gracePeriod=30 Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.404389 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="account-reaper" containerID="cri-o://1b0e56a8482d960b0917a1f3004c6a015099a8313a0f5c4fbb4d166f9d4ea11c" gracePeriod=30 Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.404473 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="account-replicator" containerID="cri-o://066dce8eb5ee2a5ee4696fbdc5642875edc121ec4465ea32468ecf8aba5fbe36" gracePeriod=30 Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.404611 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="container-auditor" containerID="cri-o://4bc06842128d6fdcb6b37354d4c5aad1c3642acbd05e513b28a95e6f19bab1ca" gracePeriod=30 Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.404669 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="container-updater" containerID="cri-o://87c786369d8da7650fca3be3c67f9a8decb0d8fd88429ab357e31f9e7c19f3e0" gracePeriod=30 Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.404732 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="container-replicator" containerID="cri-o://8395eb871539c46360c6d66fb96850aeed91819306e7873acf83b98b89a956d8" gracePeriod=30 Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.439113 4962 generic.go:334] "Generic (PLEG): container finished" podID="2e4f70a2-b8ae-48cc-a098-5642fad8b040" containerID="337cc3322a86ac4051b60ce8c7418dd0f1ccf4eafea40f3e9c75cc1f12e67b28" exitCode=0 Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.439269 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58f6456c9f-hl7mw" event={"ID":"2e4f70a2-b8ae-48cc-a098-5642fad8b040","Type":"ContainerDied","Data":"337cc3322a86ac4051b60ce8c7418dd0f1ccf4eafea40f3e9c75cc1f12e67b28"} Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.458628 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="account-server" containerID="cri-o://138e05b5e05f4d5ae28d62c69c931e5b6907fd9792450f37e652add9de1e83a1" gracePeriod=30 Feb 20 10:18:11 crc kubenswrapper[4962]: E0220 10:18:11.460302 4962 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Feb 20 10:18:11 crc kubenswrapper[4962]: E0220 10:18:11.460349 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/56a77dd3-ef10-46a6-a00d-ab38af0d4338-config-data podName:56a77dd3-ef10-46a6-a00d-ab38af0d4338 nodeName:}" failed. No retries permitted until 2026-02-20 10:18:13.460331583 +0000 UTC m=+1385.042803419 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/56a77dd3-ef10-46a6-a00d-ab38af0d4338-config-data") pod "rabbitmq-cell1-server-0" (UID: "56a77dd3-ef10-46a6-a00d-ab38af0d4338") : configmap "rabbitmq-cell1-config-data" not found Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.475295 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-r7g9h" podUID="8e8425d5-32be-4726-915a-3de5c70f0f62" containerName="ovs-vswitchd" containerID="cri-o://fbca6026ebd221992e1ebc24844b7bb1692f49e72896c063a823730a2cadaf38" gracePeriod=29 Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.475575 4962 generic.go:334] "Generic (PLEG): container finished" podID="4a879cb3-19b4-4767-8640-993cc47dc7ed" containerID="4f72b0b6a24968d9eab4cdfe73c03770a2ac626aa75c9f5e5a526fe72f5eea53" exitCode=143 Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.475648 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-687f4cff74-gmh4w" event={"ID":"4a879cb3-19b4-4767-8640-993cc47dc7ed","Type":"ContainerDied","Data":"4f72b0b6a24968d9eab4cdfe73c03770a2ac626aa75c9f5e5a526fe72f5eea53"} Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.512416 4962 generic.go:334] "Generic (PLEG): container finished" podID="755ca463-8c62-402c-8a88-a066fb38b521" containerID="58314faa8bcfe5f5f7afbcc99e392370d5f2737c5567814db10eda41512d6621" exitCode=137 Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.515842 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-d8b3-account-create-update-br2xj"] Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.537658 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-2m8r7"] Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.546000 4962 generic.go:334] "Generic (PLEG): container finished" podID="383d4f1e-72b3-48ce-9427-0361c19e41fc" containerID="d6952143bea0c9abcddc4768b2bd10fcf02f0a555e5cd8d1c565a371744060b8" exitCode=0 Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.546483 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-wj9f6" event={"ID":"383d4f1e-72b3-48ce-9427-0361c19e41fc","Type":"ContainerDied","Data":"d6952143bea0c9abcddc4768b2bd10fcf02f0a555e5cd8d1c565a371744060b8"} Feb 20 10:18:11 crc kubenswrapper[4962]: E0220 10:18:11.578152 4962 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Feb 20 10:18:11 crc kubenswrapper[4962]: E0220 10:18:11.578228 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2a8d652d-aea8-4a83-b33e-0d2522af0be8-config-data podName:2a8d652d-aea8-4a83-b33e-0d2522af0be8 nodeName:}" failed. No retries permitted until 2026-02-20 10:18:12.078204816 +0000 UTC m=+1383.660676662 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/2a8d652d-aea8-4a83-b33e-0d2522af0be8-config-data") pod "rabbitmq-server-0" (UID: "2a8d652d-aea8-4a83-b33e-0d2522af0be8") : configmap "rabbitmq-config-data" not found Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.585984 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-2m8r7"] Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.588819 4962 generic.go:334] "Generic (PLEG): container finished" podID="801fa82d-0f57-4af2-9eec-b6cddac658ab" containerID="e9de55d709a0309b4fcbcb74a44dfc77cc45f95d7066591c4a40dc2b0ceb9eed" exitCode=2 Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.588875 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"801fa82d-0f57-4af2-9eec-b6cddac658ab","Type":"ContainerDied","Data":"e9de55d709a0309b4fcbcb74a44dfc77cc45f95d7066591c4a40dc2b0ceb9eed"} Feb 20 10:18:11 crc kubenswrapper[4962]: E0220 10:18:11.623440 4962 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Feb 20 10:18:11 crc kubenswrapper[4962]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Feb 20 10:18:11 crc kubenswrapper[4962]: + source /usr/local/bin/container-scripts/functions Feb 20 10:18:11 crc kubenswrapper[4962]: ++ OVNBridge=br-int Feb 20 10:18:11 crc kubenswrapper[4962]: ++ OVNRemote=tcp:localhost:6642 Feb 20 10:18:11 crc kubenswrapper[4962]: ++ OVNEncapType=geneve Feb 20 10:18:11 crc kubenswrapper[4962]: ++ OVNAvailabilityZones= Feb 20 10:18:11 crc kubenswrapper[4962]: ++ EnableChassisAsGateway=true Feb 20 10:18:11 crc kubenswrapper[4962]: ++ PhysicalNetworks= Feb 20 10:18:11 crc kubenswrapper[4962]: ++ OVNHostName= Feb 20 10:18:11 crc kubenswrapper[4962]: ++ DB_FILE=/etc/openvswitch/conf.db Feb 20 10:18:11 crc kubenswrapper[4962]: ++ ovs_dir=/var/lib/openvswitch Feb 20 10:18:11 crc kubenswrapper[4962]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Feb 20 10:18:11 crc kubenswrapper[4962]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Feb 20 10:18:11 crc kubenswrapper[4962]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Feb 20 10:18:11 crc kubenswrapper[4962]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Feb 20 10:18:11 crc kubenswrapper[4962]: + sleep 0.5 Feb 20 10:18:11 crc kubenswrapper[4962]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Feb 20 10:18:11 crc kubenswrapper[4962]: + sleep 0.5 Feb 20 10:18:11 crc kubenswrapper[4962]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Feb 20 10:18:11 crc kubenswrapper[4962]: + sleep 0.5 Feb 20 10:18:11 crc kubenswrapper[4962]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Feb 20 10:18:11 crc kubenswrapper[4962]: + cleanup_ovsdb_server_semaphore Feb 20 10:18:11 crc kubenswrapper[4962]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Feb 20 10:18:11 crc kubenswrapper[4962]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Feb 20 10:18:11 crc kubenswrapper[4962]: > execCommand=["/usr/local/bin/container-scripts/stop-ovsdb-server.sh"] containerName="ovsdb-server" pod="openstack/ovn-controller-ovs-r7g9h" message=< Feb 20 10:18:11 crc kubenswrapper[4962]: Exiting ovsdb-server (5) [ OK ] Feb 20 10:18:11 crc kubenswrapper[4962]: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Feb 20 10:18:11 crc kubenswrapper[4962]: + source /usr/local/bin/container-scripts/functions Feb 20 10:18:11 crc kubenswrapper[4962]: ++ OVNBridge=br-int Feb 20 10:18:11 crc kubenswrapper[4962]: ++ OVNRemote=tcp:localhost:6642 Feb 20 10:18:11 crc kubenswrapper[4962]: ++ OVNEncapType=geneve Feb 20 10:18:11 crc kubenswrapper[4962]: ++ OVNAvailabilityZones= Feb 20 10:18:11 crc kubenswrapper[4962]: ++ EnableChassisAsGateway=true Feb 20 10:18:11 crc kubenswrapper[4962]: ++ PhysicalNetworks= Feb 20 10:18:11 crc kubenswrapper[4962]: ++ OVNHostName= Feb 20 10:18:11 crc kubenswrapper[4962]: ++ DB_FILE=/etc/openvswitch/conf.db Feb 20 10:18:11 crc kubenswrapper[4962]: ++ ovs_dir=/var/lib/openvswitch Feb 20 10:18:11 crc kubenswrapper[4962]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Feb 20 10:18:11 crc kubenswrapper[4962]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Feb 20 10:18:11 crc kubenswrapper[4962]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Feb 20 10:18:11 crc kubenswrapper[4962]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Feb 20 10:18:11 crc kubenswrapper[4962]: + sleep 0.5 Feb 20 10:18:11 crc kubenswrapper[4962]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Feb 20 10:18:11 crc kubenswrapper[4962]: + sleep 0.5 Feb 20 10:18:11 crc kubenswrapper[4962]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Feb 20 10:18:11 crc kubenswrapper[4962]: + sleep 0.5 Feb 20 10:18:11 crc kubenswrapper[4962]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Feb 20 10:18:11 crc kubenswrapper[4962]: + cleanup_ovsdb_server_semaphore Feb 20 10:18:11 crc kubenswrapper[4962]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Feb 20 10:18:11 crc kubenswrapper[4962]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Feb 20 10:18:11 crc kubenswrapper[4962]: > Feb 20 10:18:11 crc kubenswrapper[4962]: E0220 10:18:11.623503 4962 kuberuntime_container.go:691] "PreStop hook failed" err=< Feb 20 10:18:11 crc kubenswrapper[4962]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Feb 20 10:18:11 crc kubenswrapper[4962]: + source /usr/local/bin/container-scripts/functions Feb 20 10:18:11 crc kubenswrapper[4962]: ++ OVNBridge=br-int Feb 20 10:18:11 crc kubenswrapper[4962]: ++ OVNRemote=tcp:localhost:6642 Feb 20 10:18:11 crc kubenswrapper[4962]: ++ OVNEncapType=geneve Feb 20 10:18:11 crc kubenswrapper[4962]: ++ OVNAvailabilityZones= Feb 20 10:18:11 crc kubenswrapper[4962]: ++ EnableChassisAsGateway=true Feb 20 10:18:11 crc kubenswrapper[4962]: ++ PhysicalNetworks= Feb 20 10:18:11 crc kubenswrapper[4962]: ++ OVNHostName= Feb 20 10:18:11 crc kubenswrapper[4962]: ++ DB_FILE=/etc/openvswitch/conf.db Feb 20 10:18:11 crc kubenswrapper[4962]: ++ ovs_dir=/var/lib/openvswitch Feb 20 10:18:11 crc kubenswrapper[4962]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Feb 20 10:18:11 crc kubenswrapper[4962]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Feb 20 10:18:11 crc kubenswrapper[4962]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Feb 20 10:18:11 crc kubenswrapper[4962]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Feb 20 10:18:11 crc kubenswrapper[4962]: + sleep 0.5 Feb 20 10:18:11 crc kubenswrapper[4962]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Feb 20 10:18:11 crc kubenswrapper[4962]: + sleep 0.5 Feb 20 10:18:11 crc kubenswrapper[4962]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Feb 20 10:18:11 crc kubenswrapper[4962]: + sleep 0.5 Feb 20 10:18:11 crc kubenswrapper[4962]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Feb 20 10:18:11 crc kubenswrapper[4962]: + cleanup_ovsdb_server_semaphore Feb 20 10:18:11 crc kubenswrapper[4962]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Feb 20 10:18:11 crc kubenswrapper[4962]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Feb 20 10:18:11 crc kubenswrapper[4962]: > pod="openstack/ovn-controller-ovs-r7g9h" podUID="8e8425d5-32be-4726-915a-3de5c70f0f62" containerName="ovsdb-server" containerID="cri-o://0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2" Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.623542 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-r7g9h" podUID="8e8425d5-32be-4726-915a-3de5c70f0f62" containerName="ovsdb-server" containerID="cri-o://0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2" gracePeriod=28 Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.628889 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-k7csj_88c21489-524e-4ee7-a340-5be2573af161/openstack-network-exporter/0.log" Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.628944 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-k7csj" Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.638434 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.645579 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="4f4a409a-4230-42ca-bfcc-f014064cbc6c" containerName="glance-httpd" containerID="cri-o://f92044b60ad417db828d85a5c41a02658d594ecad6f7c6c0f3f8b1bce358c93f" gracePeriod=30 Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.646136 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="4f4a409a-4230-42ca-bfcc-f014064cbc6c" containerName="glance-log" containerID="cri-o://c5f1f67dc9e07d9eeb3bb7bd374b8b7f7c3676f58bea6766635a8f614df5e26b" gracePeriod=30 Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.699484 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-6b8479d945-8wsh9"] Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.699769 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-6b8479d945-8wsh9" podUID="7f35bada-015d-4051-9976-d5dfe3a93216" containerName="barbican-worker-log" containerID="cri-o://1a117c325a572e0a4fee70e6f72cca84b0d93bdf09ce042ac50994ca64fd3520" gracePeriod=30 Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.700194 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-6b8479d945-8wsh9" podUID="7f35bada-015d-4051-9976-d5dfe3a93216" containerName="barbican-worker" containerID="cri-o://d1cb3b1837bc14d4bc8b54604fff4b13e755f0b6500bf206c46f6f5569e5c26a" gracePeriod=30 Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.704184 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-wj9f6" Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.718712 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-4hwp2"] Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.725841 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-4hwp2"] Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.731495 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-569d5979d6-xzr2q"] Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.731758 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-569d5979d6-xzr2q" podUID="28437fcd-377a-4b9e-9a28-e01c21e2ad1f" containerName="barbican-keystone-listener-log" containerID="cri-o://6cbbafaf6ad06d0f58cf79b2da64a294b16c2b2e6931344860d8ecda539fe7b2" gracePeriod=30 Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.731880 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-569d5979d6-xzr2q" podUID="28437fcd-377a-4b9e-9a28-e01c21e2ad1f" containerName="barbican-keystone-listener" containerID="cri-o://5debc339fcb891cc07e7fa0a7db99fb7f297c28473a143743938f4792107d27c" gracePeriod=30 Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.745291 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.745675 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="241dc417-3176-4051-ad4e-d98f4f66ddc2" containerName="nova-api-log" containerID="cri-o://d6bf8640027e8b75225f36e2b4a5d790818a0e4259c4c5012d627c79a493efb3" gracePeriod=30 Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.745862 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="241dc417-3176-4051-ad4e-d98f4f66ddc2" containerName="nova-api-api" containerID="cri-o://42f33c3ac4e84257c4f38d060186abe1300d7dfb20f8894c1b519bb38d1529c9" gracePeriod=30 Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.753622 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-6f6vb"] Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.767061 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-84464996cb-fhnvz"] Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.767301 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-84464996cb-fhnvz" podUID="10c1a487-1a74-4994-9b39-f05cbe0fa5c7" containerName="barbican-api-log" containerID="cri-o://a294bb381baa23da8817ec86f599da2c728e47f08a22b5ce88cb75ec5dd531c2" gracePeriod=30 Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.768284 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-84464996cb-fhnvz" podUID="10c1a487-1a74-4994-9b39-f05cbe0fa5c7" containerName="barbican-api" containerID="cri-o://e01ecca1dc871afc69109d7af822c9e7b8f02440c8c8b1e92b5ae942c411e515" gracePeriod=30 Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.774215 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.783207 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" 
(UniqueName: \"kubernetes.io/host-path/88c21489-524e-4ee7-a340-5be2573af161-ovn-rundir\") pod \"88c21489-524e-4ee7-a340-5be2573af161\" (UID: \"88c21489-524e-4ee7-a340-5be2573af161\") " Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.783382 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88c21489-524e-4ee7-a340-5be2573af161-combined-ca-bundle\") pod \"88c21489-524e-4ee7-a340-5be2573af161\" (UID: \"88c21489-524e-4ee7-a340-5be2573af161\") " Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.783477 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88c21489-524e-4ee7-a340-5be2573af161-config\") pod \"88c21489-524e-4ee7-a340-5be2573af161\" (UID: \"88c21489-524e-4ee7-a340-5be2573af161\") " Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.783499 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/88c21489-524e-4ee7-a340-5be2573af161-metrics-certs-tls-certs\") pod \"88c21489-524e-4ee7-a340-5be2573af161\" (UID: \"88c21489-524e-4ee7-a340-5be2573af161\") " Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.783555 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/88c21489-524e-4ee7-a340-5be2573af161-ovs-rundir\") pod \"88c21489-524e-4ee7-a340-5be2573af161\" (UID: \"88c21489-524e-4ee7-a340-5be2573af161\") " Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.783675 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxgrm\" (UniqueName: \"kubernetes.io/projected/88c21489-524e-4ee7-a340-5be2573af161-kube-api-access-wxgrm\") pod \"88c21489-524e-4ee7-a340-5be2573af161\" (UID: \"88c21489-524e-4ee7-a340-5be2573af161\") " Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.784692 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.785012 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="ca793428-98ed-4f82-aa57-31d6671d546c" containerName="nova-metadata-log" containerID="cri-o://fdc035dec22a8cb1cbe15ddbb643e583e6ad19e8deec930029ff3031763b1c89" gracePeriod=30 Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.785627 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="ca793428-98ed-4f82-aa57-31d6671d546c" containerName="nova-metadata-metadata" containerID="cri-o://2fdedd716304d48ca972e72c6c0a4e94560cd57ce8c5b0409e88600b50604c0b" gracePeriod=30 Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.786678 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/88c21489-524e-4ee7-a340-5be2573af161-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "88c21489-524e-4ee7-a340-5be2573af161" (UID: "88c21489-524e-4ee7-a340-5be2573af161"). InnerVolumeSpecName "ovn-rundir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.788135 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88c21489-524e-4ee7-a340-5be2573af161-config" (OuterVolumeSpecName: "config") pod "88c21489-524e-4ee7-a340-5be2573af161" (UID: "88c21489-524e-4ee7-a340-5be2573af161"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.788202 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/88c21489-524e-4ee7-a340-5be2573af161-ovs-rundir" (OuterVolumeSpecName: "ovs-rundir") pod "88c21489-524e-4ee7-a340-5be2573af161" (UID: "88c21489-524e-4ee7-a340-5be2573af161"). InnerVolumeSpecName "ovs-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.798563 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-758kd"] Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.808933 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-758kd"] Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.823102 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-c46d-account-create-update-gfqts"] Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.834839 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.854177 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88c21489-524e-4ee7-a340-5be2573af161-kube-api-access-wxgrm" (OuterVolumeSpecName: "kube-api-access-wxgrm") pod "88c21489-524e-4ee7-a340-5be2573af161" (UID: "88c21489-524e-4ee7-a340-5be2573af161"). InnerVolumeSpecName "kube-api-access-wxgrm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.888645 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/383d4f1e-72b3-48ce-9427-0361c19e41fc-var-run-ovn\") pod \"383d4f1e-72b3-48ce-9427-0361c19e41fc\" (UID: \"383d4f1e-72b3-48ce-9427-0361c19e41fc\") " Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.888717 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l52h9\" (UniqueName: \"kubernetes.io/projected/383d4f1e-72b3-48ce-9427-0361c19e41fc-kube-api-access-l52h9\") pod \"383d4f1e-72b3-48ce-9427-0361c19e41fc\" (UID: \"383d4f1e-72b3-48ce-9427-0361c19e41fc\") " Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.888794 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/383d4f1e-72b3-48ce-9427-0361c19e41fc-combined-ca-bundle\") pod \"383d4f1e-72b3-48ce-9427-0361c19e41fc\" (UID: \"383d4f1e-72b3-48ce-9427-0361c19e41fc\") " Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.888865 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/383d4f1e-72b3-48ce-9427-0361c19e41fc-var-log-ovn\") pod \"383d4f1e-72b3-48ce-9427-0361c19e41fc\" (UID: \"383d4f1e-72b3-48ce-9427-0361c19e41fc\") " Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.888911 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/383d4f1e-72b3-48ce-9427-0361c19e41fc-scripts\") pod \"383d4f1e-72b3-48ce-9427-0361c19e41fc\" (UID: \"383d4f1e-72b3-48ce-9427-0361c19e41fc\") " Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.888974 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/383d4f1e-72b3-48ce-9427-0361c19e41fc-ovn-controller-tls-certs\") pod \"383d4f1e-72b3-48ce-9427-0361c19e41fc\" (UID: \"383d4f1e-72b3-48ce-9427-0361c19e41fc\") " Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.889027 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/383d4f1e-72b3-48ce-9427-0361c19e41fc-var-run\") pod \"383d4f1e-72b3-48ce-9427-0361c19e41fc\" (UID: \"383d4f1e-72b3-48ce-9427-0361c19e41fc\") " Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.893095 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/383d4f1e-72b3-48ce-9427-0361c19e41fc-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "383d4f1e-72b3-48ce-9427-0361c19e41fc" (UID: "383d4f1e-72b3-48ce-9427-0361c19e41fc"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.893137 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/383d4f1e-72b3-48ce-9427-0361c19e41fc-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "383d4f1e-72b3-48ce-9427-0361c19e41fc" (UID: "383d4f1e-72b3-48ce-9427-0361c19e41fc"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.894188 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/383d4f1e-72b3-48ce-9427-0361c19e41fc-var-run" (OuterVolumeSpecName: "var-run") pod "383d4f1e-72b3-48ce-9427-0361c19e41fc" (UID: "383d4f1e-72b3-48ce-9427-0361c19e41fc"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.894350 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88c21489-524e-4ee7-a340-5be2573af161-config\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.894362 4962 reconciler_common.go:293] "Volume detached for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/88c21489-524e-4ee7-a340-5be2573af161-ovs-rundir\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.894374 4962 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/383d4f1e-72b3-48ce-9427-0361c19e41fc-var-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.894385 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxgrm\" (UniqueName: \"kubernetes.io/projected/88c21489-524e-4ee7-a340-5be2573af161-kube-api-access-wxgrm\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.894396 4962 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/383d4f1e-72b3-48ce-9427-0361c19e41fc-var-log-ovn\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.894406 4962 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/88c21489-524e-4ee7-a340-5be2573af161-ovn-rundir\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.894415 4962 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/383d4f1e-72b3-48ce-9427-0361c19e41fc-var-run\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.895146 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/383d4f1e-72b3-48ce-9427-0361c19e41fc-scripts" (OuterVolumeSpecName: "scripts") pod "383d4f1e-72b3-48ce-9427-0361c19e41fc" (UID: "383d4f1e-72b3-48ce-9427-0361c19e41fc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.906185 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/383d4f1e-72b3-48ce-9427-0361c19e41fc-kube-api-access-l52h9" (OuterVolumeSpecName: "kube-api-access-l52h9") pod "383d4f1e-72b3-48ce-9427-0361c19e41fc" (UID: "383d4f1e-72b3-48ce-9427-0361c19e41fc"). InnerVolumeSpecName "kube-api-access-l52h9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.929743 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-tbn8g"] Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.939329 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-tbn8g"] Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.944263 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/383d4f1e-72b3-48ce-9427-0361c19e41fc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "383d4f1e-72b3-48ce-9427-0361c19e41fc" (UID: "383d4f1e-72b3-48ce-9427-0361c19e41fc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.948758 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-xnwmz"] Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.957955 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-xnwmz"] Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.966032 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.966863 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://aa9b7812a81805d3c1d048e75378c2e89e7f075bbe36af5665b4416075da7b83" gracePeriod=30 Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.980387 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-9xxwl"] Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.996922 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-9xxwl"] Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.998519 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l52h9\" (UniqueName: \"kubernetes.io/projected/383d4f1e-72b3-48ce-9427-0361c19e41fc-kube-api-access-l52h9\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.998549 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/383d4f1e-72b3-48ce-9427-0361c19e41fc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:11 crc kubenswrapper[4962]: I0220 10:18:11.998559 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/383d4f1e-72b3-48ce-9427-0361c19e41fc-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.008723 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.009019 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="dcd02115-2eb9-4090-8225-108c3a8cad20" containerName="nova-scheduler-scheduler" containerID="cri-o://24e611c94f3db833be2f4d2218a68d358affbfa3d1fc3a15c508caceb7974666" gracePeriod=30 Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.018574 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/383d4f1e-72b3-48ce-9427-0361c19e41fc-ovn-controller-tls-certs" 
(OuterVolumeSpecName: "ovn-controller-tls-certs") pod "383d4f1e-72b3-48ce-9427-0361c19e41fc" (UID: "383d4f1e-72b3-48ce-9427-0361c19e41fc"). InnerVolumeSpecName "ovn-controller-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.018665 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88c21489-524e-4ee7-a340-5be2573af161-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "88c21489-524e-4ee7-a340-5be2573af161" (UID: "88c21489-524e-4ee7-a340-5be2573af161"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.052896 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.072233 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="56a77dd3-ef10-46a6-a00d-ab38af0d4338" containerName="rabbitmq" containerID="cri-o://89f21e0f9ed8c4de881b1add4cca2f3108cbffd0cc9fe288bcc483e30d1f1718" gracePeriod=604800 Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.108360 4962 reconciler_common.go:293] "Volume detached for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/383d4f1e-72b3-48ce-9427-0361c19e41fc-ovn-controller-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.108397 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88c21489-524e-4ee7-a340-5be2573af161-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:12 crc kubenswrapper[4962]: E0220 10:18:12.108674 4962 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Feb 20 10:18:12 crc kubenswrapper[4962]: E0220 10:18:12.108738 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2a8d652d-aea8-4a83-b33e-0d2522af0be8-config-data podName:2a8d652d-aea8-4a83-b33e-0d2522af0be8 nodeName:}" failed. No retries permitted until 2026-02-20 10:18:13.108719175 +0000 UTC m=+1384.691191011 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/2a8d652d-aea8-4a83-b33e-0d2522af0be8-config-data") pod "rabbitmq-server-0" (UID: "2a8d652d-aea8-4a83-b33e-0d2522af0be8") : configmap "rabbitmq-config-data" not found Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.148820 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.164438 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88c21489-524e-4ee7-a340-5be2573af161-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "88c21489-524e-4ee7-a340-5be2573af161" (UID: "88c21489-524e-4ee7-a340-5be2573af161"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.212833 4962 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/88c21489-524e-4ee7-a340-5be2573af161-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.262132 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58f6456c9f-hl7mw" Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.278801 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-cell1-galera-0" podUID="8dd889b7-1b72-4e57-ad0f-85facbad8da4" containerName="galera" containerID="cri-o://a0c7e79c3d9e295ee82e5ea9e8238010da77018553646accab9b41ab9dfe22b6" gracePeriod=30 Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.314766 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qdngd\" (UniqueName: \"kubernetes.io/projected/755ca463-8c62-402c-8a88-a066fb38b521-kube-api-access-qdngd\") pod \"755ca463-8c62-402c-8a88-a066fb38b521\" (UID: \"755ca463-8c62-402c-8a88-a066fb38b521\") " Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.314891 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/755ca463-8c62-402c-8a88-a066fb38b521-openstack-config-secret\") pod \"755ca463-8c62-402c-8a88-a066fb38b521\" (UID: \"755ca463-8c62-402c-8a88-a066fb38b521\") " Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.315041 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/755ca463-8c62-402c-8a88-a066fb38b521-openstack-config\") pod \"755ca463-8c62-402c-8a88-a066fb38b521\" (UID: \"755ca463-8c62-402c-8a88-a066fb38b521\") " Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.315182 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/755ca463-8c62-402c-8a88-a066fb38b521-combined-ca-bundle\") pod \"755ca463-8c62-402c-8a88-a066fb38b521\" (UID: \"755ca463-8c62-402c-8a88-a066fb38b521\") " Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.340047 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/755ca463-8c62-402c-8a88-a066fb38b521-kube-api-access-qdngd" (OuterVolumeSpecName: "kube-api-access-qdngd") pod "755ca463-8c62-402c-8a88-a066fb38b521" (UID: "755ca463-8c62-402c-8a88-a066fb38b521"). InnerVolumeSpecName "kube-api-access-qdngd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.417355 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/755ca463-8c62-402c-8a88-a066fb38b521-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "755ca463-8c62-402c-8a88-a066fb38b521" (UID: "755ca463-8c62-402c-8a88-a066fb38b521"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.417642 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e4f70a2-b8ae-48cc-a098-5642fad8b040-config\") pod \"2e4f70a2-b8ae-48cc-a098-5642fad8b040\" (UID: \"2e4f70a2-b8ae-48cc-a098-5642fad8b040\") " Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.418024 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2e4f70a2-b8ae-48cc-a098-5642fad8b040-ovsdbserver-sb\") pod \"2e4f70a2-b8ae-48cc-a098-5642fad8b040\" (UID: \"2e4f70a2-b8ae-48cc-a098-5642fad8b040\") " Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.418274 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/755ca463-8c62-402c-8a88-a066fb38b521-openstack-config\") pod \"755ca463-8c62-402c-8a88-a066fb38b521\" (UID: \"755ca463-8c62-402c-8a88-a066fb38b521\") " Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.418946 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2e4f70a2-b8ae-48cc-a098-5642fad8b040-ovsdbserver-nb\") pod \"2e4f70a2-b8ae-48cc-a098-5642fad8b040\" (UID: \"2e4f70a2-b8ae-48cc-a098-5642fad8b040\") " Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.419569 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2e4f70a2-b8ae-48cc-a098-5642fad8b040-dns-swift-storage-0\") pod \"2e4f70a2-b8ae-48cc-a098-5642fad8b040\" (UID: \"2e4f70a2-b8ae-48cc-a098-5642fad8b040\") " Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.419815 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2e4f70a2-b8ae-48cc-a098-5642fad8b040-dns-svc\") pod \"2e4f70a2-b8ae-48cc-a098-5642fad8b040\" (UID: \"2e4f70a2-b8ae-48cc-a098-5642fad8b040\") " Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.423335 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nss8l\" (UniqueName: \"kubernetes.io/projected/2e4f70a2-b8ae-48cc-a098-5642fad8b040-kube-api-access-nss8l\") pod \"2e4f70a2-b8ae-48cc-a098-5642fad8b040\" (UID: \"2e4f70a2-b8ae-48cc-a098-5642fad8b040\") " Feb 20 10:18:12 crc kubenswrapper[4962]: W0220 10:18:12.418881 4962 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/755ca463-8c62-402c-8a88-a066fb38b521/volumes/kubernetes.io~configmap/openstack-config Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.428836 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/755ca463-8c62-402c-8a88-a066fb38b521-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "755ca463-8c62-402c-8a88-a066fb38b521" (UID: "755ca463-8c62-402c-8a88-a066fb38b521"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.432446 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qdngd\" (UniqueName: \"kubernetes.io/projected/755ca463-8c62-402c-8a88-a066fb38b521-kube-api-access-qdngd\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.432475 4962 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/755ca463-8c62-402c-8a88-a066fb38b521-openstack-config\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.422991 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.440968 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="ce62af15-166f-4f74-a244-2de5147a4b2f" containerName="nova-cell1-conductor-conductor" containerID="cri-o://2c8825e8a9845de45acba0c5ed58a1b7ada6575701e9497362444d09cc2e5592" gracePeriod=30 Feb 20 10:18:12 crc kubenswrapper[4962]: E0220 10:18:12.491168 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="24e611c94f3db833be2f4d2218a68d358affbfa3d1fc3a15c508caceb7974666" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 20 10:18:12 crc kubenswrapper[4962]: E0220 10:18:12.502718 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="24e611c94f3db833be2f4d2218a68d358affbfa3d1fc3a15c508caceb7974666" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 20 10:18:12 crc kubenswrapper[4962]: E0220 10:18:12.506292 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="24e611c94f3db833be2f4d2218a68d358affbfa3d1fc3a15c508caceb7974666" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 20 10:18:12 crc kubenswrapper[4962]: E0220 10:18:12.506464 4962 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="dcd02115-2eb9-4090-8225-108c3a8cad20" containerName="nova-scheduler-scheduler" Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.509921 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/755ca463-8c62-402c-8a88-a066fb38b521-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "755ca463-8c62-402c-8a88-a066fb38b521" (UID: "755ca463-8c62-402c-8a88-a066fb38b521"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.513824 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e4f70a2-b8ae-48cc-a098-5642fad8b040-kube-api-access-nss8l" (OuterVolumeSpecName: "kube-api-access-nss8l") pod "2e4f70a2-b8ae-48cc-a098-5642fad8b040" (UID: "2e4f70a2-b8ae-48cc-a098-5642fad8b040"). InnerVolumeSpecName "kube-api-access-nss8l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.544199 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/755ca463-8c62-402c-8a88-a066fb38b521-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.544238 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nss8l\" (UniqueName: \"kubernetes.io/projected/2e4f70a2-b8ae-48cc-a098-5642fad8b040-kube-api-access-nss8l\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.549017 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-ct4qz"] Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.562578 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-ct4qz"] Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.567397 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-wbq67"] Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.577098 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-wbq67"] Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.587229 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.587496 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="815f0ef8-a30a-4467-bb56-ff8499a4be44" containerName="nova-cell0-conductor-conductor" containerID="cri-o://5986cb792b03a6e15f31fe7f4e91ccaa3ff2a4c360820798809c00e91587dc69" gracePeriod=30 Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.614799 4962 generic.go:334] "Generic (PLEG): container finished" podID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerID="3c297c5e3426f0b38076ba12a36de8e42599c1ec9b371d1d4ac3dc87d286fdac" exitCode=0 Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.614831 4962 generic.go:334] "Generic (PLEG): container finished" podID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerID="6460038d74df47b4bd5e8f877737b675fdcc51257f17732080e42ee0a1e7dfa6" exitCode=0 Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.614839 4962 generic.go:334] "Generic (PLEG): container finished" podID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerID="3108da3bf591571013cc25e1b8f1de0c827e10b04d9686bc5e1fb47bc9778731" exitCode=0 Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.614847 4962 generic.go:334] "Generic (PLEG): container finished" podID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerID="3f89270dd151567356dcd4569c268792d8ce043f1e81df07ebe5f55f65531bca" exitCode=0 Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.614855 4962 generic.go:334] "Generic (PLEG): container finished" podID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerID="05aae6f36e27022f7b4fa526f1265b47aeb3c166ab95c682c5b8f4ac82205eff" exitCode=0 Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.614862 4962 generic.go:334] "Generic (PLEG): container finished" podID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerID="b6ead0e1bdda64a7399139dd6191cc696b570349bf204a2ab46ce0d182cc49a9" exitCode=0 Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.614869 4962 generic.go:334] "Generic (PLEG): container finished" podID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" 
containerID="87c786369d8da7650fca3be3c67f9a8decb0d8fd88429ab357e31f9e7c19f3e0" exitCode=0 Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.614877 4962 generic.go:334] "Generic (PLEG): container finished" podID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerID="4bc06842128d6fdcb6b37354d4c5aad1c3642acbd05e513b28a95e6f19bab1ca" exitCode=0 Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.614883 4962 generic.go:334] "Generic (PLEG): container finished" podID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerID="8395eb871539c46360c6d66fb96850aeed91819306e7873acf83b98b89a956d8" exitCode=0 Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.614890 4962 generic.go:334] "Generic (PLEG): container finished" podID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerID="6727a65f145335bf540a7898aeabecb549d8d22b6c9a1c79a91620a5e8e3e3f8" exitCode=0 Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.614897 4962 generic.go:334] "Generic (PLEG): container finished" podID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerID="1b0e56a8482d960b0917a1f3004c6a015099a8313a0f5c4fbb4d166f9d4ea11c" exitCode=0 Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.614903 4962 generic.go:334] "Generic (PLEG): container finished" podID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerID="5d9d68ccd50ca26ce3191d56dc735011eb169a68e6eedc3144c97564be0ff601" exitCode=0 Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.614909 4962 generic.go:334] "Generic (PLEG): container finished" podID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerID="066dce8eb5ee2a5ee4696fbdc5642875edc121ec4465ea32468ecf8aba5fbe36" exitCode=0 Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.614916 4962 generic.go:334] "Generic (PLEG): container finished" podID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerID="138e05b5e05f4d5ae28d62c69c931e5b6907fd9792450f37e652add9de1e83a1" exitCode=0 Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.614963 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f4fb3b99-0e02-4c5c-9704-884ea3f0605d","Type":"ContainerDied","Data":"3c297c5e3426f0b38076ba12a36de8e42599c1ec9b371d1d4ac3dc87d286fdac"} Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.614990 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f4fb3b99-0e02-4c5c-9704-884ea3f0605d","Type":"ContainerDied","Data":"6460038d74df47b4bd5e8f877737b675fdcc51257f17732080e42ee0a1e7dfa6"} Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.615002 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f4fb3b99-0e02-4c5c-9704-884ea3f0605d","Type":"ContainerDied","Data":"3108da3bf591571013cc25e1b8f1de0c827e10b04d9686bc5e1fb47bc9778731"} Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.615011 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f4fb3b99-0e02-4c5c-9704-884ea3f0605d","Type":"ContainerDied","Data":"3f89270dd151567356dcd4569c268792d8ce043f1e81df07ebe5f55f65531bca"} Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.615020 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f4fb3b99-0e02-4c5c-9704-884ea3f0605d","Type":"ContainerDied","Data":"05aae6f36e27022f7b4fa526f1265b47aeb3c166ab95c682c5b8f4ac82205eff"} Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.615028 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"f4fb3b99-0e02-4c5c-9704-884ea3f0605d","Type":"ContainerDied","Data":"b6ead0e1bdda64a7399139dd6191cc696b570349bf204a2ab46ce0d182cc49a9"} Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.615036 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f4fb3b99-0e02-4c5c-9704-884ea3f0605d","Type":"ContainerDied","Data":"87c786369d8da7650fca3be3c67f9a8decb0d8fd88429ab357e31f9e7c19f3e0"} Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.615045 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f4fb3b99-0e02-4c5c-9704-884ea3f0605d","Type":"ContainerDied","Data":"4bc06842128d6fdcb6b37354d4c5aad1c3642acbd05e513b28a95e6f19bab1ca"} Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.615053 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f4fb3b99-0e02-4c5c-9704-884ea3f0605d","Type":"ContainerDied","Data":"8395eb871539c46360c6d66fb96850aeed91819306e7873acf83b98b89a956d8"} Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.615062 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f4fb3b99-0e02-4c5c-9704-884ea3f0605d","Type":"ContainerDied","Data":"6727a65f145335bf540a7898aeabecb549d8d22b6c9a1c79a91620a5e8e3e3f8"} Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.615070 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f4fb3b99-0e02-4c5c-9704-884ea3f0605d","Type":"ContainerDied","Data":"1b0e56a8482d960b0917a1f3004c6a015099a8313a0f5c4fbb4d166f9d4ea11c"} Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.615080 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f4fb3b99-0e02-4c5c-9704-884ea3f0605d","Type":"ContainerDied","Data":"5d9d68ccd50ca26ce3191d56dc735011eb169a68e6eedc3144c97564be0ff601"} Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.615089 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f4fb3b99-0e02-4c5c-9704-884ea3f0605d","Type":"ContainerDied","Data":"066dce8eb5ee2a5ee4696fbdc5642875edc121ec4465ea32468ecf8aba5fbe36"} Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.615100 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f4fb3b99-0e02-4c5c-9704-884ea3f0605d","Type":"ContainerDied","Data":"138e05b5e05f4d5ae28d62c69c931e5b6907fd9792450f37e652add9de1e83a1"} Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.616899 4962 generic.go:334] "Generic (PLEG): container finished" podID="ba9a9d46-9ba9-428c-8864-a8db8bca2b57" containerID="2063db6c0681c99c5af22bd280759565fe6f153460080e6e822a7af9e9e7ff12" exitCode=143 Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.616944 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ba9a9d46-9ba9-428c-8864-a8db8bca2b57","Type":"ContainerDied","Data":"2063db6c0681c99c5af22bd280759565fe6f153460080e6e822a7af9e9e7ff12"} Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.618067 4962 generic.go:334] "Generic (PLEG): container finished" podID="28437fcd-377a-4b9e-9a28-e01c21e2ad1f" containerID="6cbbafaf6ad06d0f58cf79b2da64a294b16c2b2e6931344860d8ecda539fe7b2" exitCode=143 Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.618101 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-569d5979d6-xzr2q" 
event={"ID":"28437fcd-377a-4b9e-9a28-e01c21e2ad1f","Type":"ContainerDied","Data":"6cbbafaf6ad06d0f58cf79b2da64a294b16c2b2e6931344860d8ecda539fe7b2"} Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.663159 4962 generic.go:334] "Generic (PLEG): container finished" podID="8e8425d5-32be-4726-915a-3de5c70f0f62" containerID="0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2" exitCode=0 Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.663281 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-r7g9h" event={"ID":"8e8425d5-32be-4726-915a-3de5c70f0f62","Type":"ContainerDied","Data":"0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2"} Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.710879 4962 generic.go:334] "Generic (PLEG): container finished" podID="10c1a487-1a74-4994-9b39-f05cbe0fa5c7" containerID="a294bb381baa23da8817ec86f599da2c728e47f08a22b5ce88cb75ec5dd531c2" exitCode=143 Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.710988 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-84464996cb-fhnvz" event={"ID":"10c1a487-1a74-4994-9b39-f05cbe0fa5c7","Type":"ContainerDied","Data":"a294bb381baa23da8817ec86f599da2c728e47f08a22b5ce88cb75ec5dd531c2"} Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.728870 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/755ca463-8c62-402c-8a88-a066fb38b521-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "755ca463-8c62-402c-8a88-a066fb38b521" (UID: "755ca463-8c62-402c-8a88-a066fb38b521"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.730176 4962 generic.go:334] "Generic (PLEG): container finished" podID="c90d5126-d89a-42e6-9b7d-bfc53475bc56" containerID="c0eb68155798173ab5bc0e3d87fda35f3734305779104c33299016d17b9b3def" exitCode=0 Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.730232 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c90d5126-d89a-42e6-9b7d-bfc53475bc56","Type":"ContainerDied","Data":"c0eb68155798173ab5bc0e3d87fda35f3734305779104c33299016d17b9b3def"} Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.731672 4962 generic.go:334] "Generic (PLEG): container finished" podID="a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3" containerID="731c2e1dae94781e12c80ac05ffd0b3634529739ec574c2b3459d53ff4dd175f" exitCode=0 Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.731709 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5dfd6b5f7f-dkfsl" event={"ID":"a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3","Type":"ContainerDied","Data":"731c2e1dae94781e12c80ac05ffd0b3634529739ec574c2b3459d53ff4dd175f"} Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.733248 4962 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/755ca463-8c62-402c-8a88-a066fb38b521-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.752286 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58f6456c9f-hl7mw" event={"ID":"2e4f70a2-b8ae-48cc-a098-5642fad8b040","Type":"ContainerDied","Data":"43990fb41b7e9e93f7abc5c81e14ddd0fcd4df0bd08b0a99fd55dd59749b0c05"} Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.752344 4962 scope.go:117] 
"RemoveContainer" containerID="337cc3322a86ac4051b60ce8c7418dd0f1ccf4eafea40f3e9c75cc1f12e67b28" Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.752476 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58f6456c9f-hl7mw" Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.789241 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e4f70a2-b8ae-48cc-a098-5642fad8b040-config" (OuterVolumeSpecName: "config") pod "2e4f70a2-b8ae-48cc-a098-5642fad8b040" (UID: "2e4f70a2-b8ae-48cc-a098-5642fad8b040"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.789426 4962 generic.go:334] "Generic (PLEG): container finished" podID="241dc417-3176-4051-ad4e-d98f4f66ddc2" containerID="d6bf8640027e8b75225f36e2b4a5d790818a0e4259c4c5012d627c79a493efb3" exitCode=143 Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.789489 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"241dc417-3176-4051-ad4e-d98f4f66ddc2","Type":"ContainerDied","Data":"d6bf8640027e8b75225f36e2b4a5d790818a0e4259c4c5012d627c79a493efb3"} Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.821465 4962 generic.go:334] "Generic (PLEG): container finished" podID="89dbdc4c-bf31-402e-b5bf-e8bbb8c16172" containerID="aa52f40e409ac825205d183f70f7cf56df81e106f777a2fe46a3166fb938361b" exitCode=143 Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.821573 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172","Type":"ContainerDied","Data":"aa52f40e409ac825205d183f70f7cf56df81e106f777a2fe46a3166fb938361b"} Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.838205 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e4f70a2-b8ae-48cc-a098-5642fad8b040-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "2e4f70a2-b8ae-48cc-a098-5642fad8b040" (UID: "2e4f70a2-b8ae-48cc-a098-5642fad8b040"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.840318 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e4f70a2-b8ae-48cc-a098-5642fad8b040-config\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.840332 4962 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2e4f70a2-b8ae-48cc-a098-5642fad8b040-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.846522 4962 generic.go:334] "Generic (PLEG): container finished" podID="4f4a409a-4230-42ca-bfcc-f014064cbc6c" containerID="c5f1f67dc9e07d9eeb3bb7bd374b8b7f7c3676f58bea6766635a8f614df5e26b" exitCode=143 Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.846579 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4f4a409a-4230-42ca-bfcc-f014064cbc6c","Type":"ContainerDied","Data":"c5f1f67dc9e07d9eeb3bb7bd374b8b7f7c3676f58bea6766635a8f614df5e26b"} Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.855818 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e4f70a2-b8ae-48cc-a098-5642fad8b040-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2e4f70a2-b8ae-48cc-a098-5642fad8b040" (UID: "2e4f70a2-b8ae-48cc-a098-5642fad8b040"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.865011 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e4f70a2-b8ae-48cc-a098-5642fad8b040-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2e4f70a2-b8ae-48cc-a098-5642fad8b040" (UID: "2e4f70a2-b8ae-48cc-a098-5642fad8b040"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.879082 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_719faf26-7700-4eff-9dca-0a4ec3c51344/ovsdbserver-sb/0.log" Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.879371 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.881916 4962 generic.go:334] "Generic (PLEG): container finished" podID="801fa82d-0f57-4af2-9eec-b6cddac658ab" containerID="b7ff4938197d4ffeb1d0dead4cb76392b4c2fbfcd796b8766f3dbd1e8efbaf48" exitCode=0 Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.882024 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"801fa82d-0f57-4af2-9eec-b6cddac658ab","Type":"ContainerDied","Data":"b7ff4938197d4ffeb1d0dead4cb76392b4c2fbfcd796b8766f3dbd1e8efbaf48"} Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.900612 4962 scope.go:117] "RemoveContainer" containerID="b6772b9162a6a32cfbe3b48349f45c3e39e34e153494f4b09b124b0a0f86db0c" Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.901318 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.909243 4962 generic.go:334] "Generic (PLEG): container finished" podID="7f35bada-015d-4051-9976-d5dfe3a93216" containerID="1a117c325a572e0a4fee70e6f72cca84b0d93bdf09ce042ac50994ca64fd3520" exitCode=143 Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.909329 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6b8479d945-8wsh9" event={"ID":"7f35bada-015d-4051-9976-d5dfe3a93216","Type":"ContainerDied","Data":"1a117c325a572e0a4fee70e6f72cca84b0d93bdf09ce042ac50994ca64fd3520"} Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.912260 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-k7csj" Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.912400 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-k7csj" event={"ID":"88c21489-524e-4ee7-a340-5be2573af161","Type":"ContainerDied","Data":"f6763fa902e28879cc4359d1b1acc4ff238f733e4bd8236ae411565bdfb3ac57"} Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.915353 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e4f70a2-b8ae-48cc-a098-5642fad8b040-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2e4f70a2-b8ae-48cc-a098-5642fad8b040" (UID: "2e4f70a2-b8ae-48cc-a098-5642fad8b040"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.935679 4962 generic.go:334] "Generic (PLEG): container finished" podID="ca793428-98ed-4f82-aa57-31d6671d546c" containerID="fdc035dec22a8cb1cbe15ddbb643e583e6ad19e8deec930029ff3031763b1c89" exitCode=143 Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.935809 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ca793428-98ed-4f82-aa57-31d6671d546c","Type":"ContainerDied","Data":"fdc035dec22a8cb1cbe15ddbb643e583e6ad19e8deec930029ff3031763b1c89"} Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.946773 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-c46d-account-create-update-gfqts"] Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.947804 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-wj9f6" Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.948070 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/719faf26-7700-4eff-9dca-0a4ec3c51344-config\") pod \"719faf26-7700-4eff-9dca-0a4ec3c51344\" (UID: \"719faf26-7700-4eff-9dca-0a4ec3c51344\") " Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.948142 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/719faf26-7700-4eff-9dca-0a4ec3c51344-scripts\") pod \"719faf26-7700-4eff-9dca-0a4ec3c51344\" (UID: \"719faf26-7700-4eff-9dca-0a4ec3c51344\") " Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.948306 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"719faf26-7700-4eff-9dca-0a4ec3c51344\" (UID: \"719faf26-7700-4eff-9dca-0a4ec3c51344\") " Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.949665 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/719faf26-7700-4eff-9dca-0a4ec3c51344-combined-ca-bundle\") pod \"719faf26-7700-4eff-9dca-0a4ec3c51344\" (UID: \"719faf26-7700-4eff-9dca-0a4ec3c51344\") " Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.949714 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/719faf26-7700-4eff-9dca-0a4ec3c51344-metrics-certs-tls-certs\") pod \"719faf26-7700-4eff-9dca-0a4ec3c51344\" (UID: \"719faf26-7700-4eff-9dca-0a4ec3c51344\") " Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.949745 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/719faf26-7700-4eff-9dca-0a4ec3c51344-ovsdbserver-sb-tls-certs\") pod \"719faf26-7700-4eff-9dca-0a4ec3c51344\" (UID: \"719faf26-7700-4eff-9dca-0a4ec3c51344\") " Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.949939 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qsrg9\" (UniqueName: \"kubernetes.io/projected/719faf26-7700-4eff-9dca-0a4ec3c51344-kube-api-access-qsrg9\") pod \"719faf26-7700-4eff-9dca-0a4ec3c51344\" (UID: \"719faf26-7700-4eff-9dca-0a4ec3c51344\") " Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.949973 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/719faf26-7700-4eff-9dca-0a4ec3c51344-ovsdb-rundir\") pod \"719faf26-7700-4eff-9dca-0a4ec3c51344\" (UID: \"719faf26-7700-4eff-9dca-0a4ec3c51344\") " Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.951150 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/719faf26-7700-4eff-9dca-0a4ec3c51344-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "719faf26-7700-4eff-9dca-0a4ec3c51344" (UID: "719faf26-7700-4eff-9dca-0a4ec3c51344"). InnerVolumeSpecName "ovsdb-rundir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.948536 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-wj9f6" event={"ID":"383d4f1e-72b3-48ce-9427-0361c19e41fc","Type":"ContainerDied","Data":"1ed5bd754fe42b78759f03224b6a39f1b92d8d484574e9a6557ab622debe2a23"} Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.953972 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2e4f70a2-b8ae-48cc-a098-5642fad8b040-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.955719 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2e4f70a2-b8ae-48cc-a098-5642fad8b040-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.955755 4962 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2e4f70a2-b8ae-48cc-a098-5642fad8b040-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.958539 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/719faf26-7700-4eff-9dca-0a4ec3c51344-scripts" (OuterVolumeSpecName: "scripts") pod "719faf26-7700-4eff-9dca-0a4ec3c51344" (UID: "719faf26-7700-4eff-9dca-0a4ec3c51344"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.969567 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/719faf26-7700-4eff-9dca-0a4ec3c51344-config" (OuterVolumeSpecName: "config") pod "719faf26-7700-4eff-9dca-0a4ec3c51344" (UID: "719faf26-7700-4eff-9dca-0a4ec3c51344"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.972519 4962 scope.go:117] "RemoveContainer" containerID="c9e1c05611f8961e024087e0e04491e46e765acba8a5cc8a2a36a27876de28c3" Feb 20 10:18:12 crc kubenswrapper[4962]: E0220 10:18:12.973998 4962 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 20 10:18:12 crc kubenswrapper[4962]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:0f7943e02fbdd3daec1d3db72fa9396bf37ad3fdd6b0f3119c90e29629e095ed,Command:[/bin/sh -c #!/bin/bash Feb 20 10:18:12 crc kubenswrapper[4962]: Feb 20 10:18:12 crc kubenswrapper[4962]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Feb 20 10:18:12 crc kubenswrapper[4962]: Feb 20 10:18:12 crc kubenswrapper[4962]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Feb 20 10:18:12 crc kubenswrapper[4962]: Feb 20 10:18:12 crc kubenswrapper[4962]: MYSQL_CMD="mysql -h -u root -P 3306" Feb 20 10:18:12 crc kubenswrapper[4962]: Feb 20 10:18:12 crc kubenswrapper[4962]: if [ -n "barbican" ]; then Feb 20 10:18:12 crc kubenswrapper[4962]: GRANT_DATABASE="barbican" Feb 20 10:18:12 crc kubenswrapper[4962]: else Feb 20 10:18:12 crc kubenswrapper[4962]: GRANT_DATABASE="*" Feb 20 10:18:12 crc kubenswrapper[4962]: fi Feb 20 10:18:12 crc kubenswrapper[4962]: Feb 20 10:18:12 crc kubenswrapper[4962]: # going for maximum compatibility here: Feb 20 10:18:12 crc kubenswrapper[4962]: # 1. 
MySQL 8 no longer allows implicit create user when GRANT is used Feb 20 10:18:12 crc kubenswrapper[4962]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Feb 20 10:18:12 crc kubenswrapper[4962]: # 3. create user with CREATE but then do all password and TLS with ALTER to Feb 20 10:18:12 crc kubenswrapper[4962]: # support updates Feb 20 10:18:12 crc kubenswrapper[4962]: Feb 20 10:18:12 crc kubenswrapper[4962]: $MYSQL_CMD < logger="UnhandledError" Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.974285 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 20 10:18:12 crc kubenswrapper[4962]: E0220 10:18:12.977011 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"barbican-db-secret\\\" not found\"" pod="openstack/barbican-c46d-account-create-update-gfqts" podUID="cca18a27-31bc-440b-a4a9-517b3323bb91" Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.991861 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/719faf26-7700-4eff-9dca-0a4ec3c51344-kube-api-access-qsrg9" (OuterVolumeSpecName: "kube-api-access-qsrg9") pod "719faf26-7700-4eff-9dca-0a4ec3c51344" (UID: "719faf26-7700-4eff-9dca-0a4ec3c51344"). InnerVolumeSpecName "kube-api-access-qsrg9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:18:12 crc kubenswrapper[4962]: I0220 10:18:12.992052 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "719faf26-7700-4eff-9dca-0a4ec3c51344" (UID: "719faf26-7700-4eff-9dca-0a4ec3c51344"). InnerVolumeSpecName "local-storage05-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.040010 4962 scope.go:117] "RemoveContainer" containerID="d6952143bea0c9abcddc4768b2bd10fcf02f0a555e5cd8d1c565a371744060b8" Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.061826 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/801fa82d-0f57-4af2-9eec-b6cddac658ab-metrics-certs-tls-certs\") pod \"801fa82d-0f57-4af2-9eec-b6cddac658ab\" (UID: \"801fa82d-0f57-4af2-9eec-b6cddac658ab\") " Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.062649 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/801fa82d-0f57-4af2-9eec-b6cddac658ab-config\") pod \"801fa82d-0f57-4af2-9eec-b6cddac658ab\" (UID: \"801fa82d-0f57-4af2-9eec-b6cddac658ab\") " Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.062704 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/801fa82d-0f57-4af2-9eec-b6cddac658ab-combined-ca-bundle\") pod \"801fa82d-0f57-4af2-9eec-b6cddac658ab\" (UID: \"801fa82d-0f57-4af2-9eec-b6cddac658ab\") " Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.062734 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/801fa82d-0f57-4af2-9eec-b6cddac658ab-ovsdbserver-nb-tls-certs\") pod \"801fa82d-0f57-4af2-9eec-b6cddac658ab\" (UID: \"801fa82d-0f57-4af2-9eec-b6cddac658ab\") " Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.062783 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bd882\" (UniqueName: \"kubernetes.io/projected/801fa82d-0f57-4af2-9eec-b6cddac658ab-kube-api-access-bd882\") pod \"801fa82d-0f57-4af2-9eec-b6cddac658ab\" (UID: \"801fa82d-0f57-4af2-9eec-b6cddac658ab\") " Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.062856 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/801fa82d-0f57-4af2-9eec-b6cddac658ab-scripts\") pod \"801fa82d-0f57-4af2-9eec-b6cddac658ab\" (UID: \"801fa82d-0f57-4af2-9eec-b6cddac658ab\") " Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.062895 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/801fa82d-0f57-4af2-9eec-b6cddac658ab-ovsdb-rundir\") pod \"801fa82d-0f57-4af2-9eec-b6cddac658ab\" (UID: \"801fa82d-0f57-4af2-9eec-b6cddac658ab\") " Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.062955 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"801fa82d-0f57-4af2-9eec-b6cddac658ab\" (UID: \"801fa82d-0f57-4af2-9eec-b6cddac658ab\") " Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.063700 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qsrg9\" (UniqueName: \"kubernetes.io/projected/719faf26-7700-4eff-9dca-0a4ec3c51344-kube-api-access-qsrg9\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.063717 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/719faf26-7700-4eff-9dca-0a4ec3c51344-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.063727 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/719faf26-7700-4eff-9dca-0a4ec3c51344-config\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.063737 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/719faf26-7700-4eff-9dca-0a4ec3c51344-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.063757 4962 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.074632 4962 scope.go:117] "RemoveContainer" containerID="58314faa8bcfe5f5f7afbcc99e392370d5f2737c5567814db10eda41512d6621" Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.084836 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/801fa82d-0f57-4af2-9eec-b6cddac658ab-scripts" (OuterVolumeSpecName: "scripts") pod "801fa82d-0f57-4af2-9eec-b6cddac658ab" (UID: "801fa82d-0f57-4af2-9eec-b6cddac658ab"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.085433 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/801fa82d-0f57-4af2-9eec-b6cddac658ab-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "801fa82d-0f57-4af2-9eec-b6cddac658ab" (UID: "801fa82d-0f57-4af2-9eec-b6cddac658ab"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.095667 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-wj9f6"] Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.085519 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/801fa82d-0f57-4af2-9eec-b6cddac658ab-config" (OuterVolumeSpecName: "config") pod "801fa82d-0f57-4af2-9eec-b6cddac658ab" (UID: "801fa82d-0f57-4af2-9eec-b6cddac658ab"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.112063 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-wj9f6"] Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.115399 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-k7csj"] Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.132286 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/801fa82d-0f57-4af2-9eec-b6cddac658ab-kube-api-access-bd882" (OuterVolumeSpecName: "kube-api-access-bd882") pod "801fa82d-0f57-4af2-9eec-b6cddac658ab" (UID: "801fa82d-0f57-4af2-9eec-b6cddac658ab"). InnerVolumeSpecName "kube-api-access-bd882". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.166587 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bd882\" (UniqueName: \"kubernetes.io/projected/801fa82d-0f57-4af2-9eec-b6cddac658ab-kube-api-access-bd882\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.166629 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/801fa82d-0f57-4af2-9eec-b6cddac658ab-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.166639 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/801fa82d-0f57-4af2-9eec-b6cddac658ab-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.166805 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/801fa82d-0f57-4af2-9eec-b6cddac658ab-config\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:13 crc kubenswrapper[4962]: E0220 10:18:13.166905 4962 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Feb 20 10:18:13 crc kubenswrapper[4962]: E0220 10:18:13.166985 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2a8d652d-aea8-4a83-b33e-0d2522af0be8-config-data podName:2a8d652d-aea8-4a83-b33e-0d2522af0be8 nodeName:}" failed. No retries permitted until 2026-02-20 10:18:15.166963816 +0000 UTC m=+1386.749435662 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/2a8d652d-aea8-4a83-b33e-0d2522af0be8-config-data") pod "rabbitmq-server-0" (UID: "2a8d652d-aea8-4a83-b33e-0d2522af0be8") : configmap "rabbitmq-config-data" not found Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.174888 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "801fa82d-0f57-4af2-9eec-b6cddac658ab" (UID: "801fa82d-0f57-4af2-9eec-b6cddac658ab"). InnerVolumeSpecName "local-storage06-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.191277 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0275d40a-1206-4eb2-96c8-6c516c57bed7" path="/var/lib/kubelet/pods/0275d40a-1206-4eb2-96c8-6c516c57bed7/volumes" Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.191935 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="032f830f-9636-4783-a048-00f9b7b22a3a" path="/var/lib/kubelet/pods/032f830f-9636-4783-a048-00f9b7b22a3a/volumes" Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.192939 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d" path="/var/lib/kubelet/pods/05fbddf6-ff08-4125-bfea-1a3e2a4b8a5d/volumes" Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.201860 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f853840-0af1-40ee-b11b-a0a62f9f4ebf" path="/var/lib/kubelet/pods/1f853840-0af1-40ee-b11b-a0a62f9f4ebf/volumes" Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.202521 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20663c25-09a7-4a31-9994-450f507d4ff1" path="/var/lib/kubelet/pods/20663c25-09a7-4a31-9994-450f507d4ff1/volumes" Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.203117 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28bfacb3-7247-41ad-bf30-47c81427487b" path="/var/lib/kubelet/pods/28bfacb3-7247-41ad-bf30-47c81427487b/volumes" Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.204248 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b915fcc-cf15-43c3-97c6-bde3a29da796" path="/var/lib/kubelet/pods/2b915fcc-cf15-43c3-97c6-bde3a29da796/volumes" Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.204788 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="383d4f1e-72b3-48ce-9427-0361c19e41fc" path="/var/lib/kubelet/pods/383d4f1e-72b3-48ce-9427-0361c19e41fc/volumes" Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.205421 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39a7b81e-d4af-478f-b2c3-d21f117ad7ec" path="/var/lib/kubelet/pods/39a7b81e-d4af-478f-b2c3-d21f117ad7ec/volumes" Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.206433 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="684fc9d7-94f0-418a-b059-e5519e6cd316" path="/var/lib/kubelet/pods/684fc9d7-94f0-418a-b059-e5519e6cd316/volumes" Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.207094 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="755ca463-8c62-402c-8a88-a066fb38b521" path="/var/lib/kubelet/pods/755ca463-8c62-402c-8a88-a066fb38b521/volumes" Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.207752 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79394db3-1fa2-4b8f-927a-1cf8085f1df4" path="/var/lib/kubelet/pods/79394db3-1fa2-4b8f-927a-1cf8085f1df4/volumes" Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.208565 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c7420bd-d4ef-4511-acf4-a132ad0a5677" path="/var/lib/kubelet/pods/7c7420bd-d4ef-4511-acf4-a132ad0a5677/volumes" Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.211144 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7da93993-8b14-45f6-8d0b-8366becc762e" 
path="/var/lib/kubelet/pods/7da93993-8b14-45f6-8d0b-8366becc762e/volumes" Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.211857 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a30d4c4f-6cfe-4f24-9ee1-d2285edfad3e" path="/var/lib/kubelet/pods/a30d4c4f-6cfe-4f24-9ee1-d2285edfad3e/volumes" Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.212566 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e761565e-55de-43bc-b82d-95b776652b5c" path="/var/lib/kubelet/pods/e761565e-55de-43bc-b82d-95b776652b5c/volumes" Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.269105 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-metrics-k7csj"] Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.269195 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58f6456c9f-hl7mw"] Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.269213 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-58f6456c9f-hl7mw"] Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.272219 4962 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.321721 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="2a8d652d-aea8-4a83-b33e-0d2522af0be8" containerName="rabbitmq" containerID="cri-o://f6ebb23a577e121e067e03133802f0cd7183161a54f98c2902a217045cadf308" gracePeriod=604800 Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.353102 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/719faf26-7700-4eff-9dca-0a4ec3c51344-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "719faf26-7700-4eff-9dca-0a4ec3c51344" (UID: "719faf26-7700-4eff-9dca-0a4ec3c51344"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.385371 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/719faf26-7700-4eff-9dca-0a4ec3c51344-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.416722 4962 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.491311 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/801fa82d-0f57-4af2-9eec-b6cddac658ab-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "801fa82d-0f57-4af2-9eec-b6cddac658ab" (UID: "801fa82d-0f57-4af2-9eec-b6cddac658ab"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.492992 4962 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.493023 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/801fa82d-0f57-4af2-9eec-b6cddac658ab-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:13 crc kubenswrapper[4962]: E0220 10:18:13.493094 4962 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Feb 20 10:18:13 crc kubenswrapper[4962]: E0220 10:18:13.493146 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/56a77dd3-ef10-46a6-a00d-ab38af0d4338-config-data podName:56a77dd3-ef10-46a6-a00d-ab38af0d4338 nodeName:}" failed. No retries permitted until 2026-02-20 10:18:17.493128328 +0000 UTC m=+1389.075600174 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/56a77dd3-ef10-46a6-a00d-ab38af0d4338-config-data") pod "rabbitmq-cell1-server-0" (UID: "56a77dd3-ef10-46a6-a00d-ab38af0d4338") : configmap "rabbitmq-cell1-config-data" not found Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.530939 4962 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.540896 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/801fa82d-0f57-4af2-9eec-b6cddac658ab-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "801fa82d-0f57-4af2-9eec-b6cddac658ab" (UID: "801fa82d-0f57-4af2-9eec-b6cddac658ab"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.572782 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/719faf26-7700-4eff-9dca-0a4ec3c51344-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "719faf26-7700-4eff-9dca-0a4ec3c51344" (UID: "719faf26-7700-4eff-9dca-0a4ec3c51344"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.596211 4962 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.596255 4962 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/801fa82d-0f57-4af2-9eec-b6cddac658ab-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.596267 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/719faf26-7700-4eff-9dca-0a4ec3c51344-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.642852 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/801fa82d-0f57-4af2-9eec-b6cddac658ab-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "801fa82d-0f57-4af2-9eec-b6cddac658ab" (UID: "801fa82d-0f57-4af2-9eec-b6cddac658ab"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.693727 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/719faf26-7700-4eff-9dca-0a4ec3c51344-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "719faf26-7700-4eff-9dca-0a4ec3c51344" (UID: "719faf26-7700-4eff-9dca-0a4ec3c51344"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.721110 4962 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/719faf26-7700-4eff-9dca-0a4ec3c51344-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.721139 4962 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/801fa82d-0f57-4af2-9eec-b6cddac658ab-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:13 crc kubenswrapper[4962]: E0220 10:18:13.865154 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="095ea16654e1756b3ffb7fcf3eb9dc6ba35b4333c92bf90d3619d8cb9c0062fe" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Feb 20 10:18:13 crc kubenswrapper[4962]: E0220 10:18:13.880611 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="095ea16654e1756b3ffb7fcf3eb9dc6ba35b4333c92bf90d3619d8cb9c0062fe" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.888123 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-5b685f5b9-4db6w"] Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.888417 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-5b685f5b9-4db6w" podUID="559addbd-1bc6-4146-9a27-ce3e1d3d08fd" 
containerName="proxy-httpd" containerID="cri-o://68a406d2a6eadc4116c120af687c887ef22a20b066ec54d2d6991bd97aaef0e9" gracePeriod=30 Feb 20 10:18:13 crc kubenswrapper[4962]: I0220 10:18:13.888908 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-5b685f5b9-4db6w" podUID="559addbd-1bc6-4146-9a27-ce3e1d3d08fd" containerName="proxy-server" containerID="cri-o://42f878706ae1a7e2114a67d56b43328ffc07645b6f77f8f9d20b6c4a2aec6632" gracePeriod=30 Feb 20 10:18:13 crc kubenswrapper[4962]: E0220 10:18:13.904809 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="095ea16654e1756b3ffb7fcf3eb9dc6ba35b4333c92bf90d3619d8cb9c0062fe" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Feb 20 10:18:13 crc kubenswrapper[4962]: E0220 10:18:13.904890 4962 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="33d73a04-08b2-4944-861f-749a63c2565d" containerName="ovn-northd" Feb 20 10:18:14 crc kubenswrapper[4962]: E0220 10:18:14.000667 4962 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod812fea74_e4e5_4550_8a20_8fe04752a016.slice/crio-93e28da688e5c7a59f6cefecb59748d6ded79a8223cb25928a3b22790ce93bd1.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f35bada_015d_4051_9976_d5dfe3a93216.slice/crio-d1cb3b1837bc14d4bc8b54604fff4b13e755f0b6500bf206c46f6f5569e5c26a.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod812fea74_e4e5_4550_8a20_8fe04752a016.slice/crio-conmon-93e28da688e5c7a59f6cefecb59748d6ded79a8223cb25928a3b22790ce93bd1.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f35bada_015d_4051_9976_d5dfe3a93216.slice/crio-conmon-d1cb3b1837bc14d4bc8b54604fff4b13e755f0b6500bf206c46f6f5569e5c26a.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8dd889b7_1b72_4e57_ad0f_85facbad8da4.slice/crio-a0c7e79c3d9e295ee82e5ea9e8238010da77018553646accab9b41ab9dfe22b6.scope\": RecentStats: unable to find data in memory cache]" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.001196 4962 generic.go:334] "Generic (PLEG): container finished" podID="7f35bada-015d-4051-9976-d5dfe3a93216" containerID="d1cb3b1837bc14d4bc8b54604fff4b13e755f0b6500bf206c46f6f5569e5c26a" exitCode=0 Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.001265 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6b8479d945-8wsh9" event={"ID":"7f35bada-015d-4051-9976-d5dfe3a93216","Type":"ContainerDied","Data":"d1cb3b1837bc14d4bc8b54604fff4b13e755f0b6500bf206c46f6f5569e5c26a"} Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.002962 4962 generic.go:334] "Generic (PLEG): container finished" podID="fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00" containerID="aa9b7812a81805d3c1d048e75378c2e89e7f075bbe36af5665b4416075da7b83" exitCode=0 Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.003012 4962 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00","Type":"ContainerDied","Data":"aa9b7812a81805d3c1d048e75378c2e89e7f075bbe36af5665b4416075da7b83"} Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.007626 4962 generic.go:334] "Generic (PLEG): container finished" podID="28437fcd-377a-4b9e-9a28-e01c21e2ad1f" containerID="5debc339fcb891cc07e7fa0a7db99fb7f297c28473a143743938f4792107d27c" exitCode=0 Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.007685 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-569d5979d6-xzr2q" event={"ID":"28437fcd-377a-4b9e-9a28-e01c21e2ad1f","Type":"ContainerDied","Data":"5debc339fcb891cc07e7fa0a7db99fb7f297c28473a143743938f4792107d27c"} Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.022529 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_719faf26-7700-4eff-9dca-0a4ec3c51344/ovsdbserver-sb/0.log" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.022647 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"719faf26-7700-4eff-9dca-0a4ec3c51344","Type":"ContainerDied","Data":"a2d2a8a63bf5c9ebd610b16b09ca46a05d03ae717f57b9ce876334d685870041"} Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.022717 4962 scope.go:117] "RemoveContainer" containerID="a2580fff2ba1ecc29418d1a47b14ce5d8459c470e24eee4d2ebced1a648dc3a8" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.022983 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.050315 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.053181 4962 generic.go:334] "Generic (PLEG): container finished" podID="812fea74-e4e5-4550-8a20-8fe04752a016" containerID="93e28da688e5c7a59f6cefecb59748d6ded79a8223cb25928a3b22790ce93bd1" exitCode=1 Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.053262 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-6f6vb" event={"ID":"812fea74-e4e5-4550-8a20-8fe04752a016","Type":"ContainerDied","Data":"93e28da688e5c7a59f6cefecb59748d6ded79a8223cb25928a3b22790ce93bd1"} Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.054124 4962 scope.go:117] "RemoveContainer" containerID="93e28da688e5c7a59f6cefecb59748d6ded79a8223cb25928a3b22790ce93bd1" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.084824 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-c46d-account-create-update-gfqts" event={"ID":"cca18a27-31bc-440b-a4a9-517b3323bb91","Type":"ContainerStarted","Data":"f6015af78b401355bf39302e0c5756af3b69a15cfa67686aab9f59e8e5466d2c"} Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.104900 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.108566 4962 generic.go:334] "Generic (PLEG): container finished" podID="c90d5126-d89a-42e6-9b7d-bfc53475bc56" containerID="f20b981aacdf6de658de3f762f39158362f94f8752f0a75fc0ae9dfa445ad0b1" exitCode=0 Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.108726 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c90d5126-d89a-42e6-9b7d-bfc53475bc56","Type":"ContainerDied","Data":"f20b981aacdf6de658de3f762f39158362f94f8752f0a75fc0ae9dfa445ad0b1"} Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.126139 4962 scope.go:117] "RemoveContainer" containerID="9a823554a8f72450a8956f74b11a494798fb5f7fc99300ed38421760066cc712" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.126437 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"801fa82d-0f57-4af2-9eec-b6cddac658ab","Type":"ContainerDied","Data":"2226a3425cb913ac33dc3114a16db2100facfc7423dff93548d53775b718e6e2"} Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.126546 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.137785 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bznfn\" (UniqueName: \"kubernetes.io/projected/fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00-kube-api-access-bznfn\") pod \"fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00\" (UID: \"fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00\") " Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.137826 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00-vencrypt-tls-certs\") pod \"fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00\" (UID: \"fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00\") " Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.137956 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00-combined-ca-bundle\") pod \"fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00\" (UID: \"fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00\") " Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.139280 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00-config-data\") pod \"fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00\" (UID: \"fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00\") " Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.139420 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00-nova-novncproxy-tls-certs\") pod \"fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00\" (UID: \"fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00\") " Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.160010 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00-kube-api-access-bznfn" (OuterVolumeSpecName: "kube-api-access-bznfn") pod "fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00" (UID: "fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00"). InnerVolumeSpecName "kube-api-access-bznfn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.162329 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.188114 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00" (UID: "fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.189754 4962 generic.go:334] "Generic (PLEG): container finished" podID="8dd889b7-1b72-4e57-ad0f-85facbad8da4" containerID="a0c7e79c3d9e295ee82e5ea9e8238010da77018553646accab9b41ab9dfe22b6" exitCode=0 Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.190234 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"8dd889b7-1b72-4e57-ad0f-85facbad8da4","Type":"ContainerDied","Data":"a0c7e79c3d9e295ee82e5ea9e8238010da77018553646accab9b41ab9dfe22b6"} Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.224128 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00-config-data" (OuterVolumeSpecName: "config-data") pod "fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00" (UID: "fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.259417 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.259474 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.259488 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bznfn\" (UniqueName: \"kubernetes.io/projected/fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00-kube-api-access-bznfn\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.267569 4962 scope.go:117] "RemoveContainer" containerID="e9de55d709a0309b4fcbcb74a44dfc77cc45f95d7066591c4a40dc2b0ceb9eed" Feb 20 10:18:14 crc kubenswrapper[4962]: E0220 10:18:14.267624 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2 is running failed: container process not found" containerID="0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 20 10:18:14 crc kubenswrapper[4962]: E0220 10:18:14.267797 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fbca6026ebd221992e1ebc24844b7bb1692f49e72896c063a823730a2cadaf38" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 20 10:18:14 crc kubenswrapper[4962]: E0220 10:18:14.271449 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fbca6026ebd221992e1ebc24844b7bb1692f49e72896c063a823730a2cadaf38" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 20 10:18:14 crc kubenswrapper[4962]: E0220 10:18:14.271660 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2 is running failed: container process not found" containerID="0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 20 10:18:14 crc kubenswrapper[4962]: E0220 10:18:14.273112 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2 is running failed: container process not found" containerID="0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 20 10:18:14 crc kubenswrapper[4962]: E0220 10:18:14.273149 4962 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-r7g9h" podUID="8e8425d5-32be-4726-915a-3de5c70f0f62" containerName="ovsdb-server" Feb 20 10:18:14 crc kubenswrapper[4962]: E0220 10:18:14.273167 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fbca6026ebd221992e1ebc24844b7bb1692f49e72896c063a823730a2cadaf38" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 20 10:18:14 crc kubenswrapper[4962]: E0220 10:18:14.273233 4962 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-r7g9h" podUID="8e8425d5-32be-4726-915a-3de5c70f0f62" containerName="ovs-vswitchd" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.310171 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.315465 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.317613 4962 scope.go:117] "RemoveContainer" containerID="b7ff4938197d4ffeb1d0dead4cb76392b4c2fbfcd796b8766f3dbd1e8efbaf48" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.318579 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-569d5979d6-xzr2q" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.332517 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.358582 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00" (UID: "fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00"). InnerVolumeSpecName "vencrypt-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.386382 4962 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.429390 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00" (UID: "fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00"). InnerVolumeSpecName "nova-novncproxy-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.440606 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/placement-687f4cff74-gmh4w" podUID="4a879cb3-19b4-4767-8640-993cc47dc7ed" containerName="placement-log" probeResult="failure" output="Get \"https://10.217.0.168:8778/\": read tcp 10.217.0.2:35648->10.217.0.168:8778: read: connection reset by peer" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.440606 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/placement-687f4cff74-gmh4w" podUID="4a879cb3-19b4-4767-8640-993cc47dc7ed" containerName="placement-api" probeResult="failure" output="Get \"https://10.217.0.168:8778/\": read tcp 10.217.0.2:35640->10.217.0.168:8778: read: connection reset by peer" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.488170 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28437fcd-377a-4b9e-9a28-e01c21e2ad1f-combined-ca-bundle\") pod \"28437fcd-377a-4b9e-9a28-e01c21e2ad1f\" (UID: \"28437fcd-377a-4b9e-9a28-e01c21e2ad1f\") " Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.488361 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28437fcd-377a-4b9e-9a28-e01c21e2ad1f-config-data\") pod \"28437fcd-377a-4b9e-9a28-e01c21e2ad1f\" (UID: \"28437fcd-377a-4b9e-9a28-e01c21e2ad1f\") " Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.488424 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28437fcd-377a-4b9e-9a28-e01c21e2ad1f-logs\") pod \"28437fcd-377a-4b9e-9a28-e01c21e2ad1f\" (UID: \"28437fcd-377a-4b9e-9a28-e01c21e2ad1f\") " Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.488519 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c90d5126-d89a-42e6-9b7d-bfc53475bc56-config-data-custom\") pod \"c90d5126-d89a-42e6-9b7d-bfc53475bc56\" (UID: \"c90d5126-d89a-42e6-9b7d-bfc53475bc56\") " Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.488555 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c90d5126-d89a-42e6-9b7d-bfc53475bc56-etc-machine-id\") pod \"c90d5126-d89a-42e6-9b7d-bfc53475bc56\" (UID: \"c90d5126-d89a-42e6-9b7d-bfc53475bc56\") " Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.488643 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c90d5126-d89a-42e6-9b7d-bfc53475bc56-combined-ca-bundle\") pod \"c90d5126-d89a-42e6-9b7d-bfc53475bc56\" (UID: \"c90d5126-d89a-42e6-9b7d-bfc53475bc56\") " Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.488711 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v96c7\" (UniqueName: \"kubernetes.io/projected/28437fcd-377a-4b9e-9a28-e01c21e2ad1f-kube-api-access-v96c7\") pod \"28437fcd-377a-4b9e-9a28-e01c21e2ad1f\" (UID: \"28437fcd-377a-4b9e-9a28-e01c21e2ad1f\") " Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.488745 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c90d5126-d89a-42e6-9b7d-bfc53475bc56-scripts\") pod \"c90d5126-d89a-42e6-9b7d-bfc53475bc56\" (UID: \"c90d5126-d89a-42e6-9b7d-bfc53475bc56\") " Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.488784 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/28437fcd-377a-4b9e-9a28-e01c21e2ad1f-config-data-custom\") pod \"28437fcd-377a-4b9e-9a28-e01c21e2ad1f\" (UID: \"28437fcd-377a-4b9e-9a28-e01c21e2ad1f\") " Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.488823 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5494k\" (UniqueName: \"kubernetes.io/projected/c90d5126-d89a-42e6-9b7d-bfc53475bc56-kube-api-access-5494k\") pod \"c90d5126-d89a-42e6-9b7d-bfc53475bc56\" (UID: \"c90d5126-d89a-42e6-9b7d-bfc53475bc56\") " Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.488848 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c90d5126-d89a-42e6-9b7d-bfc53475bc56-config-data\") pod \"c90d5126-d89a-42e6-9b7d-bfc53475bc56\" (UID: \"c90d5126-d89a-42e6-9b7d-bfc53475bc56\") " Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.489294 4962 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.491985 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c90d5126-d89a-42e6-9b7d-bfc53475bc56-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "c90d5126-d89a-42e6-9b7d-bfc53475bc56" (UID: "c90d5126-d89a-42e6-9b7d-bfc53475bc56"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.492809 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28437fcd-377a-4b9e-9a28-e01c21e2ad1f-logs" (OuterVolumeSpecName: "logs") pod "28437fcd-377a-4b9e-9a28-e01c21e2ad1f" (UID: "28437fcd-377a-4b9e-9a28-e01c21e2ad1f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.521564 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c90d5126-d89a-42e6-9b7d-bfc53475bc56-scripts" (OuterVolumeSpecName: "scripts") pod "c90d5126-d89a-42e6-9b7d-bfc53475bc56" (UID: "c90d5126-d89a-42e6-9b7d-bfc53475bc56"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.523771 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c90d5126-d89a-42e6-9b7d-bfc53475bc56-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c90d5126-d89a-42e6-9b7d-bfc53475bc56" (UID: "c90d5126-d89a-42e6-9b7d-bfc53475bc56"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.530953 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28437fcd-377a-4b9e-9a28-e01c21e2ad1f-kube-api-access-v96c7" (OuterVolumeSpecName: "kube-api-access-v96c7") pod "28437fcd-377a-4b9e-9a28-e01c21e2ad1f" (UID: "28437fcd-377a-4b9e-9a28-e01c21e2ad1f"). InnerVolumeSpecName "kube-api-access-v96c7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.536809 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c90d5126-d89a-42e6-9b7d-bfc53475bc56-kube-api-access-5494k" (OuterVolumeSpecName: "kube-api-access-5494k") pod "c90d5126-d89a-42e6-9b7d-bfc53475bc56" (UID: "c90d5126-d89a-42e6-9b7d-bfc53475bc56"). InnerVolumeSpecName "kube-api-access-5494k". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.539234 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28437fcd-377a-4b9e-9a28-e01c21e2ad1f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "28437fcd-377a-4b9e-9a28-e01c21e2ad1f" (UID: "28437fcd-377a-4b9e-9a28-e01c21e2ad1f"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.576565 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="2a8d652d-aea8-4a83-b33e-0d2522af0be8" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.101:5671: connect: connection refused" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.590926 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28437fcd-377a-4b9e-9a28-e01c21e2ad1f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "28437fcd-377a-4b9e-9a28-e01c21e2ad1f" (UID: "28437fcd-377a-4b9e-9a28-e01c21e2ad1f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.593073 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28437fcd-377a-4b9e-9a28-e01c21e2ad1f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.593095 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28437fcd-377a-4b9e-9a28-e01c21e2ad1f-logs\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.593106 4962 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c90d5126-d89a-42e6-9b7d-bfc53475bc56-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.593114 4962 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c90d5126-d89a-42e6-9b7d-bfc53475bc56-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.593123 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v96c7\" (UniqueName: \"kubernetes.io/projected/28437fcd-377a-4b9e-9a28-e01c21e2ad1f-kube-api-access-v96c7\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.593132 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c90d5126-d89a-42e6-9b7d-bfc53475bc56-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.593141 4962 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/28437fcd-377a-4b9e-9a28-e01c21e2ad1f-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.593153 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5494k\" (UniqueName: \"kubernetes.io/projected/c90d5126-d89a-42e6-9b7d-bfc53475bc56-kube-api-access-5494k\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.652821 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c90d5126-d89a-42e6-9b7d-bfc53475bc56-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c90d5126-d89a-42e6-9b7d-bfc53475bc56" (UID: "c90d5126-d89a-42e6-9b7d-bfc53475bc56"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.687008 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28437fcd-377a-4b9e-9a28-e01c21e2ad1f-config-data" (OuterVolumeSpecName: "config-data") pod "28437fcd-377a-4b9e-9a28-e01c21e2ad1f" (UID: "28437fcd-377a-4b9e-9a28-e01c21e2ad1f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.695809 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c90d5126-d89a-42e6-9b7d-bfc53475bc56-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.695830 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28437fcd-377a-4b9e-9a28-e01c21e2ad1f-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.724854 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c90d5126-d89a-42e6-9b7d-bfc53475bc56-config-data" (OuterVolumeSpecName: "config-data") pod "c90d5126-d89a-42e6-9b7d-bfc53475bc56" (UID: "c90d5126-d89a-42e6-9b7d-bfc53475bc56"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.803162 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c90d5126-d89a-42e6-9b7d-bfc53475bc56-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.869542 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6b8479d945-8wsh9" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.872930 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.891890 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-c46d-account-create-update-gfqts" Feb 20 10:18:14 crc kubenswrapper[4962]: I0220 10:18:14.900866 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-687f4cff74-gmh4w" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.003803 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="89dbdc4c-bf31-402e-b5bf-e8bbb8c16172" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.165:8776/healthcheck\": read tcp 10.217.0.2:54310->10.217.0.165:8776: read: connection reset by peer" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.008294 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cca18a27-31bc-440b-a4a9-517b3323bb91-operator-scripts\") pod \"cca18a27-31bc-440b-a4a9-517b3323bb91\" (UID: \"cca18a27-31bc-440b-a4a9-517b3323bb91\") " Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.008410 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"8dd889b7-1b72-4e57-ad0f-85facbad8da4\" (UID: \"8dd889b7-1b72-4e57-ad0f-85facbad8da4\") " Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.008468 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8dd889b7-1b72-4e57-ad0f-85facbad8da4-config-data-generated\") pod \"8dd889b7-1b72-4e57-ad0f-85facbad8da4\" (UID: \"8dd889b7-1b72-4e57-ad0f-85facbad8da4\") " Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.008550 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f35bada-015d-4051-9976-d5dfe3a93216-logs\") pod \"7f35bada-015d-4051-9976-d5dfe3a93216\" (UID: \"7f35bada-015d-4051-9976-d5dfe3a93216\") " Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.008769 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a879cb3-19b4-4767-8640-993cc47dc7ed-scripts\") pod \"4a879cb3-19b4-4767-8640-993cc47dc7ed\" (UID: \"4a879cb3-19b4-4767-8640-993cc47dc7ed\") " Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.008836 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s9xjl\" (UniqueName: \"kubernetes.io/projected/4a879cb3-19b4-4767-8640-993cc47dc7ed-kube-api-access-s9xjl\") pod \"4a879cb3-19b4-4767-8640-993cc47dc7ed\" (UID: \"4a879cb3-19b4-4767-8640-993cc47dc7ed\") " Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.008880 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f35bada-015d-4051-9976-d5dfe3a93216-config-data\") pod \"7f35bada-015d-4051-9976-d5dfe3a93216\" (UID: \"7f35bada-015d-4051-9976-d5dfe3a93216\") " Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.008935 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a879cb3-19b4-4767-8640-993cc47dc7ed-config-data\") pod \"4a879cb3-19b4-4767-8640-993cc47dc7ed\" (UID: \"4a879cb3-19b4-4767-8640-993cc47dc7ed\") " Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.008967 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a879cb3-19b4-4767-8640-993cc47dc7ed-logs\") pod \"4a879cb3-19b4-4767-8640-993cc47dc7ed\" (UID: \"4a879cb3-19b4-4767-8640-993cc47dc7ed\") " 
Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.008999 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8dd889b7-1b72-4e57-ad0f-85facbad8da4-config-data-default\") pod \"8dd889b7-1b72-4e57-ad0f-85facbad8da4\" (UID: \"8dd889b7-1b72-4e57-ad0f-85facbad8da4\") " Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.009015 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f35bada-015d-4051-9976-d5dfe3a93216-combined-ca-bundle\") pod \"7f35bada-015d-4051-9976-d5dfe3a93216\" (UID: \"7f35bada-015d-4051-9976-d5dfe3a93216\") " Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.009086 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8dd889b7-1b72-4e57-ad0f-85facbad8da4-operator-scripts\") pod \"8dd889b7-1b72-4e57-ad0f-85facbad8da4\" (UID: \"8dd889b7-1b72-4e57-ad0f-85facbad8da4\") " Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.009116 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7f35bada-015d-4051-9976-d5dfe3a93216-config-data-custom\") pod \"7f35bada-015d-4051-9976-d5dfe3a93216\" (UID: \"7f35bada-015d-4051-9976-d5dfe3a93216\") " Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.009140 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a879cb3-19b4-4767-8640-993cc47dc7ed-internal-tls-certs\") pod \"4a879cb3-19b4-4767-8640-993cc47dc7ed\" (UID: \"4a879cb3-19b4-4767-8640-993cc47dc7ed\") " Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.009174 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8dd889b7-1b72-4e57-ad0f-85facbad8da4-kolla-config\") pod \"8dd889b7-1b72-4e57-ad0f-85facbad8da4\" (UID: \"8dd889b7-1b72-4e57-ad0f-85facbad8da4\") " Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.009202 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5cwxt\" (UniqueName: \"kubernetes.io/projected/cca18a27-31bc-440b-a4a9-517b3323bb91-kube-api-access-5cwxt\") pod \"cca18a27-31bc-440b-a4a9-517b3323bb91\" (UID: \"cca18a27-31bc-440b-a4a9-517b3323bb91\") " Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.009237 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a879cb3-19b4-4767-8640-993cc47dc7ed-public-tls-certs\") pod \"4a879cb3-19b4-4767-8640-993cc47dc7ed\" (UID: \"4a879cb3-19b4-4767-8640-993cc47dc7ed\") " Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.009265 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8dd889b7-1b72-4e57-ad0f-85facbad8da4-galera-tls-certs\") pod \"8dd889b7-1b72-4e57-ad0f-85facbad8da4\" (UID: \"8dd889b7-1b72-4e57-ad0f-85facbad8da4\") " Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.009298 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ps9mx\" (UniqueName: \"kubernetes.io/projected/8dd889b7-1b72-4e57-ad0f-85facbad8da4-kube-api-access-ps9mx\") pod \"8dd889b7-1b72-4e57-ad0f-85facbad8da4\" (UID: 
\"8dd889b7-1b72-4e57-ad0f-85facbad8da4\") " Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.009322 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dd889b7-1b72-4e57-ad0f-85facbad8da4-combined-ca-bundle\") pod \"8dd889b7-1b72-4e57-ad0f-85facbad8da4\" (UID: \"8dd889b7-1b72-4e57-ad0f-85facbad8da4\") " Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.009340 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a879cb3-19b4-4767-8640-993cc47dc7ed-combined-ca-bundle\") pod \"4a879cb3-19b4-4767-8640-993cc47dc7ed\" (UID: \"4a879cb3-19b4-4767-8640-993cc47dc7ed\") " Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.009357 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mbmz\" (UniqueName: \"kubernetes.io/projected/7f35bada-015d-4051-9976-d5dfe3a93216-kube-api-access-6mbmz\") pod \"7f35bada-015d-4051-9976-d5dfe3a93216\" (UID: \"7f35bada-015d-4051-9976-d5dfe3a93216\") " Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.014108 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f35bada-015d-4051-9976-d5dfe3a93216-logs" (OuterVolumeSpecName: "logs") pod "7f35bada-015d-4051-9976-d5dfe3a93216" (UID: "7f35bada-015d-4051-9976-d5dfe3a93216"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.014168 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8dd889b7-1b72-4e57-ad0f-85facbad8da4-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "8dd889b7-1b72-4e57-ad0f-85facbad8da4" (UID: "8dd889b7-1b72-4e57-ad0f-85facbad8da4"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.014265 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a879cb3-19b4-4767-8640-993cc47dc7ed-logs" (OuterVolumeSpecName: "logs") pod "4a879cb3-19b4-4767-8640-993cc47dc7ed" (UID: "4a879cb3-19b4-4767-8640-993cc47dc7ed"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.017907 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cca18a27-31bc-440b-a4a9-517b3323bb91-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cca18a27-31bc-440b-a4a9-517b3323bb91" (UID: "cca18a27-31bc-440b-a4a9-517b3323bb91"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.020927 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8dd889b7-1b72-4e57-ad0f-85facbad8da4-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "8dd889b7-1b72-4e57-ad0f-85facbad8da4" (UID: "8dd889b7-1b72-4e57-ad0f-85facbad8da4"). InnerVolumeSpecName "kolla-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.024130 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a879cb3-19b4-4767-8640-993cc47dc7ed-scripts" (OuterVolumeSpecName: "scripts") pod "4a879cb3-19b4-4767-8640-993cc47dc7ed" (UID: "4a879cb3-19b4-4767-8640-993cc47dc7ed"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.025163 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8dd889b7-1b72-4e57-ad0f-85facbad8da4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8dd889b7-1b72-4e57-ad0f-85facbad8da4" (UID: "8dd889b7-1b72-4e57-ad0f-85facbad8da4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.028020 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a879cb3-19b4-4767-8640-993cc47dc7ed-kube-api-access-s9xjl" (OuterVolumeSpecName: "kube-api-access-s9xjl") pod "4a879cb3-19b4-4767-8640-993cc47dc7ed" (UID: "4a879cb3-19b4-4767-8640-993cc47dc7ed"). InnerVolumeSpecName "kube-api-access-s9xjl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.036223 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "mysql-db") pod "8dd889b7-1b72-4e57-ad0f-85facbad8da4" (UID: "8dd889b7-1b72-4e57-ad0f-85facbad8da4"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.037184 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8dd889b7-1b72-4e57-ad0f-85facbad8da4-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "8dd889b7-1b72-4e57-ad0f-85facbad8da4" (UID: "8dd889b7-1b72-4e57-ad0f-85facbad8da4"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.037455 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f35bada-015d-4051-9976-d5dfe3a93216-kube-api-access-6mbmz" (OuterVolumeSpecName: "kube-api-access-6mbmz") pod "7f35bada-015d-4051-9976-d5dfe3a93216" (UID: "7f35bada-015d-4051-9976-d5dfe3a93216"). InnerVolumeSpecName "kube-api-access-6mbmz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.042434 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f35bada-015d-4051-9976-d5dfe3a93216-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7f35bada-015d-4051-9976-d5dfe3a93216" (UID: "7f35bada-015d-4051-9976-d5dfe3a93216"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.044446 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8dd889b7-1b72-4e57-ad0f-85facbad8da4-kube-api-access-ps9mx" (OuterVolumeSpecName: "kube-api-access-ps9mx") pod "8dd889b7-1b72-4e57-ad0f-85facbad8da4" (UID: "8dd889b7-1b72-4e57-ad0f-85facbad8da4"). 
InnerVolumeSpecName "kube-api-access-ps9mx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.050882 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cca18a27-31bc-440b-a4a9-517b3323bb91-kube-api-access-5cwxt" (OuterVolumeSpecName: "kube-api-access-5cwxt") pod "cca18a27-31bc-440b-a4a9-517b3323bb91" (UID: "cca18a27-31bc-440b-a4a9-517b3323bb91"). InnerVolumeSpecName "kube-api-access-5cwxt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.061231 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="ca793428-98ed-4f82-aa57-31d6671d546c" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.206:8775/\": read tcp 10.217.0.2:58810->10.217.0.206:8775: read: connection reset by peer" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.061403 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="ca793428-98ed-4f82-aa57-31d6671d546c" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.206:8775/\": read tcp 10.217.0.2:58814->10.217.0.206:8775: read: connection reset by peer" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.111112 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f35bada-015d-4051-9976-d5dfe3a93216-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7f35bada-015d-4051-9976-d5dfe3a93216" (UID: "7f35bada-015d-4051-9976-d5dfe3a93216"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.113237 4962 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.113264 4962 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8dd889b7-1b72-4e57-ad0f-85facbad8da4-config-data-generated\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.113307 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f35bada-015d-4051-9976-d5dfe3a93216-logs\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.113320 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a879cb3-19b4-4767-8640-993cc47dc7ed-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.113330 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s9xjl\" (UniqueName: \"kubernetes.io/projected/4a879cb3-19b4-4767-8640-993cc47dc7ed-kube-api-access-s9xjl\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.113338 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a879cb3-19b4-4767-8640-993cc47dc7ed-logs\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.113347 4962 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/8dd889b7-1b72-4e57-ad0f-85facbad8da4-config-data-default\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.113355 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f35bada-015d-4051-9976-d5dfe3a93216-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.113363 4962 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8dd889b7-1b72-4e57-ad0f-85facbad8da4-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.113372 4962 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7f35bada-015d-4051-9976-d5dfe3a93216-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.113380 4962 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8dd889b7-1b72-4e57-ad0f-85facbad8da4-kolla-config\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.113388 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5cwxt\" (UniqueName: \"kubernetes.io/projected/cca18a27-31bc-440b-a4a9-517b3323bb91-kube-api-access-5cwxt\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.113397 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ps9mx\" (UniqueName: \"kubernetes.io/projected/8dd889b7-1b72-4e57-ad0f-85facbad8da4-kube-api-access-ps9mx\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.114019 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6mbmz\" (UniqueName: \"kubernetes.io/projected/7f35bada-015d-4051-9976-d5dfe3a93216-kube-api-access-6mbmz\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.114036 4962 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cca18a27-31bc-440b-a4a9-517b3323bb91-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.166132 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e4f70a2-b8ae-48cc-a098-5642fad8b040" path="/var/lib/kubelet/pods/2e4f70a2-b8ae-48cc-a098-5642fad8b040/volumes" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.171863 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="719faf26-7700-4eff-9dca-0a4ec3c51344" path="/var/lib/kubelet/pods/719faf26-7700-4eff-9dca-0a4ec3c51344/volumes" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.172672 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="801fa82d-0f57-4af2-9eec-b6cddac658ab" path="/var/lib/kubelet/pods/801fa82d-0f57-4af2-9eec-b6cddac658ab/volumes" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.174088 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88c21489-524e-4ee7-a340-5be2573af161" path="/var/lib/kubelet/pods/88c21489-524e-4ee7-a340-5be2573af161/volumes" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.188882 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a879cb3-19b4-4767-8640-993cc47dc7ed-config-data" (OuterVolumeSpecName: 
"config-data") pod "4a879cb3-19b4-4767-8640-993cc47dc7ed" (UID: "4a879cb3-19b4-4767-8640-993cc47dc7ed"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.207459 4962 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.211726 4962 generic.go:334] "Generic (PLEG): container finished" podID="812fea74-e4e5-4550-8a20-8fe04752a016" containerID="156621efed4a83b0a1598b9e193e1ba9bb7c448ebc2a41320d1b53c4756507b6" exitCode=1 Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.212515 4962 scope.go:117] "RemoveContainer" containerID="156621efed4a83b0a1598b9e193e1ba9bb7c448ebc2a41320d1b53c4756507b6" Feb 20 10:18:15 crc kubenswrapper[4962]: E0220 10:18:15.213070 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CrashLoopBackOff: \"back-off 10s restarting failed container=mariadb-account-create-update pod=root-account-create-update-6f6vb_openstack(812fea74-e4e5-4550-8a20-8fe04752a016)\"" pod="openstack/root-account-create-update-6f6vb" podUID="812fea74-e4e5-4550-8a20-8fe04752a016" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.213887 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dd889b7-1b72-4e57-ad0f-85facbad8da4-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "8dd889b7-1b72-4e57-ad0f-85facbad8da4" (UID: "8dd889b7-1b72-4e57-ad0f-85facbad8da4"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.218394 4962 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.218421 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a879cb3-19b4-4767-8640-993cc47dc7ed-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.218431 4962 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8dd889b7-1b72-4e57-ad0f-85facbad8da4-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:15 crc kubenswrapper[4962]: E0220 10:18:15.218496 4962 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Feb 20 10:18:15 crc kubenswrapper[4962]: E0220 10:18:15.218536 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2a8d652d-aea8-4a83-b33e-0d2522af0be8-config-data podName:2a8d652d-aea8-4a83-b33e-0d2522af0be8 nodeName:}" failed. No retries permitted until 2026-02-20 10:18:19.218519641 +0000 UTC m=+1390.800991487 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/2a8d652d-aea8-4a83-b33e-0d2522af0be8-config-data") pod "rabbitmq-server-0" (UID: "2a8d652d-aea8-4a83-b33e-0d2522af0be8") : configmap "rabbitmq-config-data" not found Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.240035 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-6f6vb" event={"ID":"812fea74-e4e5-4550-8a20-8fe04752a016","Type":"ContainerDied","Data":"156621efed4a83b0a1598b9e193e1ba9bb7c448ebc2a41320d1b53c4756507b6"} Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.240103 4962 scope.go:117] "RemoveContainer" containerID="93e28da688e5c7a59f6cefecb59748d6ded79a8223cb25928a3b22790ce93bd1" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.242822 4962 generic.go:334] "Generic (PLEG): container finished" podID="ca793428-98ed-4f82-aa57-31d6671d546c" containerID="2fdedd716304d48ca972e72c6c0a4e94560cd57ce8c5b0409e88600b50604c0b" exitCode=0 Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.242938 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ca793428-98ed-4f82-aa57-31d6671d546c","Type":"ContainerDied","Data":"2fdedd716304d48ca972e72c6c0a4e94560cd57ce8c5b0409e88600b50604c0b"} Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.269253 4962 generic.go:334] "Generic (PLEG): container finished" podID="89dbdc4c-bf31-402e-b5bf-e8bbb8c16172" containerID="7c19f6ab819e8b088592bd7831817812900bca1c0cc3649a9662bfcc1aa1ae48" exitCode=0 Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.269332 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172","Type":"ContainerDied","Data":"7c19f6ab819e8b088592bd7831817812900bca1c0cc3649a9662bfcc1aa1ae48"} Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.280231 4962 generic.go:334] "Generic (PLEG): container finished" podID="ba9a9d46-9ba9-428c-8864-a8db8bca2b57" containerID="fd4f315997ddf00a356a9ec5e5c2864b8fa25408200a3b8ba03172b2cebc87ed" exitCode=0 Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.280307 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ba9a9d46-9ba9-428c-8864-a8db8bca2b57","Type":"ContainerDied","Data":"fd4f315997ddf00a356a9ec5e5c2864b8fa25408200a3b8ba03172b2cebc87ed"} Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.288060 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dd889b7-1b72-4e57-ad0f-85facbad8da4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8dd889b7-1b72-4e57-ad0f-85facbad8da4" (UID: "8dd889b7-1b72-4e57-ad0f-85facbad8da4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.288884 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a879cb3-19b4-4767-8640-993cc47dc7ed-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "4a879cb3-19b4-4767-8640-993cc47dc7ed" (UID: "4a879cb3-19b4-4767-8640-993cc47dc7ed"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.302582 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6b8479d945-8wsh9" event={"ID":"7f35bada-015d-4051-9976-d5dfe3a93216","Type":"ContainerDied","Data":"a34d63171eeb032e506f3c3f6390187d10864d694aff1bd3157c782304896d3f"} Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.302785 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6b8479d945-8wsh9" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.308859 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f35bada-015d-4051-9976-d5dfe3a93216-config-data" (OuterVolumeSpecName: "config-data") pod "7f35bada-015d-4051-9976-d5dfe3a93216" (UID: "7f35bada-015d-4051-9976-d5dfe3a93216"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.328655 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dd889b7-1b72-4e57-ad0f-85facbad8da4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.328689 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f35bada-015d-4051-9976-d5dfe3a93216-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.328699 4962 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a879cb3-19b4-4767-8640-993cc47dc7ed-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.330084 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="56a77dd3-ef10-46a6-a00d-ab38af0d4338" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.102:5671: connect: connection refused" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.335799 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00","Type":"ContainerDied","Data":"33d56931989951f09c69fe90d6e65d85c8e97ea86a78f0d42f65def6270a08a7"} Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.336003 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.368650 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a879cb3-19b4-4767-8640-993cc47dc7ed-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "4a879cb3-19b4-4767-8640-993cc47dc7ed" (UID: "4a879cb3-19b4-4767-8640-993cc47dc7ed"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.368880 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c90d5126-d89a-42e6-9b7d-bfc53475bc56","Type":"ContainerDied","Data":"b094901d04b6844ae7ff61500f6dbd375cab8bf6c8a00346003f62e1a980cada"} Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.369002 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.372772 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.375084 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-84464996cb-fhnvz" podUID="10c1a487-1a74-4994-9b39-f05cbe0fa5c7" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.164:9311/healthcheck\": read tcp 10.217.0.2:58282->10.217.0.164:9311: read: connection reset by peer" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.375196 4962 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-84464996cb-fhnvz" podUID="10c1a487-1a74-4994-9b39-f05cbe0fa5c7" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.164:9311/healthcheck\": read tcp 10.217.0.2:58270->10.217.0.164:9311: read: connection reset by peer" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.376454 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a879cb3-19b4-4767-8640-993cc47dc7ed-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4a879cb3-19b4-4767-8640-993cc47dc7ed" (UID: "4a879cb3-19b4-4767-8640-993cc47dc7ed"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.380300 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.392504 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-569d5979d6-xzr2q" event={"ID":"28437fcd-377a-4b9e-9a28-e01c21e2ad1f","Type":"ContainerDied","Data":"0bed354fd9a98e89b5d38e5675524156eb0b61c69b251716c3b22a1d0bef6443"} Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.392635 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-569d5979d6-xzr2q" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.397182 4962 generic.go:334] "Generic (PLEG): container finished" podID="4a879cb3-19b4-4767-8640-993cc47dc7ed" containerID="eba325f8f1300c477bc396da76d9efd0fdd96072accf19c1140570ee31c7b548" exitCode=0 Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.397252 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-687f4cff74-gmh4w" event={"ID":"4a879cb3-19b4-4767-8640-993cc47dc7ed","Type":"ContainerDied","Data":"eba325f8f1300c477bc396da76d9efd0fdd96072accf19c1140570ee31c7b548"} Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.397273 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-687f4cff74-gmh4w" event={"ID":"4a879cb3-19b4-4767-8640-993cc47dc7ed","Type":"ContainerDied","Data":"25d2258a03970a75594e5384f741d4a8aaad9e37d3b0b7c512e80fa795dc3283"} Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.397326 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-687f4cff74-gmh4w" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.413373 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"8dd889b7-1b72-4e57-ad0f-85facbad8da4","Type":"ContainerDied","Data":"d0a92b505f163c98c2579b38133407e2587dcd82e4a7d6302d1e3ca2e2112d68"} Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.413489 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.427547 4962 scope.go:117] "RemoveContainer" containerID="d1cb3b1837bc14d4bc8b54604fff4b13e755f0b6500bf206c46f6f5569e5c26a" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.428498 4962 generic.go:334] "Generic (PLEG): container finished" podID="559addbd-1bc6-4146-9a27-ce3e1d3d08fd" containerID="42f878706ae1a7e2114a67d56b43328ffc07645b6f77f8f9d20b6c4a2aec6632" exitCode=0 Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.428523 4962 generic.go:334] "Generic (PLEG): container finished" podID="559addbd-1bc6-4146-9a27-ce3e1d3d08fd" containerID="68a406d2a6eadc4116c120af687c887ef22a20b066ec54d2d6991bd97aaef0e9" exitCode=0 Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.428565 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5b685f5b9-4db6w" event={"ID":"559addbd-1bc6-4146-9a27-ce3e1d3d08fd","Type":"ContainerDied","Data":"42f878706ae1a7e2114a67d56b43328ffc07645b6f77f8f9d20b6c4a2aec6632"} Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.428611 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5b685f5b9-4db6w" event={"ID":"559addbd-1bc6-4146-9a27-ce3e1d3d08fd","Type":"ContainerDied","Data":"68a406d2a6eadc4116c120af687c887ef22a20b066ec54d2d6991bd97aaef0e9"} Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.430499 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a879cb3-19b4-4767-8640-993cc47dc7ed-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.430524 4962 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a879cb3-19b4-4767-8640-993cc47dc7ed-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.431931 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-c46d-account-create-update-gfqts" event={"ID":"cca18a27-31bc-440b-a4a9-517b3323bb91","Type":"ContainerDied","Data":"f6015af78b401355bf39302e0c5756af3b69a15cfa67686aab9f59e8e5466d2c"} Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.431990 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-c46d-account-create-update-gfqts" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.532639 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-5b685f5b9-4db6w" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.549812 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-569d5979d6-xzr2q"] Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.555907 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-569d5979d6-xzr2q"] Feb 20 10:18:15 crc kubenswrapper[4962]: E0220 10:18:15.557897 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2c8825e8a9845de45acba0c5ed58a1b7ada6575701e9497362444d09cc2e5592" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 20 10:18:15 crc kubenswrapper[4962]: E0220 10:18:15.569063 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2c8825e8a9845de45acba0c5ed58a1b7ada6575701e9497362444d09cc2e5592" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.581715 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 20 10:18:15 crc kubenswrapper[4962]: E0220 10:18:15.586922 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2c8825e8a9845de45acba0c5ed58a1b7ada6575701e9497362444d09cc2e5592" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 20 10:18:15 crc kubenswrapper[4962]: E0220 10:18:15.586983 4962 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="ce62af15-166f-4f74-a244-2de5147a4b2f" containerName="nova-cell1-conductor-conductor" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.587226 4962 scope.go:117] "RemoveContainer" containerID="1a117c325a572e0a4fee70e6f72cca84b0d93bdf09ce042ac50994ca64fd3520" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.590974 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.630686 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-c46d-account-create-update-gfqts"] Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.635470 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-run-httpd\") pod \"559addbd-1bc6-4146-9a27-ce3e1d3d08fd\" (UID: \"559addbd-1bc6-4146-9a27-ce3e1d3d08fd\") " Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.635544 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xsclb\" (UniqueName: \"kubernetes.io/projected/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-kube-api-access-xsclb\") pod \"559addbd-1bc6-4146-9a27-ce3e1d3d08fd\" (UID: \"559addbd-1bc6-4146-9a27-ce3e1d3d08fd\") " Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.635701 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-combined-ca-bundle\") pod \"559addbd-1bc6-4146-9a27-ce3e1d3d08fd\" (UID: \"559addbd-1bc6-4146-9a27-ce3e1d3d08fd\") " Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.635857 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-config-data\") pod \"559addbd-1bc6-4146-9a27-ce3e1d3d08fd\" (UID: \"559addbd-1bc6-4146-9a27-ce3e1d3d08fd\") " Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.635977 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-public-tls-certs\") pod \"559addbd-1bc6-4146-9a27-ce3e1d3d08fd\" (UID: \"559addbd-1bc6-4146-9a27-ce3e1d3d08fd\") " Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.636010 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-etc-swift\") pod \"559addbd-1bc6-4146-9a27-ce3e1d3d08fd\" (UID: \"559addbd-1bc6-4146-9a27-ce3e1d3d08fd\") " Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.636070 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-internal-tls-certs\") pod \"559addbd-1bc6-4146-9a27-ce3e1d3d08fd\" (UID: \"559addbd-1bc6-4146-9a27-ce3e1d3d08fd\") " Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.636662 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-log-httpd\") pod \"559addbd-1bc6-4146-9a27-ce3e1d3d08fd\" (UID: \"559addbd-1bc6-4146-9a27-ce3e1d3d08fd\") " Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.636916 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "559addbd-1bc6-4146-9a27-ce3e1d3d08fd" (UID: "559addbd-1bc6-4146-9a27-ce3e1d3d08fd"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.638518 4962 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.639008 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "559addbd-1bc6-4146-9a27-ce3e1d3d08fd" (UID: "559addbd-1bc6-4146-9a27-ce3e1d3d08fd"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.640019 4962 scope.go:117] "RemoveContainer" containerID="aa9b7812a81805d3c1d048e75378c2e89e7f075bbe36af5665b4416075da7b83" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.654364 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-c46d-account-create-update-gfqts"] Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.674420 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-687f4cff74-gmh4w"] Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.682549 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "559addbd-1bc6-4146-9a27-ce3e1d3d08fd" (UID: "559addbd-1bc6-4146-9a27-ce3e1d3d08fd"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.716847 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-kube-api-access-xsclb" (OuterVolumeSpecName: "kube-api-access-xsclb") pod "559addbd-1bc6-4146-9a27-ce3e1d3d08fd" (UID: "559addbd-1bc6-4146-9a27-ce3e1d3d08fd"). InnerVolumeSpecName "kube-api-access-xsclb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.723902 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-687f4cff74-gmh4w"] Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.739971 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.742516 4962 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-etc-swift\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.742541 4962 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.742554 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xsclb\" (UniqueName: \"kubernetes.io/projected/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-kube-api-access-xsclb\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.753783 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.758264 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "559addbd-1bc6-4146-9a27-ce3e1d3d08fd" (UID: "559addbd-1bc6-4146-9a27-ce3e1d3d08fd"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.771866 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-6b8479d945-8wsh9"] Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.776784 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-6b8479d945-8wsh9"] Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.825538 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-config-data" (OuterVolumeSpecName: "config-data") pod "559addbd-1bc6-4146-9a27-ce3e1d3d08fd" (UID: "559addbd-1bc6-4146-9a27-ce3e1d3d08fd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.827331 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "559addbd-1bc6-4146-9a27-ce3e1d3d08fd" (UID: "559addbd-1bc6-4146-9a27-ce3e1d3d08fd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.832461 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "559addbd-1bc6-4146-9a27-ce3e1d3d08fd" (UID: "559addbd-1bc6-4146-9a27-ce3e1d3d08fd"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.845853 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.846727 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.846759 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.846771 4962 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.846781 4962 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/559addbd-1bc6-4146-9a27-ce3e1d3d08fd-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.857320 4962 scope.go:117] "RemoveContainer" containerID="c0eb68155798173ab5bc0e3d87fda35f3734305779104c33299016d17b9b3def" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.870517 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.886990 4962 scope.go:117] "RemoveContainer" containerID="f20b981aacdf6de658de3f762f39158362f94f8752f0a75fc0ae9dfa445ad0b1" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.918941 4962 scope.go:117] "RemoveContainer" containerID="5debc339fcb891cc07e7fa0a7db99fb7f297c28473a143743938f4792107d27c" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.947927 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-etc-machine-id\") pod \"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172\" (UID: \"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172\") " Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.948002 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-scripts\") pod \"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172\" (UID: \"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172\") " Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.948046 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca793428-98ed-4f82-aa57-31d6671d546c-nova-metadata-tls-certs\") pod \"ca793428-98ed-4f82-aa57-31d6671d546c\" (UID: \"ca793428-98ed-4f82-aa57-31d6671d546c\") " Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.948073 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca793428-98ed-4f82-aa57-31d6671d546c-config-data\") pod \"ca793428-98ed-4f82-aa57-31d6671d546c\" (UID: \"ca793428-98ed-4f82-aa57-31d6671d546c\") " Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.948101 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-config-data-custom\") pod \"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172\" (UID: \"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172\") " Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.948141 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-combined-ca-bundle\") pod \"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172\" (UID: \"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172\") " Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.948213 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-public-tls-certs\") pod \"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172\" (UID: \"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172\") " Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.948259 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hlgvc\" (UniqueName: \"kubernetes.io/projected/ca793428-98ed-4f82-aa57-31d6671d546c-kube-api-access-hlgvc\") pod \"ca793428-98ed-4f82-aa57-31d6671d546c\" (UID: \"ca793428-98ed-4f82-aa57-31d6671d546c\") " Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.948289 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca793428-98ed-4f82-aa57-31d6671d546c-combined-ca-bundle\") pod 
\"ca793428-98ed-4f82-aa57-31d6671d546c\" (UID: \"ca793428-98ed-4f82-aa57-31d6671d546c\") " Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.948317 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca793428-98ed-4f82-aa57-31d6671d546c-logs\") pod \"ca793428-98ed-4f82-aa57-31d6671d546c\" (UID: \"ca793428-98ed-4f82-aa57-31d6671d546c\") " Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.948343 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-internal-tls-certs\") pod \"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172\" (UID: \"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172\") " Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.948435 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvdkr\" (UniqueName: \"kubernetes.io/projected/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-kube-api-access-gvdkr\") pod \"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172\" (UID: \"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172\") " Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.948477 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-config-data\") pod \"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172\" (UID: \"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172\") " Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.948491 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-logs\") pod \"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172\" (UID: \"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172\") " Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.950498 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-logs" (OuterVolumeSpecName: "logs") pod "89dbdc4c-bf31-402e-b5bf-e8bbb8c16172" (UID: "89dbdc4c-bf31-402e-b5bf-e8bbb8c16172"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.951945 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca793428-98ed-4f82-aa57-31d6671d546c-logs" (OuterVolumeSpecName: "logs") pod "ca793428-98ed-4f82-aa57-31d6671d546c" (UID: "ca793428-98ed-4f82-aa57-31d6671d546c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.953098 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "89dbdc4c-bf31-402e-b5bf-e8bbb8c16172" (UID: "89dbdc4c-bf31-402e-b5bf-e8bbb8c16172"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.961209 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-kube-api-access-gvdkr" (OuterVolumeSpecName: "kube-api-access-gvdkr") pod "89dbdc4c-bf31-402e-b5bf-e8bbb8c16172" (UID: "89dbdc4c-bf31-402e-b5bf-e8bbb8c16172"). InnerVolumeSpecName "kube-api-access-gvdkr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.978010 4962 scope.go:117] "RemoveContainer" containerID="6cbbafaf6ad06d0f58cf79b2da64a294b16c2b2e6931344860d8ecda539fe7b2" Feb 20 10:18:15 crc kubenswrapper[4962]: I0220 10:18:15.987145 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-scripts" (OuterVolumeSpecName: "scripts") pod "89dbdc4c-bf31-402e-b5bf-e8bbb8c16172" (UID: "89dbdc4c-bf31-402e-b5bf-e8bbb8c16172"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.041328 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca793428-98ed-4f82-aa57-31d6671d546c-kube-api-access-hlgvc" (OuterVolumeSpecName: "kube-api-access-hlgvc") pod "ca793428-98ed-4f82-aa57-31d6671d546c" (UID: "ca793428-98ed-4f82-aa57-31d6671d546c"). InnerVolumeSpecName "kube-api-access-hlgvc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.047747 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "89dbdc4c-bf31-402e-b5bf-e8bbb8c16172" (UID: "89dbdc4c-bf31-402e-b5bf-e8bbb8c16172"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.047880 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "89dbdc4c-bf31-402e-b5bf-e8bbb8c16172" (UID: "89dbdc4c-bf31-402e-b5bf-e8bbb8c16172"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.053133 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-logs\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.053165 4962 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.053175 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.053184 4962 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.053193 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.053201 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hlgvc\" (UniqueName: \"kubernetes.io/projected/ca793428-98ed-4f82-aa57-31d6671d546c-kube-api-access-hlgvc\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.053209 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca793428-98ed-4f82-aa57-31d6671d546c-logs\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.053218 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gvdkr\" (UniqueName: \"kubernetes.io/projected/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-kube-api-access-gvdkr\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.082879 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.083447 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fae69c76-754d-4125-a405-23a3938e90a9" containerName="ceilometer-central-agent" containerID="cri-o://cdb5b15ea05e323a5f856da44e27bde808d02494e9d53ffb4bc777be963ee11a" gracePeriod=30 Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.083879 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fae69c76-754d-4125-a405-23a3938e90a9" containerName="proxy-httpd" containerID="cri-o://ee4780834b45dd3df9c5478d7f70a5b55b25c67044bc5c70a1699c36ee7a04a5" gracePeriod=30 Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.083991 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fae69c76-754d-4125-a405-23a3938e90a9" containerName="sg-core" containerID="cri-o://6f8330d1d14a32a3610f17948811d4a9c71b61fcf7b72a4769e4f03066b35b1e" gracePeriod=30 Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.084084 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="fae69c76-754d-4125-a405-23a3938e90a9" containerName="ceilometer-notification-agent" containerID="cri-o://ecce7d5cc120360c76c90c0a94a6162a452698a47d63fb49fa2ed866e4ad8917" gracePeriod=30 Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.088711 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-config-data" (OuterVolumeSpecName: "config-data") pod "89dbdc4c-bf31-402e-b5bf-e8bbb8c16172" (UID: "89dbdc4c-bf31-402e-b5bf-e8bbb8c16172"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.156318 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.181796 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.182044 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="cffca43e-3e19-4430-8fe2-ca7cfe6229b0" containerName="kube-state-metrics" containerID="cri-o://490c8746de0bc6e3f4ef0520b2658d4424532e972e69bd55a421dfcd9ed32cf4" gracePeriod=30 Feb 20 10:18:16 crc kubenswrapper[4962]: E0220 10:18:16.188756 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5986cb792b03a6e15f31fe7f4e91ccaa3ff2a4c360820798809c00e91587dc69" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 20 10:18:16 crc kubenswrapper[4962]: E0220 10:18:16.227770 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5986cb792b03a6e15f31fe7f4e91ccaa3ff2a4c360820798809c00e91587dc69" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.228351 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca793428-98ed-4f82-aa57-31d6671d546c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ca793428-98ed-4f82-aa57-31d6671d546c" (UID: "ca793428-98ed-4f82-aa57-31d6671d546c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.240111 4962 scope.go:117] "RemoveContainer" containerID="eba325f8f1300c477bc396da76d9efd0fdd96072accf19c1140570ee31c7b548" Feb 20 10:18:16 crc kubenswrapper[4962]: E0220 10:18:16.256535 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5986cb792b03a6e15f31fe7f4e91ccaa3ff2a4c360820798809c00e91587dc69" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 20 10:18:16 crc kubenswrapper[4962]: E0220 10:18:16.256632 4962 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="815f0ef8-a30a-4467-bb56-ff8499a4be44" containerName="nova-cell0-conductor-conductor" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.258027 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca793428-98ed-4f82-aa57-31d6671d546c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.348494 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca793428-98ed-4f82-aa57-31d6671d546c-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "ca793428-98ed-4f82-aa57-31d6671d546c" (UID: "ca793428-98ed-4f82-aa57-31d6671d546c"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.352025 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "89dbdc4c-bf31-402e-b5bf-e8bbb8c16172" (UID: "89dbdc4c-bf31-402e-b5bf-e8bbb8c16172"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.360172 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.361335 4962 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.365217 4962 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca793428-98ed-4f82-aa57-31d6671d546c-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.367779 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca793428-98ed-4f82-aa57-31d6671d546c-config-data" (OuterVolumeSpecName: "config-data") pod "ca793428-98ed-4f82-aa57-31d6671d546c" (UID: "ca793428-98ed-4f82-aa57-31d6671d546c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.432861 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "89dbdc4c-bf31-402e-b5bf-e8bbb8c16172" (UID: "89dbdc4c-bf31-402e-b5bf-e8bbb8c16172"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.476449 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqv6c\" (UniqueName: \"kubernetes.io/projected/ba9a9d46-9ba9-428c-8864-a8db8bca2b57-kube-api-access-vqv6c\") pod \"ba9a9d46-9ba9-428c-8864-a8db8bca2b57\" (UID: \"ba9a9d46-9ba9-428c-8864-a8db8bca2b57\") " Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.476504 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba9a9d46-9ba9-428c-8864-a8db8bca2b57-logs\") pod \"ba9a9d46-9ba9-428c-8864-a8db8bca2b57\" (UID: \"ba9a9d46-9ba9-428c-8864-a8db8bca2b57\") " Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.476601 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba9a9d46-9ba9-428c-8864-a8db8bca2b57-scripts\") pod \"ba9a9d46-9ba9-428c-8864-a8db8bca2b57\" (UID: \"ba9a9d46-9ba9-428c-8864-a8db8bca2b57\") " Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.476633 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ba9a9d46-9ba9-428c-8864-a8db8bca2b57-httpd-run\") pod \"ba9a9d46-9ba9-428c-8864-a8db8bca2b57\" (UID: \"ba9a9d46-9ba9-428c-8864-a8db8bca2b57\") " Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.476677 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba9a9d46-9ba9-428c-8864-a8db8bca2b57-public-tls-certs\") pod \"ba9a9d46-9ba9-428c-8864-a8db8bca2b57\" (UID: \"ba9a9d46-9ba9-428c-8864-a8db8bca2b57\") " Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.476770 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ba9a9d46-9ba9-428c-8864-a8db8bca2b57\" (UID: \"ba9a9d46-9ba9-428c-8864-a8db8bca2b57\") " Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.476862 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba9a9d46-9ba9-428c-8864-a8db8bca2b57-combined-ca-bundle\") pod \"ba9a9d46-9ba9-428c-8864-a8db8bca2b57\" (UID: \"ba9a9d46-9ba9-428c-8864-a8db8bca2b57\") " Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.476911 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba9a9d46-9ba9-428c-8864-a8db8bca2b57-config-data\") pod \"ba9a9d46-9ba9-428c-8864-a8db8bca2b57\" (UID: \"ba9a9d46-9ba9-428c-8864-a8db8bca2b57\") " Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.477306 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca793428-98ed-4f82-aa57-31d6671d546c-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:16 crc kubenswrapper[4962]: 
I0220 10:18:16.477316 4962 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.481351 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba9a9d46-9ba9-428c-8864-a8db8bca2b57-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "ba9a9d46-9ba9-428c-8864-a8db8bca2b57" (UID: "ba9a9d46-9ba9-428c-8864-a8db8bca2b57"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.482030 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba9a9d46-9ba9-428c-8864-a8db8bca2b57-logs" (OuterVolumeSpecName: "logs") pod "ba9a9d46-9ba9-428c-8864-a8db8bca2b57" (UID: "ba9a9d46-9ba9-428c-8864-a8db8bca2b57"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.502308 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.502541 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/memcached-0" podUID="b22a9e86-ccdf-4505-8116-21b0230943fc" containerName="memcached" containerID="cri-o://2a752c83576acea3c58ce68803e2686311938e06421f4eea4dda081f9f3b8c54" gracePeriod=30 Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.503241 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "ba9a9d46-9ba9-428c-8864-a8db8bca2b57" (UID: "ba9a9d46-9ba9-428c-8864-a8db8bca2b57"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.518986 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba9a9d46-9ba9-428c-8864-a8db8bca2b57-kube-api-access-vqv6c" (OuterVolumeSpecName: "kube-api-access-vqv6c") pod "ba9a9d46-9ba9-428c-8864-a8db8bca2b57" (UID: "ba9a9d46-9ba9-428c-8864-a8db8bca2b57"). InnerVolumeSpecName "kube-api-access-vqv6c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.519255 4962 scope.go:117] "RemoveContainer" containerID="4f72b0b6a24968d9eab4cdfe73c03770a2ac626aa75c9f5e5a526fe72f5eea53" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.540688 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba9a9d46-9ba9-428c-8864-a8db8bca2b57-scripts" (OuterVolumeSpecName: "scripts") pod "ba9a9d46-9ba9-428c-8864-a8db8bca2b57" (UID: "ba9a9d46-9ba9-428c-8864-a8db8bca2b57"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.550730 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5b685f5b9-4db6w" event={"ID":"559addbd-1bc6-4146-9a27-ce3e1d3d08fd","Type":"ContainerDied","Data":"cc47509aa1ca6c26cc469e518128ac8e3dbaf917ad6f17beac89df46710d9f73"} Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.550873 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-5b685f5b9-4db6w" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.581841 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba9a9d46-9ba9-428c-8864-a8db8bca2b57-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.581869 4962 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ba9a9d46-9ba9-428c-8864-a8db8bca2b57-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.581932 4962 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.581942 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqv6c\" (UniqueName: \"kubernetes.io/projected/ba9a9d46-9ba9-428c-8864-a8db8bca2b57-kube-api-access-vqv6c\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.581951 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba9a9d46-9ba9-428c-8864-a8db8bca2b57-logs\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.621402 4962 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.649303 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba9a9d46-9ba9-428c-8864-a8db8bca2b57-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ba9a9d46-9ba9-428c-8864-a8db8bca2b57" (UID: "ba9a9d46-9ba9-428c-8864-a8db8bca2b57"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.657495 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-125a-account-create-update-bd2q8"] Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.683933 4962 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.684323 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba9a9d46-9ba9-428c-8864-a8db8bca2b57-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.698550 4962 generic.go:334] "Generic (PLEG): container finished" podID="241dc417-3176-4051-ad4e-d98f4f66ddc2" containerID="42f33c3ac4e84257c4f38d060186abe1300d7dfb20f8894c1b519bb38d1529c9" exitCode=0 Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.698691 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"241dc417-3176-4051-ad4e-d98f4f66ddc2","Type":"ContainerDied","Data":"42f33c3ac4e84257c4f38d060186abe1300d7dfb20f8894c1b519bb38d1529c9"} Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.698727 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"241dc417-3176-4051-ad4e-d98f4f66ddc2","Type":"ContainerDied","Data":"6da4185901fde0f4a19c0acbd71ec3f025dcc7b8a21d60e14e9ba4dfdfa09bbe"} Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.698740 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6da4185901fde0f4a19c0acbd71ec3f025dcc7b8a21d60e14e9ba4dfdfa09bbe" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.724815 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.724850 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"89dbdc4c-bf31-402e-b5bf-e8bbb8c16172","Type":"ContainerDied","Data":"6493293a11e7a20494076438c227d47d6ea680b9e8bbd314969ad609945e742d"} Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.734567 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-125a-account-create-update-bd2q8"] Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.764264 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-125a-account-create-update-rtszm"] Feb 20 10:18:16 crc kubenswrapper[4962]: E0220 10:18:16.765185 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c90d5126-d89a-42e6-9b7d-bfc53475bc56" containerName="probe" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.765205 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="c90d5126-d89a-42e6-9b7d-bfc53475bc56" containerName="probe" Feb 20 10:18:16 crc kubenswrapper[4962]: E0220 10:18:16.765224 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="719faf26-7700-4eff-9dca-0a4ec3c51344" containerName="openstack-network-exporter" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.765257 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="719faf26-7700-4eff-9dca-0a4ec3c51344" containerName="openstack-network-exporter" Feb 20 10:18:16 crc kubenswrapper[4962]: E0220 10:18:16.765270 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a879cb3-19b4-4767-8640-993cc47dc7ed" containerName="placement-api" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.765279 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a879cb3-19b4-4767-8640-993cc47dc7ed" containerName="placement-api" Feb 20 10:18:16 crc kubenswrapper[4962]: E0220 10:18:16.765286 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88c21489-524e-4ee7-a340-5be2573af161" containerName="openstack-network-exporter" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.765329 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="88c21489-524e-4ee7-a340-5be2573af161" containerName="openstack-network-exporter" Feb 20 10:18:16 crc kubenswrapper[4962]: E0220 10:18:16.765339 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f35bada-015d-4051-9976-d5dfe3a93216" containerName="barbican-worker" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.765346 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f35bada-015d-4051-9976-d5dfe3a93216" containerName="barbican-worker" Feb 20 10:18:16 crc kubenswrapper[4962]: E0220 10:18:16.765354 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c90d5126-d89a-42e6-9b7d-bfc53475bc56" containerName="cinder-scheduler" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.765361 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="c90d5126-d89a-42e6-9b7d-bfc53475bc56" containerName="cinder-scheduler" Feb 20 10:18:16 crc kubenswrapper[4962]: E0220 10:18:16.765369 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="383d4f1e-72b3-48ce-9427-0361c19e41fc" containerName="ovn-controller" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.765376 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="383d4f1e-72b3-48ce-9427-0361c19e41fc" containerName="ovn-controller" Feb 20 10:18:16 crc kubenswrapper[4962]: E0220 
10:18:16.765405 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="559addbd-1bc6-4146-9a27-ce3e1d3d08fd" containerName="proxy-server" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.765413 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="559addbd-1bc6-4146-9a27-ce3e1d3d08fd" containerName="proxy-server" Feb 20 10:18:16 crc kubenswrapper[4962]: E0220 10:18:16.765422 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00" containerName="nova-cell1-novncproxy-novncproxy" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.765428 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00" containerName="nova-cell1-novncproxy-novncproxy" Feb 20 10:18:16 crc kubenswrapper[4962]: E0220 10:18:16.765439 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f35bada-015d-4051-9976-d5dfe3a93216" containerName="barbican-worker-log" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.765447 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f35bada-015d-4051-9976-d5dfe3a93216" containerName="barbican-worker-log" Feb 20 10:18:16 crc kubenswrapper[4962]: E0220 10:18:16.765484 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca793428-98ed-4f82-aa57-31d6671d546c" containerName="nova-metadata-log" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.765491 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca793428-98ed-4f82-aa57-31d6671d546c" containerName="nova-metadata-log" Feb 20 10:18:16 crc kubenswrapper[4962]: E0220 10:18:16.765506 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="719faf26-7700-4eff-9dca-0a4ec3c51344" containerName="ovsdbserver-sb" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.765512 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="719faf26-7700-4eff-9dca-0a4ec3c51344" containerName="ovsdbserver-sb" Feb 20 10:18:16 crc kubenswrapper[4962]: E0220 10:18:16.765544 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca793428-98ed-4f82-aa57-31d6671d546c" containerName="nova-metadata-metadata" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.765551 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca793428-98ed-4f82-aa57-31d6671d546c" containerName="nova-metadata-metadata" Feb 20 10:18:16 crc kubenswrapper[4962]: E0220 10:18:16.765563 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28437fcd-377a-4b9e-9a28-e01c21e2ad1f" containerName="barbican-keystone-listener-log" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.765569 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="28437fcd-377a-4b9e-9a28-e01c21e2ad1f" containerName="barbican-keystone-listener-log" Feb 20 10:18:16 crc kubenswrapper[4962]: E0220 10:18:16.765576 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="801fa82d-0f57-4af2-9eec-b6cddac658ab" containerName="ovsdbserver-nb" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.765582 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="801fa82d-0f57-4af2-9eec-b6cddac658ab" containerName="ovsdbserver-nb" Feb 20 10:18:16 crc kubenswrapper[4962]: E0220 10:18:16.765674 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba9a9d46-9ba9-428c-8864-a8db8bca2b57" containerName="glance-log" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.765683 4962 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="ba9a9d46-9ba9-428c-8864-a8db8bca2b57" containerName="glance-log" Feb 20 10:18:16 crc kubenswrapper[4962]: E0220 10:18:16.765694 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="801fa82d-0f57-4af2-9eec-b6cddac658ab" containerName="openstack-network-exporter" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.765700 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="801fa82d-0f57-4af2-9eec-b6cddac658ab" containerName="openstack-network-exporter" Feb 20 10:18:16 crc kubenswrapper[4962]: E0220 10:18:16.765735 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="559addbd-1bc6-4146-9a27-ce3e1d3d08fd" containerName="proxy-httpd" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.765742 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="559addbd-1bc6-4146-9a27-ce3e1d3d08fd" containerName="proxy-httpd" Feb 20 10:18:16 crc kubenswrapper[4962]: E0220 10:18:16.766102 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dd889b7-1b72-4e57-ad0f-85facbad8da4" containerName="mysql-bootstrap" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.766113 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dd889b7-1b72-4e57-ad0f-85facbad8da4" containerName="mysql-bootstrap" Feb 20 10:18:16 crc kubenswrapper[4962]: E0220 10:18:16.766124 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e4f70a2-b8ae-48cc-a098-5642fad8b040" containerName="init" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.766131 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e4f70a2-b8ae-48cc-a098-5642fad8b040" containerName="init" Feb 20 10:18:16 crc kubenswrapper[4962]: E0220 10:18:16.766142 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba9a9d46-9ba9-428c-8864-a8db8bca2b57" containerName="glance-httpd" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.766158 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba9a9d46-9ba9-428c-8864-a8db8bca2b57" containerName="glance-httpd" Feb 20 10:18:16 crc kubenswrapper[4962]: E0220 10:18:16.766167 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89dbdc4c-bf31-402e-b5bf-e8bbb8c16172" containerName="cinder-api-log" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.766173 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="89dbdc4c-bf31-402e-b5bf-e8bbb8c16172" containerName="cinder-api-log" Feb 20 10:18:16 crc kubenswrapper[4962]: E0220 10:18:16.766190 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dd889b7-1b72-4e57-ad0f-85facbad8da4" containerName="galera" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.766196 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dd889b7-1b72-4e57-ad0f-85facbad8da4" containerName="galera" Feb 20 10:18:16 crc kubenswrapper[4962]: E0220 10:18:16.766204 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89dbdc4c-bf31-402e-b5bf-e8bbb8c16172" containerName="cinder-api" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.766210 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="89dbdc4c-bf31-402e-b5bf-e8bbb8c16172" containerName="cinder-api" Feb 20 10:18:16 crc kubenswrapper[4962]: E0220 10:18:16.766220 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a879cb3-19b4-4767-8640-993cc47dc7ed" containerName="placement-log" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.766227 4962 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="4a879cb3-19b4-4767-8640-993cc47dc7ed" containerName="placement-log" Feb 20 10:18:16 crc kubenswrapper[4962]: E0220 10:18:16.766239 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28437fcd-377a-4b9e-9a28-e01c21e2ad1f" containerName="barbican-keystone-listener" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.766246 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="28437fcd-377a-4b9e-9a28-e01c21e2ad1f" containerName="barbican-keystone-listener" Feb 20 10:18:16 crc kubenswrapper[4962]: E0220 10:18:16.766255 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e4f70a2-b8ae-48cc-a098-5642fad8b040" containerName="dnsmasq-dns" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.766261 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e4f70a2-b8ae-48cc-a098-5642fad8b040" containerName="dnsmasq-dns" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.766430 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca793428-98ed-4f82-aa57-31d6671d546c" containerName="nova-metadata-metadata" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.766444 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="88c21489-524e-4ee7-a340-5be2573af161" containerName="openstack-network-exporter" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.766454 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="8dd889b7-1b72-4e57-ad0f-85facbad8da4" containerName="galera" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.766464 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca793428-98ed-4f82-aa57-31d6671d546c" containerName="nova-metadata-log" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.766476 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="719faf26-7700-4eff-9dca-0a4ec3c51344" containerName="openstack-network-exporter" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.766488 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="801fa82d-0f57-4af2-9eec-b6cddac658ab" containerName="openstack-network-exporter" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.766496 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f35bada-015d-4051-9976-d5dfe3a93216" containerName="barbican-worker" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.766505 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba9a9d46-9ba9-428c-8864-a8db8bca2b57" containerName="glance-log" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.766515 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00" containerName="nova-cell1-novncproxy-novncproxy" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.766524 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="801fa82d-0f57-4af2-9eec-b6cddac658ab" containerName="ovsdbserver-nb" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.766534 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="559addbd-1bc6-4146-9a27-ce3e1d3d08fd" containerName="proxy-server" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.766545 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e4f70a2-b8ae-48cc-a098-5642fad8b040" containerName="dnsmasq-dns" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.766555 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="89dbdc4c-bf31-402e-b5bf-e8bbb8c16172" 
containerName="cinder-api-log" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.766564 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="559addbd-1bc6-4146-9a27-ce3e1d3d08fd" containerName="proxy-httpd" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.766573 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="383d4f1e-72b3-48ce-9427-0361c19e41fc" containerName="ovn-controller" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.766583 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a879cb3-19b4-4767-8640-993cc47dc7ed" containerName="placement-log" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.766607 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="28437fcd-377a-4b9e-9a28-e01c21e2ad1f" containerName="barbican-keystone-listener" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.766615 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f35bada-015d-4051-9976-d5dfe3a93216" containerName="barbican-worker-log" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.766625 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="c90d5126-d89a-42e6-9b7d-bfc53475bc56" containerName="cinder-scheduler" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.766635 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="28437fcd-377a-4b9e-9a28-e01c21e2ad1f" containerName="barbican-keystone-listener-log" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.766641 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="719faf26-7700-4eff-9dca-0a4ec3c51344" containerName="ovsdbserver-sb" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.766652 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a879cb3-19b4-4767-8640-993cc47dc7ed" containerName="placement-api" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.766659 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="89dbdc4c-bf31-402e-b5bf-e8bbb8c16172" containerName="cinder-api" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.766666 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="c90d5126-d89a-42e6-9b7d-bfc53475bc56" containerName="probe" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.766677 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba9a9d46-9ba9-428c-8864-a8db8bca2b57" containerName="glance-httpd" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.767495 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba9a9d46-9ba9-428c-8864-a8db8bca2b57-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ba9a9d46-9ba9-428c-8864-a8db8bca2b57" (UID: "ba9a9d46-9ba9-428c-8864-a8db8bca2b57"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.774815 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-125a-account-create-update-rtszm" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.790612 4962 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba9a9d46-9ba9-428c-8864-a8db8bca2b57-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.795835 4962 generic.go:334] "Generic (PLEG): container finished" podID="cffca43e-3e19-4430-8fe2-ca7cfe6229b0" containerID="490c8746de0bc6e3f4ef0520b2658d4424532e972e69bd55a421dfcd9ed32cf4" exitCode=2 Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.796204 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"cffca43e-3e19-4430-8fe2-ca7cfe6229b0","Type":"ContainerDied","Data":"490c8746de0bc6e3f4ef0520b2658d4424532e972e69bd55a421dfcd9ed32cf4"} Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.796421 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.797487 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.818210 4962 scope.go:117] "RemoveContainer" containerID="eba325f8f1300c477bc396da76d9efd0fdd96072accf19c1140570ee31c7b548" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.819255 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba9a9d46-9ba9-428c-8864-a8db8bca2b57-config-data" (OuterVolumeSpecName: "config-data") pod "ba9a9d46-9ba9-428c-8864-a8db8bca2b57" (UID: "ba9a9d46-9ba9-428c-8864-a8db8bca2b57"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.819307 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-m26vd"] Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.829729 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-125a-account-create-update-rtszm"] Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.839176 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-r4hdf"] Feb 20 10:18:16 crc kubenswrapper[4962]: E0220 10:18:16.851798 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eba325f8f1300c477bc396da76d9efd0fdd96072accf19c1140570ee31c7b548\": container with ID starting with eba325f8f1300c477bc396da76d9efd0fdd96072accf19c1140570ee31c7b548 not found: ID does not exist" containerID="eba325f8f1300c477bc396da76d9efd0fdd96072accf19c1140570ee31c7b548" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.851842 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eba325f8f1300c477bc396da76d9efd0fdd96072accf19c1140570ee31c7b548"} err="failed to get container status \"eba325f8f1300c477bc396da76d9efd0fdd96072accf19c1140570ee31c7b548\": rpc error: code = NotFound desc = could not find container \"eba325f8f1300c477bc396da76d9efd0fdd96072accf19c1140570ee31c7b548\": container with ID starting with eba325f8f1300c477bc396da76d9efd0fdd96072accf19c1140570ee31c7b548 not found: ID does not exist" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.851871 4962 scope.go:117] "RemoveContainer" containerID="4f72b0b6a24968d9eab4cdfe73c03770a2ac626aa75c9f5e5a526fe72f5eea53" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.851963 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-m26vd"] Feb 20 10:18:16 crc kubenswrapper[4962]: E0220 10:18:16.853793 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f72b0b6a24968d9eab4cdfe73c03770a2ac626aa75c9f5e5a526fe72f5eea53\": container with ID starting with 4f72b0b6a24968d9eab4cdfe73c03770a2ac626aa75c9f5e5a526fe72f5eea53 not found: ID does not exist" containerID="4f72b0b6a24968d9eab4cdfe73c03770a2ac626aa75c9f5e5a526fe72f5eea53" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.853858 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f72b0b6a24968d9eab4cdfe73c03770a2ac626aa75c9f5e5a526fe72f5eea53"} err="failed to get container status \"4f72b0b6a24968d9eab4cdfe73c03770a2ac626aa75c9f5e5a526fe72f5eea53\": rpc error: code = NotFound desc = could not find container \"4f72b0b6a24968d9eab4cdfe73c03770a2ac626aa75c9f5e5a526fe72f5eea53\": container with ID starting with 4f72b0b6a24968d9eab4cdfe73c03770a2ac626aa75c9f5e5a526fe72f5eea53 not found: ID does not exist" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.853894 4962 scope.go:117] "RemoveContainer" containerID="a0c7e79c3d9e295ee82e5ea9e8238010da77018553646accab9b41ab9dfe22b6" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.865392 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-r4hdf"] Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.887276 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-5b685f5b9-4db6w"] Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 
10:18:16.890838 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.892074 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-84464996cb-fhnvz" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.892674 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/241dc417-3176-4051-ad4e-d98f4f66ddc2-internal-tls-certs\") pod \"241dc417-3176-4051-ad4e-d98f4f66ddc2\" (UID: \"241dc417-3176-4051-ad4e-d98f4f66ddc2\") " Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.892748 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/241dc417-3176-4051-ad4e-d98f4f66ddc2-logs\") pod \"241dc417-3176-4051-ad4e-d98f4f66ddc2\" (UID: \"241dc417-3176-4051-ad4e-d98f4f66ddc2\") " Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.892849 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/241dc417-3176-4051-ad4e-d98f4f66ddc2-combined-ca-bundle\") pod \"241dc417-3176-4051-ad4e-d98f4f66ddc2\" (UID: \"241dc417-3176-4051-ad4e-d98f4f66ddc2\") " Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.892917 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/241dc417-3176-4051-ad4e-d98f4f66ddc2-config-data\") pod \"241dc417-3176-4051-ad4e-d98f4f66ddc2\" (UID: \"241dc417-3176-4051-ad4e-d98f4f66ddc2\") " Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.892957 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qmjz\" (UniqueName: \"kubernetes.io/projected/241dc417-3176-4051-ad4e-d98f4f66ddc2-kube-api-access-2qmjz\") pod \"241dc417-3176-4051-ad4e-d98f4f66ddc2\" (UID: \"241dc417-3176-4051-ad4e-d98f4f66ddc2\") " Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.893084 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/241dc417-3176-4051-ad4e-d98f4f66ddc2-public-tls-certs\") pod \"241dc417-3176-4051-ad4e-d98f4f66ddc2\" (UID: \"241dc417-3176-4051-ad4e-d98f4f66ddc2\") " Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.893415 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0991ff2f-16e5-4891-a38d-8cb9e4b016ec-operator-scripts\") pod \"keystone-125a-account-create-update-rtszm\" (UID: \"0991ff2f-16e5-4891-a38d-8cb9e4b016ec\") " pod="openstack/keystone-125a-account-create-update-rtszm" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.893565 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgjrp\" (UniqueName: \"kubernetes.io/projected/0991ff2f-16e5-4891-a38d-8cb9e4b016ec-kube-api-access-zgjrp\") pod \"keystone-125a-account-create-update-rtszm\" (UID: \"0991ff2f-16e5-4891-a38d-8cb9e4b016ec\") " pod="openstack/keystone-125a-account-create-update-rtszm" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.896113 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"ba9a9d46-9ba9-428c-8864-a8db8bca2b57","Type":"ContainerDied","Data":"a7c3f6bf061e2f58df1199abfaabc0fa7edc0079e61af3f51614ef7b77cc0b31"} Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.896170 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-proxy-5b685f5b9-4db6w"] Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.903911 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/241dc417-3176-4051-ad4e-d98f4f66ddc2-logs" (OuterVolumeSpecName: "logs") pod "241dc417-3176-4051-ad4e-d98f4f66ddc2" (UID: "241dc417-3176-4051-ad4e-d98f4f66ddc2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.915573 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba9a9d46-9ba9-428c-8864-a8db8bca2b57-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.915631 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/241dc417-3176-4051-ad4e-d98f4f66ddc2-logs\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.916883 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/241dc417-3176-4051-ad4e-d98f4f66ddc2-kube-api-access-2qmjz" (OuterVolumeSpecName: "kube-api-access-2qmjz") pod "241dc417-3176-4051-ad4e-d98f4f66ddc2" (UID: "241dc417-3176-4051-ad4e-d98f4f66ddc2"). InnerVolumeSpecName "kube-api-access-2qmjz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.934216 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.935183 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.945647 4962 generic.go:334] "Generic (PLEG): container finished" podID="4f4a409a-4230-42ca-bfcc-f014064cbc6c" containerID="f92044b60ad417db828d85a5c41a02658d594ecad6f7c6c0f3f8b1bce358c93f" exitCode=0 Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.946080 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4f4a409a-4230-42ca-bfcc-f014064cbc6c","Type":"ContainerDied","Data":"f92044b60ad417db828d85a5c41a02658d594ecad6f7c6c0f3f8b1bce358c93f"} Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.946137 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4f4a409a-4230-42ca-bfcc-f014064cbc6c","Type":"ContainerDied","Data":"256cfc6edb7fdfbe31dd4d739c6bcf21323de33dda20f71407beaea0eb6fd7bc"} Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.952041 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-6b4c54c5d9-pqd8r"] Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.952299 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/keystone-6b4c54c5d9-pqd8r" podUID="d203fc44-5252-4dd2-98ae-66f9c139b5f5" containerName="keystone-api" containerID="cri-o://57e3b54a0aaa3e8886ac13c31c98adf640a3207944f14271a7e3dbd0e513db14" gracePeriod=30 Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.967045 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-svsfg"] Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.975355 4962 generic.go:334] "Generic (PLEG): container finished" podID="ce62af15-166f-4f74-a244-2de5147a4b2f" containerID="2c8825e8a9845de45acba0c5ed58a1b7ada6575701e9497362444d09cc2e5592" exitCode=0 Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.975892 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"ce62af15-166f-4f74-a244-2de5147a4b2f","Type":"ContainerDied","Data":"2c8825e8a9845de45acba0c5ed58a1b7ada6575701e9497362444d09cc2e5592"} Feb 20 10:18:16 crc kubenswrapper[4962]: I0220 10:18:16.994293 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/241dc417-3176-4051-ad4e-d98f4f66ddc2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "241dc417-3176-4051-ad4e-d98f4f66ddc2" (UID: "241dc417-3176-4051-ad4e-d98f4f66ddc2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.006961 4962 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openstack/root-account-create-update-6f6vb" secret="" err="secret \"galera-openstack-dockercfg-898r2\" not found" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.007017 4962 scope.go:117] "RemoveContainer" containerID="156621efed4a83b0a1598b9e193e1ba9bb7c448ebc2a41320d1b53c4756507b6" Feb 20 10:18:17 crc kubenswrapper[4962]: E0220 10:18:17.007320 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CrashLoopBackOff: \"back-off 10s restarting failed container=mariadb-account-create-update pod=root-account-create-update-6f6vb_openstack(812fea74-e4e5-4550-8a20-8fe04752a016)\"" pod="openstack/root-account-create-update-6f6vb" podUID="812fea74-e4e5-4550-8a20-8fe04752a016" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.008711 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-svsfg"] Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.012528 4962 scope.go:117] "RemoveContainer" containerID="f6e6a97dcf3e2888aaf774e41bb7caae5d9537602046e6592fd534041d6392a2" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.018787 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f4a409a-4230-42ca-bfcc-f014064cbc6c-internal-tls-certs\") pod \"4f4a409a-4230-42ca-bfcc-f014064cbc6c\" (UID: \"4f4a409a-4230-42ca-bfcc-f014064cbc6c\") " Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.018901 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f4a409a-4230-42ca-bfcc-f014064cbc6c-combined-ca-bundle\") pod \"4f4a409a-4230-42ca-bfcc-f014064cbc6c\" (UID: \"4f4a409a-4230-42ca-bfcc-f014064cbc6c\") " Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.018958 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10c1a487-1a74-4994-9b39-f05cbe0fa5c7-logs\") pod \"10c1a487-1a74-4994-9b39-f05cbe0fa5c7\" (UID: \"10c1a487-1a74-4994-9b39-f05cbe0fa5c7\") " Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.018982 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"4f4a409a-4230-42ca-bfcc-f014064cbc6c\" (UID: \"4f4a409a-4230-42ca-bfcc-f014064cbc6c\") " Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.019053 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10c1a487-1a74-4994-9b39-f05cbe0fa5c7-config-data\") pod \"10c1a487-1a74-4994-9b39-f05cbe0fa5c7\" (UID: \"10c1a487-1a74-4994-9b39-f05cbe0fa5c7\") " Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.019097 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4f4a409a-4230-42ca-bfcc-f014064cbc6c-httpd-run\") pod \"4f4a409a-4230-42ca-bfcc-f014064cbc6c\" (UID: \"4f4a409a-4230-42ca-bfcc-f014064cbc6c\") " Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.019232 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/10c1a487-1a74-4994-9b39-f05cbe0fa5c7-internal-tls-certs\") pod \"10c1a487-1a74-4994-9b39-f05cbe0fa5c7\" (UID: \"10c1a487-1a74-4994-9b39-f05cbe0fa5c7\") " Feb 20 10:18:17 crc 
kubenswrapper[4962]: I0220 10:18:17.019290 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/10c1a487-1a74-4994-9b39-f05cbe0fa5c7-public-tls-certs\") pod \"10c1a487-1a74-4994-9b39-f05cbe0fa5c7\" (UID: \"10c1a487-1a74-4994-9b39-f05cbe0fa5c7\") " Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.019334 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zw2np\" (UniqueName: \"kubernetes.io/projected/10c1a487-1a74-4994-9b39-f05cbe0fa5c7-kube-api-access-zw2np\") pod \"10c1a487-1a74-4994-9b39-f05cbe0fa5c7\" (UID: \"10c1a487-1a74-4994-9b39-f05cbe0fa5c7\") " Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.019360 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mrb2g\" (UniqueName: \"kubernetes.io/projected/4f4a409a-4230-42ca-bfcc-f014064cbc6c-kube-api-access-mrb2g\") pod \"4f4a409a-4230-42ca-bfcc-f014064cbc6c\" (UID: \"4f4a409a-4230-42ca-bfcc-f014064cbc6c\") " Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.019392 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f4a409a-4230-42ca-bfcc-f014064cbc6c-scripts\") pod \"4f4a409a-4230-42ca-bfcc-f014064cbc6c\" (UID: \"4f4a409a-4230-42ca-bfcc-f014064cbc6c\") " Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.019416 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/10c1a487-1a74-4994-9b39-f05cbe0fa5c7-config-data-custom\") pod \"10c1a487-1a74-4994-9b39-f05cbe0fa5c7\" (UID: \"10c1a487-1a74-4994-9b39-f05cbe0fa5c7\") " Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.019446 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f4a409a-4230-42ca-bfcc-f014064cbc6c-config-data\") pod \"4f4a409a-4230-42ca-bfcc-f014064cbc6c\" (UID: \"4f4a409a-4230-42ca-bfcc-f014064cbc6c\") " Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.019478 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10c1a487-1a74-4994-9b39-f05cbe0fa5c7-combined-ca-bundle\") pod \"10c1a487-1a74-4994-9b39-f05cbe0fa5c7\" (UID: \"10c1a487-1a74-4994-9b39-f05cbe0fa5c7\") " Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.019500 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f4a409a-4230-42ca-bfcc-f014064cbc6c-logs\") pod \"4f4a409a-4230-42ca-bfcc-f014064cbc6c\" (UID: \"4f4a409a-4230-42ca-bfcc-f014064cbc6c\") " Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.019799 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgjrp\" (UniqueName: \"kubernetes.io/projected/0991ff2f-16e5-4891-a38d-8cb9e4b016ec-kube-api-access-zgjrp\") pod \"keystone-125a-account-create-update-rtszm\" (UID: \"0991ff2f-16e5-4891-a38d-8cb9e4b016ec\") " pod="openstack/keystone-125a-account-create-update-rtszm" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.019858 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0991ff2f-16e5-4891-a38d-8cb9e4b016ec-operator-scripts\") pod \"keystone-125a-account-create-update-rtszm\" (UID: 
\"0991ff2f-16e5-4891-a38d-8cb9e4b016ec\") " pod="openstack/keystone-125a-account-create-update-rtszm" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.019963 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/241dc417-3176-4051-ad4e-d98f4f66ddc2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.019983 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2qmjz\" (UniqueName: \"kubernetes.io/projected/241dc417-3176-4051-ad4e-d98f4f66ddc2-kube-api-access-2qmjz\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:17 crc kubenswrapper[4962]: E0220 10:18:17.020052 4962 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Feb 20 10:18:17 crc kubenswrapper[4962]: E0220 10:18:17.020110 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0991ff2f-16e5-4891-a38d-8cb9e4b016ec-operator-scripts podName:0991ff2f-16e5-4891-a38d-8cb9e4b016ec nodeName:}" failed. No retries permitted until 2026-02-20 10:18:17.52009344 +0000 UTC m=+1389.102565276 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/0991ff2f-16e5-4891-a38d-8cb9e4b016ec-operator-scripts") pod "keystone-125a-account-create-update-rtszm" (UID: "0991ff2f-16e5-4891-a38d-8cb9e4b016ec") : configmap "openstack-scripts" not found Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.023136 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/241dc417-3176-4051-ad4e-d98f4f66ddc2-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "241dc417-3176-4051-ad4e-d98f4f66ddc2" (UID: "241dc417-3176-4051-ad4e-d98f4f66ddc2"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.023493 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ca793428-98ed-4f82-aa57-31d6671d546c","Type":"ContainerDied","Data":"814b8f37484da31723bf086a4604103ef52cd7ea4f8156d43acda95faab765f4"} Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.023628 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.028832 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10c1a487-1a74-4994-9b39-f05cbe0fa5c7-kube-api-access-zw2np" (OuterVolumeSpecName: "kube-api-access-zw2np") pod "10c1a487-1a74-4994-9b39-f05cbe0fa5c7" (UID: "10c1a487-1a74-4994-9b39-f05cbe0fa5c7"). InnerVolumeSpecName "kube-api-access-zw2np". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.029308 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f4a409a-4230-42ca-bfcc-f014064cbc6c-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "4f4a409a-4230-42ca-bfcc-f014064cbc6c" (UID: "4f4a409a-4230-42ca-bfcc-f014064cbc6c"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.032851 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/241dc417-3176-4051-ad4e-d98f4f66ddc2-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "241dc417-3176-4051-ad4e-d98f4f66ddc2" (UID: "241dc417-3176-4051-ad4e-d98f4f66ddc2"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.033735 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/241dc417-3176-4051-ad4e-d98f4f66ddc2-config-data" (OuterVolumeSpecName: "config-data") pod "241dc417-3176-4051-ad4e-d98f4f66ddc2" (UID: "241dc417-3176-4051-ad4e-d98f4f66ddc2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.041194 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f4a409a-4230-42ca-bfcc-f014064cbc6c-logs" (OuterVolumeSpecName: "logs") pod "4f4a409a-4230-42ca-bfcc-f014064cbc6c" (UID: "4f4a409a-4230-42ca-bfcc-f014064cbc6c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.041299 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-125a-account-create-update-rtszm"] Feb 20 10:18:17 crc kubenswrapper[4962]: E0220 10:18:17.042122 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-zgjrp operator-scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/keystone-125a-account-create-update-rtszm" podUID="0991ff2f-16e5-4891-a38d-8cb9e4b016ec" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.071260 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f4a409a-4230-42ca-bfcc-f014064cbc6c-kube-api-access-mrb2g" (OuterVolumeSpecName: "kube-api-access-mrb2g") pod "4f4a409a-4230-42ca-bfcc-f014064cbc6c" (UID: "4f4a409a-4230-42ca-bfcc-f014064cbc6c"). InnerVolumeSpecName "kube-api-access-mrb2g". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.071366 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10c1a487-1a74-4994-9b39-f05cbe0fa5c7-logs" (OuterVolumeSpecName: "logs") pod "10c1a487-1a74-4994-9b39-f05cbe0fa5c7" (UID: "10c1a487-1a74-4994-9b39-f05cbe0fa5c7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.074169 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "4f4a409a-4230-42ca-bfcc-f014064cbc6c" (UID: "4f4a409a-4230-42ca-bfcc-f014064cbc6c"). InnerVolumeSpecName "local-storage01-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 20 10:18:17 crc kubenswrapper[4962]: E0220 10:18:17.084290 4962 projected.go:194] Error preparing data for projected volume kube-api-access-zgjrp for pod openstack/keystone-125a-account-create-update-rtszm: failed to fetch token: serviceaccounts "galera-openstack" not found Feb 20 10:18:17 crc kubenswrapper[4962]: E0220 10:18:17.084387 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0991ff2f-16e5-4891-a38d-8cb9e4b016ec-kube-api-access-zgjrp podName:0991ff2f-16e5-4891-a38d-8cb9e4b016ec nodeName:}" failed. No retries permitted until 2026-02-20 10:18:17.584360587 +0000 UTC m=+1389.166832433 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-zgjrp" (UniqueName: "kubernetes.io/projected/0991ff2f-16e5-4891-a38d-8cb9e4b016ec-kube-api-access-zgjrp") pod "keystone-125a-account-create-update-rtszm" (UID: "0991ff2f-16e5-4891-a38d-8cb9e4b016ec") : failed to fetch token: serviceaccounts "galera-openstack" not found Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.085045 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f4a409a-4230-42ca-bfcc-f014064cbc6c-scripts" (OuterVolumeSpecName: "scripts") pod "4f4a409a-4230-42ca-bfcc-f014064cbc6c" (UID: "4f4a409a-4230-42ca-bfcc-f014064cbc6c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.094371 4962 generic.go:334] "Generic (PLEG): container finished" podID="10c1a487-1a74-4994-9b39-f05cbe0fa5c7" containerID="e01ecca1dc871afc69109d7af822c9e7b8f02440c8c8b1e92b5ae942c411e515" exitCode=0 Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.094451 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-84464996cb-fhnvz" event={"ID":"10c1a487-1a74-4994-9b39-f05cbe0fa5c7","Type":"ContainerDied","Data":"e01ecca1dc871afc69109d7af822c9e7b8f02440c8c8b1e92b5ae942c411e515"} Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.094486 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-84464996cb-fhnvz" event={"ID":"10c1a487-1a74-4994-9b39-f05cbe0fa5c7","Type":"ContainerDied","Data":"3a9d85e1ad92d2243530d4e2efdb0f3c712197cf6ab61af23aeb5feca6269a13"} Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.094633 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-84464996cb-fhnvz" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.104098 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10c1a487-1a74-4994-9b39-f05cbe0fa5c7-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "10c1a487-1a74-4994-9b39-f05cbe0fa5c7" (UID: "10c1a487-1a74-4994-9b39-f05cbe0fa5c7"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.127670 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f4a409a-4230-42ca-bfcc-f014064cbc6c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4f4a409a-4230-42ca-bfcc-f014064cbc6c" (UID: "4f4a409a-4230-42ca-bfcc-f014064cbc6c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.128615 4962 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4f4a409a-4230-42ca-bfcc-f014064cbc6c-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.128645 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/241dc417-3176-4051-ad4e-d98f4f66ddc2-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.128656 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zw2np\" (UniqueName: \"kubernetes.io/projected/10c1a487-1a74-4994-9b39-f05cbe0fa5c7-kube-api-access-zw2np\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.128666 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mrb2g\" (UniqueName: \"kubernetes.io/projected/4f4a409a-4230-42ca-bfcc-f014064cbc6c-kube-api-access-mrb2g\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.128680 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f4a409a-4230-42ca-bfcc-f014064cbc6c-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.128688 4962 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/10c1a487-1a74-4994-9b39-f05cbe0fa5c7-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.128698 4962 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/241dc417-3176-4051-ad4e-d98f4f66ddc2-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.128707 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f4a409a-4230-42ca-bfcc-f014064cbc6c-logs\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.128719 4962 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/241dc417-3176-4051-ad4e-d98f4f66ddc2-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.128730 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f4a409a-4230-42ca-bfcc-f014064cbc6c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.128738 4962 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10c1a487-1a74-4994-9b39-f05cbe0fa5c7-logs\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.128770 4962 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Feb 20 10:18:17 crc kubenswrapper[4962]: E0220 10:18:17.129464 4962 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Feb 20 10:18:17 crc kubenswrapper[4962]: E0220 10:18:17.129535 4962 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/812fea74-e4e5-4550-8a20-8fe04752a016-operator-scripts podName:812fea74-e4e5-4550-8a20-8fe04752a016 nodeName:}" failed. No retries permitted until 2026-02-20 10:18:17.629509632 +0000 UTC m=+1389.211981478 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/812fea74-e4e5-4550-8a20-8fe04752a016-operator-scripts") pod "root-account-create-update-6f6vb" (UID: "812fea74-e4e5-4550-8a20-8fe04752a016") : configmap "openstack-scripts" not found Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.185758 4962 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.200239 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28437fcd-377a-4b9e-9a28-e01c21e2ad1f" path="/var/lib/kubelet/pods/28437fcd-377a-4b9e-9a28-e01c21e2ad1f/volumes" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.207988 4962 scope.go:117] "RemoveContainer" containerID="42f878706ae1a7e2114a67d56b43328ffc07645b6f77f8f9d20b6c4a2aec6632" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.208845 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10c1a487-1a74-4994-9b39-f05cbe0fa5c7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "10c1a487-1a74-4994-9b39-f05cbe0fa5c7" (UID: "10c1a487-1a74-4994-9b39-f05cbe0fa5c7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.230326 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d2e7f05-f1f0-4619-ae07-0a7b93ad6408" path="/var/lib/kubelet/pods/2d2e7f05-f1f0-4619-ae07-0a7b93ad6408/volumes" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.233347 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37eccece-549c-4b2f-b066-481b216d7ece" path="/var/lib/kubelet/pods/37eccece-549c-4b2f-b066-481b216d7ece/volumes" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.235097 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a879cb3-19b4-4767-8640-993cc47dc7ed" path="/var/lib/kubelet/pods/4a879cb3-19b4-4767-8640-993cc47dc7ed/volumes" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.237452 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="559addbd-1bc6-4146-9a27-ce3e1d3d08fd" path="/var/lib/kubelet/pods/559addbd-1bc6-4146-9a27-ce3e1d3d08fd/volumes" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.240344 4962 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.241752 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10c1a487-1a74-4994-9b39-f05cbe0fa5c7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.241371 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f35bada-015d-4051-9976-d5dfe3a93216" path="/var/lib/kubelet/pods/7f35bada-015d-4051-9976-d5dfe3a93216/volumes" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.243158 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="8dd889b7-1b72-4e57-ad0f-85facbad8da4" path="/var/lib/kubelet/pods/8dd889b7-1b72-4e57-ad0f-85facbad8da4/volumes" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.249087 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c97128d-8360-482e-b05b-6025d046c122" path="/var/lib/kubelet/pods/9c97128d-8360-482e-b05b-6025d046c122/volumes" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.257237 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3d903f3-8f86-49e2-848b-4a59a9068b75" path="/var/lib/kubelet/pods/a3d903f3-8f86-49e2-848b-4a59a9068b75/volumes" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.260821 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c90d5126-d89a-42e6-9b7d-bfc53475bc56" path="/var/lib/kubelet/pods/c90d5126-d89a-42e6-9b7d-bfc53475bc56/volumes" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.262099 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10c1a487-1a74-4994-9b39-f05cbe0fa5c7-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "10c1a487-1a74-4994-9b39-f05cbe0fa5c7" (UID: "10c1a487-1a74-4994-9b39-f05cbe0fa5c7"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.269012 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-galera-0" podUID="6e766bfd-869d-43ca-bf11-cf4ec9fa253a" containerName="galera" containerID="cri-o://0ee4c6895eaf367e01ee1ab962d5fa0868b6b165760c399d39cc5c1615f1960b" gracePeriod=30 Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.277258 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.277617 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cca18a27-31bc-440b-a4a9-517b3323bb91" path="/var/lib/kubelet/pods/cca18a27-31bc-440b-a4a9-517b3323bb91/volumes" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.282065 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00" path="/var/lib/kubelet/pods/fdf8f82d-76e8-4d49-ab1f-bc75cec4dc00/volumes" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.287241 4962 scope.go:117] "RemoveContainer" containerID="68a406d2a6eadc4116c120af687c887ef22a20b066ec54d2d6991bd97aaef0e9" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.287466 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f4a409a-4230-42ca-bfcc-f014064cbc6c-config-data" (OuterVolumeSpecName: "config-data") pod "4f4a409a-4230-42ca-bfcc-f014064cbc6c" (UID: "4f4a409a-4230-42ca-bfcc-f014064cbc6c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.291886 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.291933 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.291953 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-6f6vb"] Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.302074 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f4a409a-4230-42ca-bfcc-f014064cbc6c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "4f4a409a-4230-42ca-bfcc-f014064cbc6c" (UID: "4f4a409a-4230-42ca-bfcc-f014064cbc6c"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.304975 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10c1a487-1a74-4994-9b39-f05cbe0fa5c7-config-data" (OuterVolumeSpecName: "config-data") pod "10c1a487-1a74-4994-9b39-f05cbe0fa5c7" (UID: "10c1a487-1a74-4994-9b39-f05cbe0fa5c7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.323822 4962 scope.go:117] "RemoveContainer" containerID="7c19f6ab819e8b088592bd7831817812900bca1c0cc3649a9662bfcc1aa1ae48" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.341199 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.343562 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce62af15-166f-4f74-a244-2de5147a4b2f-combined-ca-bundle\") pod \"ce62af15-166f-4f74-a244-2de5147a4b2f\" (UID: \"ce62af15-166f-4f74-a244-2de5147a4b2f\") " Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.343649 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce62af15-166f-4f74-a244-2de5147a4b2f-config-data\") pod \"ce62af15-166f-4f74-a244-2de5147a4b2f\" (UID: \"ce62af15-166f-4f74-a244-2de5147a4b2f\") " Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.343708 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-px4mp\" (UniqueName: \"kubernetes.io/projected/ce62af15-166f-4f74-a244-2de5147a4b2f-kube-api-access-px4mp\") pod \"ce62af15-166f-4f74-a244-2de5147a4b2f\" (UID: \"ce62af15-166f-4f74-a244-2de5147a4b2f\") " Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.344101 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10c1a487-1a74-4994-9b39-f05cbe0fa5c7-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.344122 4962 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/10c1a487-1a74-4994-9b39-f05cbe0fa5c7-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.344134 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/4f4a409a-4230-42ca-bfcc-f014064cbc6c-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.344144 4962 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f4a409a-4230-42ca-bfcc-f014064cbc6c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.347789 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce62af15-166f-4f74-a244-2de5147a4b2f-kube-api-access-px4mp" (OuterVolumeSpecName: "kube-api-access-px4mp") pod "ce62af15-166f-4f74-a244-2de5147a4b2f" (UID: "ce62af15-166f-4f74-a244-2de5147a4b2f"). InnerVolumeSpecName "kube-api-access-px4mp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.358415 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10c1a487-1a74-4994-9b39-f05cbe0fa5c7-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "10c1a487-1a74-4994-9b39-f05cbe0fa5c7" (UID: "10c1a487-1a74-4994-9b39-f05cbe0fa5c7"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.379221 4962 scope.go:117] "RemoveContainer" containerID="aa52f40e409ac825205d183f70f7cf56df81e106f777a2fe46a3166fb938361b" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.379385 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.386112 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.391826 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce62af15-166f-4f74-a244-2de5147a4b2f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ce62af15-166f-4f74-a244-2de5147a4b2f" (UID: "ce62af15-166f-4f74-a244-2de5147a4b2f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.397077 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce62af15-166f-4f74-a244-2de5147a4b2f-config-data" (OuterVolumeSpecName: "config-data") pod "ce62af15-166f-4f74-a244-2de5147a4b2f" (UID: "ce62af15-166f-4f74-a244-2de5147a4b2f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.401874 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.409240 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 10:18:17 crc kubenswrapper[4962]: E0220 10:18:17.442169 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="24e611c94f3db833be2f4d2218a68d358affbfa3d1fc3a15c508caceb7974666" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 20 10:18:17 crc kubenswrapper[4962]: E0220 10:18:17.445901 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="24e611c94f3db833be2f4d2218a68d358affbfa3d1fc3a15c508caceb7974666" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 20 10:18:17 crc kubenswrapper[4962]: E0220 10:18:17.452675 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="24e611c94f3db833be2f4d2218a68d358affbfa3d1fc3a15c508caceb7974666" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 20 10:18:17 crc kubenswrapper[4962]: E0220 10:18:17.452753 4962 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="dcd02115-2eb9-4090-8225-108c3a8cad20" containerName="nova-scheduler-scheduler" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.453353 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7dpm\" (UniqueName: \"kubernetes.io/projected/cffca43e-3e19-4430-8fe2-ca7cfe6229b0-kube-api-access-s7dpm\") pod \"cffca43e-3e19-4430-8fe2-ca7cfe6229b0\" (UID: \"cffca43e-3e19-4430-8fe2-ca7cfe6229b0\") " Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.453416 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cffca43e-3e19-4430-8fe2-ca7cfe6229b0-combined-ca-bundle\") pod \"cffca43e-3e19-4430-8fe2-ca7cfe6229b0\" (UID: \"cffca43e-3e19-4430-8fe2-ca7cfe6229b0\") " Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.454270 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/cffca43e-3e19-4430-8fe2-ca7cfe6229b0-kube-state-metrics-tls-config\") pod \"cffca43e-3e19-4430-8fe2-ca7cfe6229b0\" (UID: \"cffca43e-3e19-4430-8fe2-ca7cfe6229b0\") " Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.454527 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/cffca43e-3e19-4430-8fe2-ca7cfe6229b0-kube-state-metrics-tls-certs\") pod \"cffca43e-3e19-4430-8fe2-ca7cfe6229b0\" (UID: \"cffca43e-3e19-4430-8fe2-ca7cfe6229b0\") " Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.455050 4962 reconciler_common.go:293] "Volume detached for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/10c1a487-1a74-4994-9b39-f05cbe0fa5c7-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.455084 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce62af15-166f-4f74-a244-2de5147a4b2f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.455095 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce62af15-166f-4f74-a244-2de5147a4b2f-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.455105 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-px4mp\" (UniqueName: \"kubernetes.io/projected/ce62af15-166f-4f74-a244-2de5147a4b2f-kube-api-access-px4mp\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.458405 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cffca43e-3e19-4430-8fe2-ca7cfe6229b0-kube-api-access-s7dpm" (OuterVolumeSpecName: "kube-api-access-s7dpm") pod "cffca43e-3e19-4430-8fe2-ca7cfe6229b0" (UID: "cffca43e-3e19-4430-8fe2-ca7cfe6229b0"). InnerVolumeSpecName "kube-api-access-s7dpm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.481056 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cffca43e-3e19-4430-8fe2-ca7cfe6229b0-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "cffca43e-3e19-4430-8fe2-ca7cfe6229b0" (UID: "cffca43e-3e19-4430-8fe2-ca7cfe6229b0"). InnerVolumeSpecName "kube-state-metrics-tls-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.488451 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cffca43e-3e19-4430-8fe2-ca7cfe6229b0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cffca43e-3e19-4430-8fe2-ca7cfe6229b0" (UID: "cffca43e-3e19-4430-8fe2-ca7cfe6229b0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.513756 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cffca43e-3e19-4430-8fe2-ca7cfe6229b0-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "cffca43e-3e19-4430-8fe2-ca7cfe6229b0" (UID: "cffca43e-3e19-4430-8fe2-ca7cfe6229b0"). InnerVolumeSpecName "kube-state-metrics-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.556663 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0991ff2f-16e5-4891-a38d-8cb9e4b016ec-operator-scripts\") pod \"keystone-125a-account-create-update-rtszm\" (UID: \"0991ff2f-16e5-4891-a38d-8cb9e4b016ec\") " pod="openstack/keystone-125a-account-create-update-rtszm" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.556847 4962 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/cffca43e-3e19-4430-8fe2-ca7cfe6229b0-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.556860 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s7dpm\" (UniqueName: \"kubernetes.io/projected/cffca43e-3e19-4430-8fe2-ca7cfe6229b0-kube-api-access-s7dpm\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.556872 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cffca43e-3e19-4430-8fe2-ca7cfe6229b0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.556881 4962 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/cffca43e-3e19-4430-8fe2-ca7cfe6229b0-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:17 crc kubenswrapper[4962]: E0220 10:18:17.556963 4962 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Feb 20 10:18:17 crc kubenswrapper[4962]: E0220 10:18:17.557024 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/56a77dd3-ef10-46a6-a00d-ab38af0d4338-config-data podName:56a77dd3-ef10-46a6-a00d-ab38af0d4338 nodeName:}" failed. No retries permitted until 2026-02-20 10:18:25.557004006 +0000 UTC m=+1397.139475852 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/56a77dd3-ef10-46a6-a00d-ab38af0d4338-config-data") pod "rabbitmq-cell1-server-0" (UID: "56a77dd3-ef10-46a6-a00d-ab38af0d4338") : configmap "rabbitmq-cell1-config-data" not found Feb 20 10:18:17 crc kubenswrapper[4962]: E0220 10:18:17.557372 4962 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Feb 20 10:18:17 crc kubenswrapper[4962]: E0220 10:18:17.557394 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0991ff2f-16e5-4891-a38d-8cb9e4b016ec-operator-scripts podName:0991ff2f-16e5-4891-a38d-8cb9e4b016ec nodeName:}" failed. No retries permitted until 2026-02-20 10:18:18.557387728 +0000 UTC m=+1390.139859574 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/0991ff2f-16e5-4891-a38d-8cb9e4b016ec-operator-scripts") pod "keystone-125a-account-create-update-rtszm" (UID: "0991ff2f-16e5-4891-a38d-8cb9e4b016ec") : configmap "openstack-scripts" not found Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.629691 4962 scope.go:117] "RemoveContainer" containerID="fd4f315997ddf00a356a9ec5e5c2864b8fa25408200a3b8ba03172b2cebc87ed" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.658195 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgjrp\" (UniqueName: \"kubernetes.io/projected/0991ff2f-16e5-4891-a38d-8cb9e4b016ec-kube-api-access-zgjrp\") pod \"keystone-125a-account-create-update-rtszm\" (UID: \"0991ff2f-16e5-4891-a38d-8cb9e4b016ec\") " pod="openstack/keystone-125a-account-create-update-rtszm" Feb 20 10:18:17 crc kubenswrapper[4962]: E0220 10:18:17.658676 4962 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Feb 20 10:18:17 crc kubenswrapper[4962]: E0220 10:18:17.658804 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/812fea74-e4e5-4550-8a20-8fe04752a016-operator-scripts podName:812fea74-e4e5-4550-8a20-8fe04752a016 nodeName:}" failed. No retries permitted until 2026-02-20 10:18:18.658776882 +0000 UTC m=+1390.241248728 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/812fea74-e4e5-4550-8a20-8fe04752a016-operator-scripts") pod "root-account-create-update-6f6vb" (UID: "812fea74-e4e5-4550-8a20-8fe04752a016") : configmap "openstack-scripts" not found Feb 20 10:18:17 crc kubenswrapper[4962]: E0220 10:18:17.669793 4962 projected.go:194] Error preparing data for projected volume kube-api-access-zgjrp for pod openstack/keystone-125a-account-create-update-rtszm: failed to fetch token: serviceaccounts "galera-openstack" not found Feb 20 10:18:17 crc kubenswrapper[4962]: E0220 10:18:17.669940 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0991ff2f-16e5-4891-a38d-8cb9e4b016ec-kube-api-access-zgjrp podName:0991ff2f-16e5-4891-a38d-8cb9e4b016ec nodeName:}" failed. No retries permitted until 2026-02-20 10:18:18.669902716 +0000 UTC m=+1390.252374572 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-zgjrp" (UniqueName: "kubernetes.io/projected/0991ff2f-16e5-4891-a38d-8cb9e4b016ec-kube-api-access-zgjrp") pod "keystone-125a-account-create-update-rtszm" (UID: "0991ff2f-16e5-4891-a38d-8cb9e4b016ec") : failed to fetch token: serviceaccounts "galera-openstack" not found Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.670682 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-84464996cb-fhnvz"] Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.675666 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-84464996cb-fhnvz"] Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.722177 4962 scope.go:117] "RemoveContainer" containerID="2063db6c0681c99c5af22bd280759565fe6f153460080e6e822a7af9e9e7ff12" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.786837 4962 scope.go:117] "RemoveContainer" containerID="f92044b60ad417db828d85a5c41a02658d594ecad6f7c6c0f3f8b1bce358c93f" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.825729 4962 scope.go:117] "RemoveContainer" containerID="c5f1f67dc9e07d9eeb3bb7bd374b8b7f7c3676f58bea6766635a8f614df5e26b" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.858432 4962 scope.go:117] "RemoveContainer" containerID="f92044b60ad417db828d85a5c41a02658d594ecad6f7c6c0f3f8b1bce358c93f" Feb 20 10:18:17 crc kubenswrapper[4962]: E0220 10:18:17.871398 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f92044b60ad417db828d85a5c41a02658d594ecad6f7c6c0f3f8b1bce358c93f\": container with ID starting with f92044b60ad417db828d85a5c41a02658d594ecad6f7c6c0f3f8b1bce358c93f not found: ID does not exist" containerID="f92044b60ad417db828d85a5c41a02658d594ecad6f7c6c0f3f8b1bce358c93f" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.871462 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f92044b60ad417db828d85a5c41a02658d594ecad6f7c6c0f3f8b1bce358c93f"} err="failed to get container status \"f92044b60ad417db828d85a5c41a02658d594ecad6f7c6c0f3f8b1bce358c93f\": rpc error: code = NotFound desc = could not find container \"f92044b60ad417db828d85a5c41a02658d594ecad6f7c6c0f3f8b1bce358c93f\": container with ID starting with f92044b60ad417db828d85a5c41a02658d594ecad6f7c6c0f3f8b1bce358c93f not found: ID does not exist" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.871498 4962 scope.go:117] "RemoveContainer" containerID="c5f1f67dc9e07d9eeb3bb7bd374b8b7f7c3676f58bea6766635a8f614df5e26b" Feb 20 10:18:17 crc kubenswrapper[4962]: E0220 10:18:17.872375 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5f1f67dc9e07d9eeb3bb7bd374b8b7f7c3676f58bea6766635a8f614df5e26b\": container with ID starting with c5f1f67dc9e07d9eeb3bb7bd374b8b7f7c3676f58bea6766635a8f614df5e26b not found: ID does not exist" containerID="c5f1f67dc9e07d9eeb3bb7bd374b8b7f7c3676f58bea6766635a8f614df5e26b" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.872501 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5f1f67dc9e07d9eeb3bb7bd374b8b7f7c3676f58bea6766635a8f614df5e26b"} err="failed to get container status \"c5f1f67dc9e07d9eeb3bb7bd374b8b7f7c3676f58bea6766635a8f614df5e26b\": rpc error: code = NotFound desc = could not find container \"c5f1f67dc9e07d9eeb3bb7bd374b8b7f7c3676f58bea6766635a8f614df5e26b\": container 
with ID starting with c5f1f67dc9e07d9eeb3bb7bd374b8b7f7c3676f58bea6766635a8f614df5e26b not found: ID does not exist" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.872551 4962 scope.go:117] "RemoveContainer" containerID="2fdedd716304d48ca972e72c6c0a4e94560cd57ce8c5b0409e88600b50604c0b" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.895721 4962 scope.go:117] "RemoveContainer" containerID="fdc035dec22a8cb1cbe15ddbb643e583e6ad19e8deec930029ff3031763b1c89" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.927782 4962 scope.go:117] "RemoveContainer" containerID="e01ecca1dc871afc69109d7af822c9e7b8f02440c8c8b1e92b5ae942c411e515" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.953841 4962 scope.go:117] "RemoveContainer" containerID="a294bb381baa23da8817ec86f599da2c728e47f08a22b5ce88cb75ec5dd531c2" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.979493 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.984244 4962 scope.go:117] "RemoveContainer" containerID="e01ecca1dc871afc69109d7af822c9e7b8f02440c8c8b1e92b5ae942c411e515" Feb 20 10:18:17 crc kubenswrapper[4962]: E0220 10:18:17.984729 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e01ecca1dc871afc69109d7af822c9e7b8f02440c8c8b1e92b5ae942c411e515\": container with ID starting with e01ecca1dc871afc69109d7af822c9e7b8f02440c8c8b1e92b5ae942c411e515 not found: ID does not exist" containerID="e01ecca1dc871afc69109d7af822c9e7b8f02440c8c8b1e92b5ae942c411e515" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.984809 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e01ecca1dc871afc69109d7af822c9e7b8f02440c8c8b1e92b5ae942c411e515"} err="failed to get container status \"e01ecca1dc871afc69109d7af822c9e7b8f02440c8c8b1e92b5ae942c411e515\": rpc error: code = NotFound desc = could not find container \"e01ecca1dc871afc69109d7af822c9e7b8f02440c8c8b1e92b5ae942c411e515\": container with ID starting with e01ecca1dc871afc69109d7af822c9e7b8f02440c8c8b1e92b5ae942c411e515 not found: ID does not exist" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.984868 4962 scope.go:117] "RemoveContainer" containerID="a294bb381baa23da8817ec86f599da2c728e47f08a22b5ce88cb75ec5dd531c2" Feb 20 10:18:17 crc kubenswrapper[4962]: E0220 10:18:17.985266 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a294bb381baa23da8817ec86f599da2c728e47f08a22b5ce88cb75ec5dd531c2\": container with ID starting with a294bb381baa23da8817ec86f599da2c728e47f08a22b5ce88cb75ec5dd531c2 not found: ID does not exist" containerID="a294bb381baa23da8817ec86f599da2c728e47f08a22b5ce88cb75ec5dd531c2" Feb 20 10:18:17 crc kubenswrapper[4962]: I0220 10:18:17.985298 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a294bb381baa23da8817ec86f599da2c728e47f08a22b5ce88cb75ec5dd531c2"} err="failed to get container status \"a294bb381baa23da8817ec86f599da2c728e47f08a22b5ce88cb75ec5dd531c2\": rpc error: code = NotFound desc = could not find container \"a294bb381baa23da8817ec86f599da2c728e47f08a22b5ce88cb75ec5dd531c2\": container with ID starting with a294bb381baa23da8817ec86f599da2c728e47f08a22b5ce88cb75ec5dd531c2 not found: ID does not exist" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.066970 4962 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b22a9e86-ccdf-4505-8116-21b0230943fc-combined-ca-bundle\") pod \"b22a9e86-ccdf-4505-8116-21b0230943fc\" (UID: \"b22a9e86-ccdf-4505-8116-21b0230943fc\") " Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.067044 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b22a9e86-ccdf-4505-8116-21b0230943fc-config-data\") pod \"b22a9e86-ccdf-4505-8116-21b0230943fc\" (UID: \"b22a9e86-ccdf-4505-8116-21b0230943fc\") " Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.067126 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/b22a9e86-ccdf-4505-8116-21b0230943fc-memcached-tls-certs\") pod \"b22a9e86-ccdf-4505-8116-21b0230943fc\" (UID: \"b22a9e86-ccdf-4505-8116-21b0230943fc\") " Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.067823 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b22a9e86-ccdf-4505-8116-21b0230943fc-kolla-config\") pod \"b22a9e86-ccdf-4505-8116-21b0230943fc\" (UID: \"b22a9e86-ccdf-4505-8116-21b0230943fc\") " Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.067913 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65mll\" (UniqueName: \"kubernetes.io/projected/b22a9e86-ccdf-4505-8116-21b0230943fc-kube-api-access-65mll\") pod \"b22a9e86-ccdf-4505-8116-21b0230943fc\" (UID: \"b22a9e86-ccdf-4505-8116-21b0230943fc\") " Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.075483 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b22a9e86-ccdf-4505-8116-21b0230943fc-config-data" (OuterVolumeSpecName: "config-data") pod "b22a9e86-ccdf-4505-8116-21b0230943fc" (UID: "b22a9e86-ccdf-4505-8116-21b0230943fc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.076572 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b22a9e86-ccdf-4505-8116-21b0230943fc-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "b22a9e86-ccdf-4505-8116-21b0230943fc" (UID: "b22a9e86-ccdf-4505-8116-21b0230943fc"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.078967 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b22a9e86-ccdf-4505-8116-21b0230943fc-kube-api-access-65mll" (OuterVolumeSpecName: "kube-api-access-65mll") pod "b22a9e86-ccdf-4505-8116-21b0230943fc" (UID: "b22a9e86-ccdf-4505-8116-21b0230943fc"). InnerVolumeSpecName "kube-api-access-65mll". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.087077 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_33d73a04-08b2-4944-861f-749a63c2565d/ovn-northd/0.log" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.087158 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.122441 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b22a9e86-ccdf-4505-8116-21b0230943fc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b22a9e86-ccdf-4505-8116-21b0230943fc" (UID: "b22a9e86-ccdf-4505-8116-21b0230943fc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.136438 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.142645 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b22a9e86-ccdf-4505-8116-21b0230943fc-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "b22a9e86-ccdf-4505-8116-21b0230943fc" (UID: "b22a9e86-ccdf-4505-8116-21b0230943fc"). InnerVolumeSpecName "memcached-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.151140 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"ce62af15-166f-4f74-a244-2de5147a4b2f","Type":"ContainerDied","Data":"4fa1fbefe8085f86ec2949fb3171b5df9f6211664e4db89dbc2b776f71f19d88"} Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.151211 4962 scope.go:117] "RemoveContainer" containerID="2c8825e8a9845de45acba0c5ed58a1b7ada6575701e9497362444d09cc2e5592" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.151158 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.163339 4962 generic.go:334] "Generic (PLEG): container finished" podID="b22a9e86-ccdf-4505-8116-21b0230943fc" containerID="2a752c83576acea3c58ce68803e2686311938e06421f4eea4dda081f9f3b8c54" exitCode=0 Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.163450 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"b22a9e86-ccdf-4505-8116-21b0230943fc","Type":"ContainerDied","Data":"2a752c83576acea3c58ce68803e2686311938e06421f4eea4dda081f9f3b8c54"} Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.163451 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.163481 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"b22a9e86-ccdf-4505-8116-21b0230943fc","Type":"ContainerDied","Data":"527bc0b9350edbbd23edfe05a933e12b44f8d4ad0c70495feffaffb9052c4070"} Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.171797 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/33d73a04-08b2-4944-861f-749a63c2565d-metrics-certs-tls-certs\") pod \"33d73a04-08b2-4944-861f-749a63c2565d\" (UID: \"33d73a04-08b2-4944-861f-749a63c2565d\") " Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.171881 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/33d73a04-08b2-4944-861f-749a63c2565d-ovn-rundir\") pod \"33d73a04-08b2-4944-861f-749a63c2565d\" (UID: \"33d73a04-08b2-4944-861f-749a63c2565d\") " Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.171989 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33d73a04-08b2-4944-861f-749a63c2565d-config\") pod \"33d73a04-08b2-4944-861f-749a63c2565d\" (UID: \"33d73a04-08b2-4944-861f-749a63c2565d\") " Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.172034 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r87b2\" (UniqueName: \"kubernetes.io/projected/33d73a04-08b2-4944-861f-749a63c2565d-kube-api-access-r87b2\") pod \"33d73a04-08b2-4944-861f-749a63c2565d\" (UID: \"33d73a04-08b2-4944-861f-749a63c2565d\") " Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.172053 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33d73a04-08b2-4944-861f-749a63c2565d-combined-ca-bundle\") pod \"33d73a04-08b2-4944-861f-749a63c2565d\" (UID: \"33d73a04-08b2-4944-861f-749a63c2565d\") " Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.172125 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/33d73a04-08b2-4944-861f-749a63c2565d-ovn-northd-tls-certs\") pod \"33d73a04-08b2-4944-861f-749a63c2565d\" (UID: \"33d73a04-08b2-4944-861f-749a63c2565d\") " Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.172167 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/33d73a04-08b2-4944-861f-749a63c2565d-scripts\") pod \"33d73a04-08b2-4944-861f-749a63c2565d\" (UID: \"33d73a04-08b2-4944-861f-749a63c2565d\") " Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.172523 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b22a9e86-ccdf-4505-8116-21b0230943fc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.172542 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b22a9e86-ccdf-4505-8116-21b0230943fc-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.172552 4962 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/b22a9e86-ccdf-4505-8116-21b0230943fc-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.172562 4962 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b22a9e86-ccdf-4505-8116-21b0230943fc-kolla-config\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.172574 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-65mll\" (UniqueName: \"kubernetes.io/projected/b22a9e86-ccdf-4505-8116-21b0230943fc-kube-api-access-65mll\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.173023 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33d73a04-08b2-4944-861f-749a63c2565d-scripts" (OuterVolumeSpecName: "scripts") pod "33d73a04-08b2-4944-861f-749a63c2565d" (UID: "33d73a04-08b2-4944-861f-749a63c2565d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.173163 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33d73a04-08b2-4944-861f-749a63c2565d-config" (OuterVolumeSpecName: "config") pod "33d73a04-08b2-4944-861f-749a63c2565d" (UID: "33d73a04-08b2-4944-861f-749a63c2565d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.173901 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33d73a04-08b2-4944-861f-749a63c2565d-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "33d73a04-08b2-4944-861f-749a63c2565d" (UID: "33d73a04-08b2-4944-861f-749a63c2565d"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.180245 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33d73a04-08b2-4944-861f-749a63c2565d-kube-api-access-r87b2" (OuterVolumeSpecName: "kube-api-access-r87b2") pod "33d73a04-08b2-4944-861f-749a63c2565d" (UID: "33d73a04-08b2-4944-861f-749a63c2565d"). InnerVolumeSpecName "kube-api-access-r87b2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.210186 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"cffca43e-3e19-4430-8fe2-ca7cfe6229b0","Type":"ContainerDied","Data":"2851b19111bcc172daacd941571725296e0313b2b3496256066714262e7d3b9a"} Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.210313 4962 scope.go:117] "RemoveContainer" containerID="2a752c83576acea3c58ce68803e2686311938e06421f4eea4dda081f9f3b8c54" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.210400 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.217006 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33d73a04-08b2-4944-861f-749a63c2565d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "33d73a04-08b2-4944-861f-749a63c2565d" (UID: "33d73a04-08b2-4944-861f-749a63c2565d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.219034 4962 generic.go:334] "Generic (PLEG): container finished" podID="fae69c76-754d-4125-a405-23a3938e90a9" containerID="ee4780834b45dd3df9c5478d7f70a5b55b25c67044bc5c70a1699c36ee7a04a5" exitCode=0 Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.219064 4962 generic.go:334] "Generic (PLEG): container finished" podID="fae69c76-754d-4125-a405-23a3938e90a9" containerID="6f8330d1d14a32a3610f17948811d4a9c71b61fcf7b72a4769e4f03066b35b1e" exitCode=2 Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.219072 4962 generic.go:334] "Generic (PLEG): container finished" podID="fae69c76-754d-4125-a405-23a3938e90a9" containerID="cdb5b15ea05e323a5f856da44e27bde808d02494e9d53ffb4bc777be963ee11a" exitCode=0 Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.219116 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fae69c76-754d-4125-a405-23a3938e90a9","Type":"ContainerDied","Data":"ee4780834b45dd3df9c5478d7f70a5b55b25c67044bc5c70a1699c36ee7a04a5"} Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.219144 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fae69c76-754d-4125-a405-23a3938e90a9","Type":"ContainerDied","Data":"6f8330d1d14a32a3610f17948811d4a9c71b61fcf7b72a4769e4f03066b35b1e"} Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.219154 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fae69c76-754d-4125-a405-23a3938e90a9","Type":"ContainerDied","Data":"cdb5b15ea05e323a5f856da44e27bde808d02494e9d53ffb4bc777be963ee11a"} Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.220857 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_33d73a04-08b2-4944-861f-749a63c2565d/ovn-northd/0.log" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.220889 4962 generic.go:334] "Generic (PLEG): container finished" podID="33d73a04-08b2-4944-861f-749a63c2565d" containerID="095ea16654e1756b3ffb7fcf3eb9dc6ba35b4333c92bf90d3619d8cb9c0062fe" exitCode=139 Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.220928 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"33d73a04-08b2-4944-861f-749a63c2565d","Type":"ContainerDied","Data":"095ea16654e1756b3ffb7fcf3eb9dc6ba35b4333c92bf90d3619d8cb9c0062fe"} Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.220945 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"33d73a04-08b2-4944-861f-749a63c2565d","Type":"ContainerDied","Data":"5c36b8026a940c293c08dfca1df88e2b23028519f85595786058a0396a1ade5b"} Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.221003 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.243094 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-125a-account-create-update-rtszm" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.245284 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.261702 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.267818 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33d73a04-08b2-4944-861f-749a63c2565d-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "33d73a04-08b2-4944-861f-749a63c2565d" (UID: "33d73a04-08b2-4944-861f-749a63c2565d"). InnerVolumeSpecName "ovn-northd-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.269316 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.274140 4962 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/33d73a04-08b2-4944-861f-749a63c2565d-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.274173 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/33d73a04-08b2-4944-861f-749a63c2565d-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.274182 4962 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/33d73a04-08b2-4944-861f-749a63c2565d-ovn-rundir\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.274190 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33d73a04-08b2-4944-861f-749a63c2565d-config\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.274200 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r87b2\" (UniqueName: \"kubernetes.io/projected/33d73a04-08b2-4944-861f-749a63c2565d-kube-api-access-r87b2\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.274211 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33d73a04-08b2-4944-861f-749a63c2565d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.277458 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.282921 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.293779 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.299814 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/memcached-0"] Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.304370 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33d73a04-08b2-4944-861f-749a63c2565d-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "33d73a04-08b2-4944-861f-749a63c2565d" (UID: "33d73a04-08b2-4944-861f-749a63c2565d"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.312824 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-125a-account-create-update-rtszm" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.337095 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.348282 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.354691 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.360312 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.375814 4962 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/33d73a04-08b2-4944-861f-749a63c2565d-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.422352 4962 scope.go:117] "RemoveContainer" containerID="2a752c83576acea3c58ce68803e2686311938e06421f4eea4dda081f9f3b8c54" Feb 20 10:18:18 crc kubenswrapper[4962]: E0220 10:18:18.465836 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a752c83576acea3c58ce68803e2686311938e06421f4eea4dda081f9f3b8c54\": container with ID starting with 2a752c83576acea3c58ce68803e2686311938e06421f4eea4dda081f9f3b8c54 not found: ID does not exist" containerID="2a752c83576acea3c58ce68803e2686311938e06421f4eea4dda081f9f3b8c54" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.465890 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a752c83576acea3c58ce68803e2686311938e06421f4eea4dda081f9f3b8c54"} err="failed to get container status \"2a752c83576acea3c58ce68803e2686311938e06421f4eea4dda081f9f3b8c54\": rpc error: code = NotFound desc = could not find container \"2a752c83576acea3c58ce68803e2686311938e06421f4eea4dda081f9f3b8c54\": container with ID starting with 2a752c83576acea3c58ce68803e2686311938e06421f4eea4dda081f9f3b8c54 not found: ID does not exist" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.465923 4962 scope.go:117] "RemoveContainer" containerID="490c8746de0bc6e3f4ef0520b2658d4424532e972e69bd55a421dfcd9ed32cf4" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.537049 4962 scope.go:117] "RemoveContainer" containerID="0053432ef3fdc770bbcfaedc758ae1d1941eb3f0d4d0ebcb6d983082d7938453" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.561874 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.567300 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-northd-0"] Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.588260 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0991ff2f-16e5-4891-a38d-8cb9e4b016ec-operator-scripts\") pod \"keystone-125a-account-create-update-rtszm\" (UID: \"0991ff2f-16e5-4891-a38d-8cb9e4b016ec\") " pod="openstack/keystone-125a-account-create-update-rtszm" Feb 20 10:18:18 crc kubenswrapper[4962]: E0220 10:18:18.589203 4962 configmap.go:193] Couldn't get 
configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Feb 20 10:18:18 crc kubenswrapper[4962]: E0220 10:18:18.589275 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0991ff2f-16e5-4891-a38d-8cb9e4b016ec-operator-scripts podName:0991ff2f-16e5-4891-a38d-8cb9e4b016ec nodeName:}" failed. No retries permitted until 2026-02-20 10:18:20.589255284 +0000 UTC m=+1392.171727130 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/0991ff2f-16e5-4891-a38d-8cb9e4b016ec-operator-scripts") pod "keystone-125a-account-create-update-rtszm" (UID: "0991ff2f-16e5-4891-a38d-8cb9e4b016ec") : configmap "openstack-scripts" not found Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.649984 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-6f6vb" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.651052 4962 scope.go:117] "RemoveContainer" containerID="095ea16654e1756b3ffb7fcf3eb9dc6ba35b4333c92bf90d3619d8cb9c0062fe" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.692989 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgjrp\" (UniqueName: \"kubernetes.io/projected/0991ff2f-16e5-4891-a38d-8cb9e4b016ec-kube-api-access-zgjrp\") pod \"keystone-125a-account-create-update-rtszm\" (UID: \"0991ff2f-16e5-4891-a38d-8cb9e4b016ec\") " pod="openstack/keystone-125a-account-create-update-rtszm" Feb 20 10:18:18 crc kubenswrapper[4962]: E0220 10:18:18.693455 4962 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Feb 20 10:18:18 crc kubenswrapper[4962]: E0220 10:18:18.693517 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/812fea74-e4e5-4550-8a20-8fe04752a016-operator-scripts podName:812fea74-e4e5-4550-8a20-8fe04752a016 nodeName:}" failed. No retries permitted until 2026-02-20 10:18:20.693493886 +0000 UTC m=+1392.275965732 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/812fea74-e4e5-4550-8a20-8fe04752a016-operator-scripts") pod "root-account-create-update-6f6vb" (UID: "812fea74-e4e5-4550-8a20-8fe04752a016") : configmap "openstack-scripts" not found Feb 20 10:18:18 crc kubenswrapper[4962]: E0220 10:18:18.696580 4962 projected.go:194] Error preparing data for projected volume kube-api-access-zgjrp for pod openstack/keystone-125a-account-create-update-rtszm: failed to fetch token: serviceaccounts "galera-openstack" not found Feb 20 10:18:18 crc kubenswrapper[4962]: E0220 10:18:18.697121 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0991ff2f-16e5-4891-a38d-8cb9e4b016ec-kube-api-access-zgjrp podName:0991ff2f-16e5-4891-a38d-8cb9e4b016ec nodeName:}" failed. No retries permitted until 2026-02-20 10:18:20.697063187 +0000 UTC m=+1392.279535033 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-zgjrp" (UniqueName: "kubernetes.io/projected/0991ff2f-16e5-4891-a38d-8cb9e4b016ec-kube-api-access-zgjrp") pod "keystone-125a-account-create-update-rtszm" (UID: "0991ff2f-16e5-4891-a38d-8cb9e4b016ec") : failed to fetch token: serviceaccounts "galera-openstack" not found Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.713306 4962 scope.go:117] "RemoveContainer" containerID="0053432ef3fdc770bbcfaedc758ae1d1941eb3f0d4d0ebcb6d983082d7938453" Feb 20 10:18:18 crc kubenswrapper[4962]: E0220 10:18:18.716842 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0053432ef3fdc770bbcfaedc758ae1d1941eb3f0d4d0ebcb6d983082d7938453\": container with ID starting with 0053432ef3fdc770bbcfaedc758ae1d1941eb3f0d4d0ebcb6d983082d7938453 not found: ID does not exist" containerID="0053432ef3fdc770bbcfaedc758ae1d1941eb3f0d4d0ebcb6d983082d7938453" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.716899 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0053432ef3fdc770bbcfaedc758ae1d1941eb3f0d4d0ebcb6d983082d7938453"} err="failed to get container status \"0053432ef3fdc770bbcfaedc758ae1d1941eb3f0d4d0ebcb6d983082d7938453\": rpc error: code = NotFound desc = could not find container \"0053432ef3fdc770bbcfaedc758ae1d1941eb3f0d4d0ebcb6d983082d7938453\": container with ID starting with 0053432ef3fdc770bbcfaedc758ae1d1941eb3f0d4d0ebcb6d983082d7938453 not found: ID does not exist" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.716931 4962 scope.go:117] "RemoveContainer" containerID="095ea16654e1756b3ffb7fcf3eb9dc6ba35b4333c92bf90d3619d8cb9c0062fe" Feb 20 10:18:18 crc kubenswrapper[4962]: E0220 10:18:18.717541 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"095ea16654e1756b3ffb7fcf3eb9dc6ba35b4333c92bf90d3619d8cb9c0062fe\": container with ID starting with 095ea16654e1756b3ffb7fcf3eb9dc6ba35b4333c92bf90d3619d8cb9c0062fe not found: ID does not exist" containerID="095ea16654e1756b3ffb7fcf3eb9dc6ba35b4333c92bf90d3619d8cb9c0062fe" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.717703 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"095ea16654e1756b3ffb7fcf3eb9dc6ba35b4333c92bf90d3619d8cb9c0062fe"} err="failed to get container status \"095ea16654e1756b3ffb7fcf3eb9dc6ba35b4333c92bf90d3619d8cb9c0062fe\": rpc error: code = NotFound desc = could not find container \"095ea16654e1756b3ffb7fcf3eb9dc6ba35b4333c92bf90d3619d8cb9c0062fe\": container with ID starting with 095ea16654e1756b3ffb7fcf3eb9dc6ba35b4333c92bf90d3619d8cb9c0062fe not found: ID does not exist" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.794356 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/812fea74-e4e5-4550-8a20-8fe04752a016-operator-scripts\") pod \"812fea74-e4e5-4550-8a20-8fe04752a016\" (UID: \"812fea74-e4e5-4550-8a20-8fe04752a016\") " Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.794688 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5qvff\" (UniqueName: \"kubernetes.io/projected/812fea74-e4e5-4550-8a20-8fe04752a016-kube-api-access-5qvff\") pod \"812fea74-e4e5-4550-8a20-8fe04752a016\" (UID: \"812fea74-e4e5-4550-8a20-8fe04752a016\") " Feb 20 10:18:18 crc 
kubenswrapper[4962]: I0220 10:18:18.795995 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/812fea74-e4e5-4550-8a20-8fe04752a016-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "812fea74-e4e5-4550-8a20-8fe04752a016" (UID: "812fea74-e4e5-4550-8a20-8fe04752a016"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.801430 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/812fea74-e4e5-4550-8a20-8fe04752a016-kube-api-access-5qvff" (OuterVolumeSpecName: "kube-api-access-5qvff") pod "812fea74-e4e5-4550-8a20-8fe04752a016" (UID: "812fea74-e4e5-4550-8a20-8fe04752a016"). InnerVolumeSpecName "kube-api-access-5qvff". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.801538 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.896385 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/56a77dd3-ef10-46a6-a00d-ab38af0d4338-config-data\") pod \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\" (UID: \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\") " Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.896456 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/56a77dd3-ef10-46a6-a00d-ab38af0d4338-plugins-conf\") pod \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\" (UID: \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\") " Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.896491 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/56a77dd3-ef10-46a6-a00d-ab38af0d4338-server-conf\") pod \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\" (UID: \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\") " Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.896525 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/56a77dd3-ef10-46a6-a00d-ab38af0d4338-rabbitmq-plugins\") pod \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\" (UID: \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\") " Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.896561 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\" (UID: \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\") " Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.896609 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/56a77dd3-ef10-46a6-a00d-ab38af0d4338-rabbitmq-tls\") pod \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\" (UID: \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\") " Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.896628 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/56a77dd3-ef10-46a6-a00d-ab38af0d4338-erlang-cookie-secret\") pod \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\" (UID: \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\") " Feb 20 10:18:18 crc 
kubenswrapper[4962]: I0220 10:18:18.896712 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/56a77dd3-ef10-46a6-a00d-ab38af0d4338-pod-info\") pod \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\" (UID: \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\") " Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.896781 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/56a77dd3-ef10-46a6-a00d-ab38af0d4338-rabbitmq-erlang-cookie\") pod \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\" (UID: \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\") " Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.896799 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2hckp\" (UniqueName: \"kubernetes.io/projected/56a77dd3-ef10-46a6-a00d-ab38af0d4338-kube-api-access-2hckp\") pod \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\" (UID: \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\") " Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.896875 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/56a77dd3-ef10-46a6-a00d-ab38af0d4338-rabbitmq-confd\") pod \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\" (UID: \"56a77dd3-ef10-46a6-a00d-ab38af0d4338\") " Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.897109 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56a77dd3-ef10-46a6-a00d-ab38af0d4338-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "56a77dd3-ef10-46a6-a00d-ab38af0d4338" (UID: "56a77dd3-ef10-46a6-a00d-ab38af0d4338"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.897236 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5qvff\" (UniqueName: \"kubernetes.io/projected/812fea74-e4e5-4550-8a20-8fe04752a016-kube-api-access-5qvff\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.897250 4962 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/812fea74-e4e5-4550-8a20-8fe04752a016-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.897346 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56a77dd3-ef10-46a6-a00d-ab38af0d4338-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "56a77dd3-ef10-46a6-a00d-ab38af0d4338" (UID: "56a77dd3-ef10-46a6-a00d-ab38af0d4338"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.897772 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56a77dd3-ef10-46a6-a00d-ab38af0d4338-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "56a77dd3-ef10-46a6-a00d-ab38af0d4338" (UID: "56a77dd3-ef10-46a6-a00d-ab38af0d4338"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.902767 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "persistence") pod "56a77dd3-ef10-46a6-a00d-ab38af0d4338" (UID: "56a77dd3-ef10-46a6-a00d-ab38af0d4338"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.902804 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56a77dd3-ef10-46a6-a00d-ab38af0d4338-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "56a77dd3-ef10-46a6-a00d-ab38af0d4338" (UID: "56a77dd3-ef10-46a6-a00d-ab38af0d4338"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.902839 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56a77dd3-ef10-46a6-a00d-ab38af0d4338-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "56a77dd3-ef10-46a6-a00d-ab38af0d4338" (UID: "56a77dd3-ef10-46a6-a00d-ab38af0d4338"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.902945 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56a77dd3-ef10-46a6-a00d-ab38af0d4338-kube-api-access-2hckp" (OuterVolumeSpecName: "kube-api-access-2hckp") pod "56a77dd3-ef10-46a6-a00d-ab38af0d4338" (UID: "56a77dd3-ef10-46a6-a00d-ab38af0d4338"). InnerVolumeSpecName "kube-api-access-2hckp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.903283 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/56a77dd3-ef10-46a6-a00d-ab38af0d4338-pod-info" (OuterVolumeSpecName: "pod-info") pod "56a77dd3-ef10-46a6-a00d-ab38af0d4338" (UID: "56a77dd3-ef10-46a6-a00d-ab38af0d4338"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.927266 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56a77dd3-ef10-46a6-a00d-ab38af0d4338-config-data" (OuterVolumeSpecName: "config-data") pod "56a77dd3-ef10-46a6-a00d-ab38af0d4338" (UID: "56a77dd3-ef10-46a6-a00d-ab38af0d4338"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.946737 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56a77dd3-ef10-46a6-a00d-ab38af0d4338-server-conf" (OuterVolumeSpecName: "server-conf") pod "56a77dd3-ef10-46a6-a00d-ab38af0d4338" (UID: "56a77dd3-ef10-46a6-a00d-ab38af0d4338"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.989440 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56a77dd3-ef10-46a6-a00d-ab38af0d4338-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "56a77dd3-ef10-46a6-a00d-ab38af0d4338" (UID: "56a77dd3-ef10-46a6-a00d-ab38af0d4338"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.999344 4962 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/56a77dd3-ef10-46a6-a00d-ab38af0d4338-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.999383 4962 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/56a77dd3-ef10-46a6-a00d-ab38af0d4338-server-conf\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.999394 4962 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/56a77dd3-ef10-46a6-a00d-ab38af0d4338-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.999429 4962 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.999442 4962 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/56a77dd3-ef10-46a6-a00d-ab38af0d4338-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.999451 4962 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/56a77dd3-ef10-46a6-a00d-ab38af0d4338-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.999459 4962 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/56a77dd3-ef10-46a6-a00d-ab38af0d4338-pod-info\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.999468 4962 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/56a77dd3-ef10-46a6-a00d-ab38af0d4338-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.999479 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2hckp\" (UniqueName: \"kubernetes.io/projected/56a77dd3-ef10-46a6-a00d-ab38af0d4338-kube-api-access-2hckp\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.999489 4962 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/56a77dd3-ef10-46a6-a00d-ab38af0d4338-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:18 crc kubenswrapper[4962]: I0220 10:18:18.999500 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/56a77dd3-ef10-46a6-a00d-ab38af0d4338-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.021010 4962 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.101641 4962 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.155334 4962 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10c1a487-1a74-4994-9b39-f05cbe0fa5c7" path="/var/lib/kubelet/pods/10c1a487-1a74-4994-9b39-f05cbe0fa5c7/volumes" Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.156041 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="241dc417-3176-4051-ad4e-d98f4f66ddc2" path="/var/lib/kubelet/pods/241dc417-3176-4051-ad4e-d98f4f66ddc2/volumes" Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.163068 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33d73a04-08b2-4944-861f-749a63c2565d" path="/var/lib/kubelet/pods/33d73a04-08b2-4944-861f-749a63c2565d/volumes" Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.164550 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f4a409a-4230-42ca-bfcc-f014064cbc6c" path="/var/lib/kubelet/pods/4f4a409a-4230-42ca-bfcc-f014064cbc6c/volumes" Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.166174 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89dbdc4c-bf31-402e-b5bf-e8bbb8c16172" path="/var/lib/kubelet/pods/89dbdc4c-bf31-402e-b5bf-e8bbb8c16172/volumes" Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.168433 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b22a9e86-ccdf-4505-8116-21b0230943fc" path="/var/lib/kubelet/pods/b22a9e86-ccdf-4505-8116-21b0230943fc/volumes" Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.169553 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba9a9d46-9ba9-428c-8864-a8db8bca2b57" path="/var/lib/kubelet/pods/ba9a9d46-9ba9-428c-8864-a8db8bca2b57/volumes" Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.171915 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca793428-98ed-4f82-aa57-31d6671d546c" path="/var/lib/kubelet/pods/ca793428-98ed-4f82-aa57-31d6671d546c/volumes" Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.172900 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce62af15-166f-4f74-a244-2de5147a4b2f" path="/var/lib/kubelet/pods/ce62af15-166f-4f74-a244-2de5147a4b2f/volumes" Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.173700 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cffca43e-3e19-4430-8fe2-ca7cfe6229b0" path="/var/lib/kubelet/pods/cffca43e-3e19-4430-8fe2-ca7cfe6229b0/volumes" Feb 20 10:18:19 crc kubenswrapper[4962]: E0220 10:18:19.258749 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2 is running failed: container process not found" containerID="0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 20 10:18:19 crc kubenswrapper[4962]: E0220 10:18:19.259014 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2 is running failed: container process not found" containerID="0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 20 10:18:19 crc kubenswrapper[4962]: E0220 10:18:19.259372 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container 
is not created or running: checking if PID of 0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2 is running failed: container process not found" containerID="0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 20 10:18:19 crc kubenswrapper[4962]: E0220 10:18:19.259491 4962 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-r7g9h" podUID="8e8425d5-32be-4726-915a-3de5c70f0f62" containerName="ovsdb-server" Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.259722 4962 generic.go:334] "Generic (PLEG): container finished" podID="56a77dd3-ef10-46a6-a00d-ab38af0d4338" containerID="89f21e0f9ed8c4de881b1add4cca2f3108cbffd0cc9fe288bcc483e30d1f1718" exitCode=0 Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.259790 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.259808 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"56a77dd3-ef10-46a6-a00d-ab38af0d4338","Type":"ContainerDied","Data":"89f21e0f9ed8c4de881b1add4cca2f3108cbffd0cc9fe288bcc483e30d1f1718"} Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.259847 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"56a77dd3-ef10-46a6-a00d-ab38af0d4338","Type":"ContainerDied","Data":"1af76abb62f306cbfdd579814518b0ea666f529247ac9e64fd984f69498132b5"} Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.259870 4962 scope.go:117] "RemoveContainer" containerID="89f21e0f9ed8c4de881b1add4cca2f3108cbffd0cc9fe288bcc483e30d1f1718" Feb 20 10:18:19 crc kubenswrapper[4962]: E0220 10:18:19.261954 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fbca6026ebd221992e1ebc24844b7bb1692f49e72896c063a823730a2cadaf38" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 20 10:18:19 crc kubenswrapper[4962]: E0220 10:18:19.265944 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fbca6026ebd221992e1ebc24844b7bb1692f49e72896c063a823730a2cadaf38" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.269170 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-6f6vb" event={"ID":"812fea74-e4e5-4550-8a20-8fe04752a016","Type":"ContainerDied","Data":"b35105a6f1f09300973fb51f5cc2ceed7e4acc42cd81be4a5215ef08b873fcd8"} Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.269193 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-6f6vb" Feb 20 10:18:19 crc kubenswrapper[4962]: E0220 10:18:19.269315 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fbca6026ebd221992e1ebc24844b7bb1692f49e72896c063a823730a2cadaf38" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 20 10:18:19 crc kubenswrapper[4962]: E0220 10:18:19.269370 4962 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-r7g9h" podUID="8e8425d5-32be-4726-915a-3de5c70f0f62" containerName="ovs-vswitchd" Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.285897 4962 generic.go:334] "Generic (PLEG): container finished" podID="6e766bfd-869d-43ca-bf11-cf4ec9fa253a" containerID="0ee4c6895eaf367e01ee1ab962d5fa0868b6b165760c399d39cc5c1615f1960b" exitCode=0 Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.286101 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"6e766bfd-869d-43ca-bf11-cf4ec9fa253a","Type":"ContainerDied","Data":"0ee4c6895eaf367e01ee1ab962d5fa0868b6b165760c399d39cc5c1615f1960b"} Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.300344 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-125a-account-create-update-rtszm" Feb 20 10:18:19 crc kubenswrapper[4962]: E0220 10:18:19.306458 4962 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Feb 20 10:18:19 crc kubenswrapper[4962]: E0220 10:18:19.306522 4962 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2a8d652d-aea8-4a83-b33e-0d2522af0be8-config-data podName:2a8d652d-aea8-4a83-b33e-0d2522af0be8 nodeName:}" failed. No retries permitted until 2026-02-20 10:18:27.306505375 +0000 UTC m=+1398.888977221 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/2a8d652d-aea8-4a83-b33e-0d2522af0be8-config-data") pod "rabbitmq-server-0" (UID: "2a8d652d-aea8-4a83-b33e-0d2522af0be8") : configmap "rabbitmq-config-data" not found Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.340207 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.348601 4962 scope.go:117] "RemoveContainer" containerID="565584a6c8c851ef4d74b724c7d45c8dd9c73a6da0c33f9bfe51852abd444857" Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.350087 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.356197 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-6f6vb"] Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.361482 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-6f6vb"] Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.374251 4962 scope.go:117] "RemoveContainer" containerID="89f21e0f9ed8c4de881b1add4cca2f3108cbffd0cc9fe288bcc483e30d1f1718" Feb 20 10:18:19 crc kubenswrapper[4962]: E0220 10:18:19.374678 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89f21e0f9ed8c4de881b1add4cca2f3108cbffd0cc9fe288bcc483e30d1f1718\": container with ID starting with 89f21e0f9ed8c4de881b1add4cca2f3108cbffd0cc9fe288bcc483e30d1f1718 not found: ID does not exist" containerID="89f21e0f9ed8c4de881b1add4cca2f3108cbffd0cc9fe288bcc483e30d1f1718" Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.374709 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89f21e0f9ed8c4de881b1add4cca2f3108cbffd0cc9fe288bcc483e30d1f1718"} err="failed to get container status \"89f21e0f9ed8c4de881b1add4cca2f3108cbffd0cc9fe288bcc483e30d1f1718\": rpc error: code = NotFound desc = could not find container \"89f21e0f9ed8c4de881b1add4cca2f3108cbffd0cc9fe288bcc483e30d1f1718\": container with ID starting with 89f21e0f9ed8c4de881b1add4cca2f3108cbffd0cc9fe288bcc483e30d1f1718 not found: ID does not exist" Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.374733 4962 scope.go:117] "RemoveContainer" containerID="565584a6c8c851ef4d74b724c7d45c8dd9c73a6da0c33f9bfe51852abd444857" Feb 20 10:18:19 crc kubenswrapper[4962]: E0220 10:18:19.375097 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"565584a6c8c851ef4d74b724c7d45c8dd9c73a6da0c33f9bfe51852abd444857\": container with ID starting with 565584a6c8c851ef4d74b724c7d45c8dd9c73a6da0c33f9bfe51852abd444857 not found: ID does not exist" containerID="565584a6c8c851ef4d74b724c7d45c8dd9c73a6da0c33f9bfe51852abd444857" Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.375119 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"565584a6c8c851ef4d74b724c7d45c8dd9c73a6da0c33f9bfe51852abd444857"} err="failed to get container status \"565584a6c8c851ef4d74b724c7d45c8dd9c73a6da0c33f9bfe51852abd444857\": rpc error: code = NotFound desc = could not find container \"565584a6c8c851ef4d74b724c7d45c8dd9c73a6da0c33f9bfe51852abd444857\": container with ID starting with 565584a6c8c851ef4d74b724c7d45c8dd9c73a6da0c33f9bfe51852abd444857 not found: ID 
does not exist" Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.375137 4962 scope.go:117] "RemoveContainer" containerID="156621efed4a83b0a1598b9e193e1ba9bb7c448ebc2a41320d1b53c4756507b6" Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.385101 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-125a-account-create-update-rtszm"] Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.396938 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-125a-account-create-update-rtszm"] Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.510492 4962 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0991ff2f-16e5-4891-a38d-8cb9e4b016ec-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.510978 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgjrp\" (UniqueName: \"kubernetes.io/projected/0991ff2f-16e5-4891-a38d-8cb9e4b016ec-kube-api-access-zgjrp\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.626144 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.715133 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6e766bfd-869d-43ca-bf11-cf4ec9fa253a-config-data-default\") pod \"6e766bfd-869d-43ca-bf11-cf4ec9fa253a\" (UID: \"6e766bfd-869d-43ca-bf11-cf4ec9fa253a\") " Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.715211 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6e766bfd-869d-43ca-bf11-cf4ec9fa253a-operator-scripts\") pod \"6e766bfd-869d-43ca-bf11-cf4ec9fa253a\" (UID: \"6e766bfd-869d-43ca-bf11-cf4ec9fa253a\") " Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.715300 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6e766bfd-869d-43ca-bf11-cf4ec9fa253a-config-data-generated\") pod \"6e766bfd-869d-43ca-bf11-cf4ec9fa253a\" (UID: \"6e766bfd-869d-43ca-bf11-cf4ec9fa253a\") " Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.715333 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e766bfd-869d-43ca-bf11-cf4ec9fa253a-combined-ca-bundle\") pod \"6e766bfd-869d-43ca-bf11-cf4ec9fa253a\" (UID: \"6e766bfd-869d-43ca-bf11-cf4ec9fa253a\") " Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.715365 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"6e766bfd-869d-43ca-bf11-cf4ec9fa253a\" (UID: \"6e766bfd-869d-43ca-bf11-cf4ec9fa253a\") " Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.715415 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4zkm4\" (UniqueName: \"kubernetes.io/projected/6e766bfd-869d-43ca-bf11-cf4ec9fa253a-kube-api-access-4zkm4\") pod \"6e766bfd-869d-43ca-bf11-cf4ec9fa253a\" (UID: \"6e766bfd-869d-43ca-bf11-cf4ec9fa253a\") " Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.715438 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e766bfd-869d-43ca-bf11-cf4ec9fa253a-galera-tls-certs\") pod \"6e766bfd-869d-43ca-bf11-cf4ec9fa253a\" (UID: \"6e766bfd-869d-43ca-bf11-cf4ec9fa253a\") " Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.715469 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6e766bfd-869d-43ca-bf11-cf4ec9fa253a-kolla-config\") pod \"6e766bfd-869d-43ca-bf11-cf4ec9fa253a\" (UID: \"6e766bfd-869d-43ca-bf11-cf4ec9fa253a\") " Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.716492 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e766bfd-869d-43ca-bf11-cf4ec9fa253a-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "6e766bfd-869d-43ca-bf11-cf4ec9fa253a" (UID: "6e766bfd-869d-43ca-bf11-cf4ec9fa253a"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.717113 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e766bfd-869d-43ca-bf11-cf4ec9fa253a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6e766bfd-869d-43ca-bf11-cf4ec9fa253a" (UID: "6e766bfd-869d-43ca-bf11-cf4ec9fa253a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.717471 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e766bfd-869d-43ca-bf11-cf4ec9fa253a-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "6e766bfd-869d-43ca-bf11-cf4ec9fa253a" (UID: "6e766bfd-869d-43ca-bf11-cf4ec9fa253a"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.717774 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e766bfd-869d-43ca-bf11-cf4ec9fa253a-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "6e766bfd-869d-43ca-bf11-cf4ec9fa253a" (UID: "6e766bfd-869d-43ca-bf11-cf4ec9fa253a"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.723689 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e766bfd-869d-43ca-bf11-cf4ec9fa253a-kube-api-access-4zkm4" (OuterVolumeSpecName: "kube-api-access-4zkm4") pod "6e766bfd-869d-43ca-bf11-cf4ec9fa253a" (UID: "6e766bfd-869d-43ca-bf11-cf4ec9fa253a"). InnerVolumeSpecName "kube-api-access-4zkm4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.733379 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "mysql-db") pod "6e766bfd-869d-43ca-bf11-cf4ec9fa253a" (UID: "6e766bfd-869d-43ca-bf11-cf4ec9fa253a"). InnerVolumeSpecName "local-storage09-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.759160 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e766bfd-869d-43ca-bf11-cf4ec9fa253a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6e766bfd-869d-43ca-bf11-cf4ec9fa253a" (UID: "6e766bfd-869d-43ca-bf11-cf4ec9fa253a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.788302 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e766bfd-869d-43ca-bf11-cf4ec9fa253a-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "6e766bfd-869d-43ca-bf11-cf4ec9fa253a" (UID: "6e766bfd-869d-43ca-bf11-cf4ec9fa253a"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.817487 4962 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.817522 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4zkm4\" (UniqueName: \"kubernetes.io/projected/6e766bfd-869d-43ca-bf11-cf4ec9fa253a-kube-api-access-4zkm4\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.817533 4962 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e766bfd-869d-43ca-bf11-cf4ec9fa253a-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.817543 4962 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6e766bfd-869d-43ca-bf11-cf4ec9fa253a-kolla-config\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.817572 4962 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6e766bfd-869d-43ca-bf11-cf4ec9fa253a-config-data-default\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.817582 4962 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6e766bfd-869d-43ca-bf11-cf4ec9fa253a-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.817618 4962 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6e766bfd-869d-43ca-bf11-cf4ec9fa253a-config-data-generated\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.817629 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e766bfd-869d-43ca-bf11-cf4ec9fa253a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.833438 4962 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.919379 4962 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:19 crc kubenswrapper[4962]: I0220 10:18:19.973901 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.057547 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.124403 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2a8d652d-aea8-4a83-b33e-0d2522af0be8-pod-info\") pod \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\" (UID: \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\") " Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.124892 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2a8d652d-aea8-4a83-b33e-0d2522af0be8-plugins-conf\") pod \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\" (UID: \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\") " Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.124929 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2a8d652d-aea8-4a83-b33e-0d2522af0be8-erlang-cookie-secret\") pod \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\" (UID: \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\") " Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.124972 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fae69c76-754d-4125-a405-23a3938e90a9-combined-ca-bundle\") pod \"fae69c76-754d-4125-a405-23a3938e90a9\" (UID: \"fae69c76-754d-4125-a405-23a3938e90a9\") " Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.125009 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fae69c76-754d-4125-a405-23a3938e90a9-run-httpd\") pod \"fae69c76-754d-4125-a405-23a3938e90a9\" (UID: \"fae69c76-754d-4125-a405-23a3938e90a9\") " Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.125043 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2a8d652d-aea8-4a83-b33e-0d2522af0be8-rabbitmq-confd\") pod \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\" (UID: \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\") " Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.125068 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fae69c76-754d-4125-a405-23a3938e90a9-sg-core-conf-yaml\") pod \"fae69c76-754d-4125-a405-23a3938e90a9\" (UID: \"fae69c76-754d-4125-a405-23a3938e90a9\") " Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.125094 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/fae69c76-754d-4125-a405-23a3938e90a9-ceilometer-tls-certs\") pod \"fae69c76-754d-4125-a405-23a3938e90a9\" (UID: \"fae69c76-754d-4125-a405-23a3938e90a9\") " Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.125158 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2a8d652d-aea8-4a83-b33e-0d2522af0be8-rabbitmq-erlang-cookie\") pod 
\"2a8d652d-aea8-4a83-b33e-0d2522af0be8\" (UID: \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\") " Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.125188 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\" (UID: \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\") " Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.125213 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fae69c76-754d-4125-a405-23a3938e90a9-log-httpd\") pod \"fae69c76-754d-4125-a405-23a3938e90a9\" (UID: \"fae69c76-754d-4125-a405-23a3938e90a9\") " Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.125306 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fae69c76-754d-4125-a405-23a3938e90a9-config-data\") pod \"fae69c76-754d-4125-a405-23a3938e90a9\" (UID: \"fae69c76-754d-4125-a405-23a3938e90a9\") " Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.125354 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2a8d652d-aea8-4a83-b33e-0d2522af0be8-rabbitmq-tls\") pod \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\" (UID: \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\") " Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.125418 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rcvhk\" (UniqueName: \"kubernetes.io/projected/2a8d652d-aea8-4a83-b33e-0d2522af0be8-kube-api-access-rcvhk\") pod \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\" (UID: \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\") " Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.125450 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2a8d652d-aea8-4a83-b33e-0d2522af0be8-server-conf\") pod \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\" (UID: \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\") " Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.125483 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2a8d652d-aea8-4a83-b33e-0d2522af0be8-config-data\") pod \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\" (UID: \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\") " Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.125466 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fae69c76-754d-4125-a405-23a3938e90a9-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "fae69c76-754d-4125-a405-23a3938e90a9" (UID: "fae69c76-754d-4125-a405-23a3938e90a9"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.125513 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fae69c76-754d-4125-a405-23a3938e90a9-scripts\") pod \"fae69c76-754d-4125-a405-23a3938e90a9\" (UID: \"fae69c76-754d-4125-a405-23a3938e90a9\") " Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.125533 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mvmhh\" (UniqueName: \"kubernetes.io/projected/fae69c76-754d-4125-a405-23a3938e90a9-kube-api-access-mvmhh\") pod \"fae69c76-754d-4125-a405-23a3938e90a9\" (UID: \"fae69c76-754d-4125-a405-23a3938e90a9\") " Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.125557 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2a8d652d-aea8-4a83-b33e-0d2522af0be8-rabbitmq-plugins\") pod \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\" (UID: \"2a8d652d-aea8-4a83-b33e-0d2522af0be8\") " Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.125742 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a8d652d-aea8-4a83-b33e-0d2522af0be8-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "2a8d652d-aea8-4a83-b33e-0d2522af0be8" (UID: "2a8d652d-aea8-4a83-b33e-0d2522af0be8"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.125956 4962 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2a8d652d-aea8-4a83-b33e-0d2522af0be8-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.125968 4962 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fae69c76-754d-4125-a405-23a3938e90a9-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.126258 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fae69c76-754d-4125-a405-23a3938e90a9-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "fae69c76-754d-4125-a405-23a3938e90a9" (UID: "fae69c76-754d-4125-a405-23a3938e90a9"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.126649 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a8d652d-aea8-4a83-b33e-0d2522af0be8-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "2a8d652d-aea8-4a83-b33e-0d2522af0be8" (UID: "2a8d652d-aea8-4a83-b33e-0d2522af0be8"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.129010 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a8d652d-aea8-4a83-b33e-0d2522af0be8-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "2a8d652d-aea8-4a83-b33e-0d2522af0be8" (UID: "2a8d652d-aea8-4a83-b33e-0d2522af0be8"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.130133 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/2a8d652d-aea8-4a83-b33e-0d2522af0be8-pod-info" (OuterVolumeSpecName: "pod-info") pod "2a8d652d-aea8-4a83-b33e-0d2522af0be8" (UID: "2a8d652d-aea8-4a83-b33e-0d2522af0be8"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.131436 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "persistence") pod "2a8d652d-aea8-4a83-b33e-0d2522af0be8" (UID: "2a8d652d-aea8-4a83-b33e-0d2522af0be8"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.131723 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fae69c76-754d-4125-a405-23a3938e90a9-scripts" (OuterVolumeSpecName: "scripts") pod "fae69c76-754d-4125-a405-23a3938e90a9" (UID: "fae69c76-754d-4125-a405-23a3938e90a9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.133307 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a8d652d-aea8-4a83-b33e-0d2522af0be8-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "2a8d652d-aea8-4a83-b33e-0d2522af0be8" (UID: "2a8d652d-aea8-4a83-b33e-0d2522af0be8"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.134279 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a8d652d-aea8-4a83-b33e-0d2522af0be8-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "2a8d652d-aea8-4a83-b33e-0d2522af0be8" (UID: "2a8d652d-aea8-4a83-b33e-0d2522af0be8"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.146474 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a8d652d-aea8-4a83-b33e-0d2522af0be8-kube-api-access-rcvhk" (OuterVolumeSpecName: "kube-api-access-rcvhk") pod "2a8d652d-aea8-4a83-b33e-0d2522af0be8" (UID: "2a8d652d-aea8-4a83-b33e-0d2522af0be8"). InnerVolumeSpecName "kube-api-access-rcvhk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.146615 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fae69c76-754d-4125-a405-23a3938e90a9-kube-api-access-mvmhh" (OuterVolumeSpecName: "kube-api-access-mvmhh") pod "fae69c76-754d-4125-a405-23a3938e90a9" (UID: "fae69c76-754d-4125-a405-23a3938e90a9"). InnerVolumeSpecName "kube-api-access-mvmhh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.205209 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fae69c76-754d-4125-a405-23a3938e90a9-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "fae69c76-754d-4125-a405-23a3938e90a9" (UID: "fae69c76-754d-4125-a405-23a3938e90a9"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.211496 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a8d652d-aea8-4a83-b33e-0d2522af0be8-server-conf" (OuterVolumeSpecName: "server-conf") pod "2a8d652d-aea8-4a83-b33e-0d2522af0be8" (UID: "2a8d652d-aea8-4a83-b33e-0d2522af0be8"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.212023 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a8d652d-aea8-4a83-b33e-0d2522af0be8-config-data" (OuterVolumeSpecName: "config-data") pod "2a8d652d-aea8-4a83-b33e-0d2522af0be8" (UID: "2a8d652d-aea8-4a83-b33e-0d2522af0be8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.227827 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2a8d652d-aea8-4a83-b33e-0d2522af0be8-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.227861 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fae69c76-754d-4125-a405-23a3938e90a9-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.227873 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mvmhh\" (UniqueName: \"kubernetes.io/projected/fae69c76-754d-4125-a405-23a3938e90a9-kube-api-access-mvmhh\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.227886 4962 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2a8d652d-aea8-4a83-b33e-0d2522af0be8-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.227894 4962 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2a8d652d-aea8-4a83-b33e-0d2522af0be8-pod-info\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.227903 4962 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2a8d652d-aea8-4a83-b33e-0d2522af0be8-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.227912 4962 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fae69c76-754d-4125-a405-23a3938e90a9-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.227920 4962 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2a8d652d-aea8-4a83-b33e-0d2522af0be8-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.227939 4962 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.227948 4962 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fae69c76-754d-4125-a405-23a3938e90a9-log-httpd\") 
on node \"crc\" DevicePath \"\"" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.227956 4962 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2a8d652d-aea8-4a83-b33e-0d2522af0be8-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.227964 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rcvhk\" (UniqueName: \"kubernetes.io/projected/2a8d652d-aea8-4a83-b33e-0d2522af0be8-kube-api-access-rcvhk\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.227972 4962 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2a8d652d-aea8-4a83-b33e-0d2522af0be8-server-conf\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.265404 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fae69c76-754d-4125-a405-23a3938e90a9-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "fae69c76-754d-4125-a405-23a3938e90a9" (UID: "fae69c76-754d-4125-a405-23a3938e90a9"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.314265 4962 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.335798 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fae69c76-754d-4125-a405-23a3938e90a9-config-data" (OuterVolumeSpecName: "config-data") pod "fae69c76-754d-4125-a405-23a3938e90a9" (UID: "fae69c76-754d-4125-a405-23a3938e90a9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.340054 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fae69c76-754d-4125-a405-23a3938e90a9-config-data\") pod \"fae69c76-754d-4125-a405-23a3938e90a9\" (UID: \"fae69c76-754d-4125-a405-23a3938e90a9\") " Feb 20 10:18:20 crc kubenswrapper[4962]: W0220 10:18:20.340284 4962 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/fae69c76-754d-4125-a405-23a3938e90a9/volumes/kubernetes.io~secret/config-data Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.340348 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fae69c76-754d-4125-a405-23a3938e90a9-config-data" (OuterVolumeSpecName: "config-data") pod "fae69c76-754d-4125-a405-23a3938e90a9" (UID: "fae69c76-754d-4125-a405-23a3938e90a9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.341625 4962 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/fae69c76-754d-4125-a405-23a3938e90a9-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.341647 4962 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.341663 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fae69c76-754d-4125-a405-23a3938e90a9-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.347649 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"6e766bfd-869d-43ca-bf11-cf4ec9fa253a","Type":"ContainerDied","Data":"9629cf6fabd95f146380c31c7bc910c7de73918acc62bb7e7fbe72c4774cfa18"} Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.347724 4962 scope.go:117] "RemoveContainer" containerID="0ee4c6895eaf367e01ee1ab962d5fa0868b6b165760c399d39cc5c1615f1960b" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.347930 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.369235 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fae69c76-754d-4125-a405-23a3938e90a9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fae69c76-754d-4125-a405-23a3938e90a9" (UID: "fae69c76-754d-4125-a405-23a3938e90a9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.403310 4962 generic.go:334] "Generic (PLEG): container finished" podID="815f0ef8-a30a-4467-bb56-ff8499a4be44" containerID="5986cb792b03a6e15f31fe7f4e91ccaa3ff2a4c360820798809c00e91587dc69" exitCode=0 Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.403727 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"815f0ef8-a30a-4467-bb56-ff8499a4be44","Type":"ContainerDied","Data":"5986cb792b03a6e15f31fe7f4e91ccaa3ff2a4c360820798809c00e91587dc69"} Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.410808 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a8d652d-aea8-4a83-b33e-0d2522af0be8-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "2a8d652d-aea8-4a83-b33e-0d2522af0be8" (UID: "2a8d652d-aea8-4a83-b33e-0d2522af0be8"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.423057 4962 generic.go:334] "Generic (PLEG): container finished" podID="fae69c76-754d-4125-a405-23a3938e90a9" containerID="ecce7d5cc120360c76c90c0a94a6162a452698a47d63fb49fa2ed866e4ad8917" exitCode=0 Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.423122 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fae69c76-754d-4125-a405-23a3938e90a9","Type":"ContainerDied","Data":"ecce7d5cc120360c76c90c0a94a6162a452698a47d63fb49fa2ed866e4ad8917"} Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.423155 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fae69c76-754d-4125-a405-23a3938e90a9","Type":"ContainerDied","Data":"498a8615ffc0d02ef23136be3c7f8346a8aa655c4297248e31b8a2413028fcd9"} Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.423468 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.443918 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fae69c76-754d-4125-a405-23a3938e90a9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.443949 4962 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2a8d652d-aea8-4a83-b33e-0d2522af0be8-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.447273 4962 generic.go:334] "Generic (PLEG): container finished" podID="dcd02115-2eb9-4090-8225-108c3a8cad20" containerID="24e611c94f3db833be2f4d2218a68d358affbfa3d1fc3a15c508caceb7974666" exitCode=0 Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.447417 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"dcd02115-2eb9-4090-8225-108c3a8cad20","Type":"ContainerDied","Data":"24e611c94f3db833be2f4d2218a68d358affbfa3d1fc3a15c508caceb7974666"} Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.450248 4962 generic.go:334] "Generic (PLEG): container finished" podID="d203fc44-5252-4dd2-98ae-66f9c139b5f5" containerID="57e3b54a0aaa3e8886ac13c31c98adf640a3207944f14271a7e3dbd0e513db14" exitCode=0 Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.450314 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6b4c54c5d9-pqd8r" event={"ID":"d203fc44-5252-4dd2-98ae-66f9c139b5f5","Type":"ContainerDied","Data":"57e3b54a0aaa3e8886ac13c31c98adf640a3207944f14271a7e3dbd0e513db14"} Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.452580 4962 generic.go:334] "Generic (PLEG): container finished" podID="2a8d652d-aea8-4a83-b33e-0d2522af0be8" containerID="f6ebb23a577e121e067e03133802f0cd7183161a54f98c2902a217045cadf308" exitCode=0 Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.452631 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"2a8d652d-aea8-4a83-b33e-0d2522af0be8","Type":"ContainerDied","Data":"f6ebb23a577e121e067e03133802f0cd7183161a54f98c2902a217045cadf308"} Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.452648 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"2a8d652d-aea8-4a83-b33e-0d2522af0be8","Type":"ContainerDied","Data":"b402e19dca07d8ba27eec1161345a129c0a3f56fa63c23ac6f8b1e82180c9e7c"} Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.452732 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.506434 4962 scope.go:117] "RemoveContainer" containerID="5738934c1190f3f4ebf6be3609b1f56189c1c53ad8ccc9348121e92913c3ec72" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.547543 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.571876 4962 scope.go:117] "RemoveContainer" containerID="ee4780834b45dd3df9c5478d7f70a5b55b25c67044bc5c70a1699c36ee7a04a5" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.590289 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.605185 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.610474 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.615030 4962 scope.go:117] "RemoveContainer" containerID="6f8330d1d14a32a3610f17948811d4a9c71b61fcf7b72a4769e4f03066b35b1e" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.634721 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.643277 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-galera-0"] Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.647206 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6b4c54c5d9-pqd8r" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.647650 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcd02115-2eb9-4090-8225-108c3a8cad20-combined-ca-bundle\") pod \"dcd02115-2eb9-4090-8225-108c3a8cad20\" (UID: \"dcd02115-2eb9-4090-8225-108c3a8cad20\") " Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.647703 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcd02115-2eb9-4090-8225-108c3a8cad20-config-data\") pod \"dcd02115-2eb9-4090-8225-108c3a8cad20\" (UID: \"dcd02115-2eb9-4090-8225-108c3a8cad20\") " Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.647776 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qvxk6\" (UniqueName: \"kubernetes.io/projected/dcd02115-2eb9-4090-8225-108c3a8cad20-kube-api-access-qvxk6\") pod \"dcd02115-2eb9-4090-8225-108c3a8cad20\" (UID: \"dcd02115-2eb9-4090-8225-108c3a8cad20\") " Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.655876 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcd02115-2eb9-4090-8225-108c3a8cad20-kube-api-access-qvxk6" (OuterVolumeSpecName: "kube-api-access-qvxk6") pod "dcd02115-2eb9-4090-8225-108c3a8cad20" (UID: "dcd02115-2eb9-4090-8225-108c3a8cad20"). InnerVolumeSpecName "kube-api-access-qvxk6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.663191 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.665907 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.667910 4962 scope.go:117] "RemoveContainer" containerID="ecce7d5cc120360c76c90c0a94a6162a452698a47d63fb49fa2ed866e4ad8917" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.685821 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcd02115-2eb9-4090-8225-108c3a8cad20-config-data" (OuterVolumeSpecName: "config-data") pod "dcd02115-2eb9-4090-8225-108c3a8cad20" (UID: "dcd02115-2eb9-4090-8225-108c3a8cad20"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.686349 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcd02115-2eb9-4090-8225-108c3a8cad20-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dcd02115-2eb9-4090-8225-108c3a8cad20" (UID: "dcd02115-2eb9-4090-8225-108c3a8cad20"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.712041 4962 scope.go:117] "RemoveContainer" containerID="cdb5b15ea05e323a5f856da44e27bde808d02494e9d53ffb4bc777be963ee11a" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.737365 4962 scope.go:117] "RemoveContainer" containerID="ee4780834b45dd3df9c5478d7f70a5b55b25c67044bc5c70a1699c36ee7a04a5" Feb 20 10:18:20 crc kubenswrapper[4962]: E0220 10:18:20.737945 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee4780834b45dd3df9c5478d7f70a5b55b25c67044bc5c70a1699c36ee7a04a5\": container with ID starting with ee4780834b45dd3df9c5478d7f70a5b55b25c67044bc5c70a1699c36ee7a04a5 not found: ID does not exist" containerID="ee4780834b45dd3df9c5478d7f70a5b55b25c67044bc5c70a1699c36ee7a04a5" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.738019 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee4780834b45dd3df9c5478d7f70a5b55b25c67044bc5c70a1699c36ee7a04a5"} err="failed to get container status \"ee4780834b45dd3df9c5478d7f70a5b55b25c67044bc5c70a1699c36ee7a04a5\": rpc error: code = NotFound desc = could not find container \"ee4780834b45dd3df9c5478d7f70a5b55b25c67044bc5c70a1699c36ee7a04a5\": container with ID starting with ee4780834b45dd3df9c5478d7f70a5b55b25c67044bc5c70a1699c36ee7a04a5 not found: ID does not exist" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.738066 4962 scope.go:117] "RemoveContainer" containerID="6f8330d1d14a32a3610f17948811d4a9c71b61fcf7b72a4769e4f03066b35b1e" Feb 20 10:18:20 crc kubenswrapper[4962]: E0220 10:18:20.738477 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f8330d1d14a32a3610f17948811d4a9c71b61fcf7b72a4769e4f03066b35b1e\": container with ID starting with 6f8330d1d14a32a3610f17948811d4a9c71b61fcf7b72a4769e4f03066b35b1e not found: ID does not exist" containerID="6f8330d1d14a32a3610f17948811d4a9c71b61fcf7b72a4769e4f03066b35b1e" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.738523 4962 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f8330d1d14a32a3610f17948811d4a9c71b61fcf7b72a4769e4f03066b35b1e"} err="failed to get container status \"6f8330d1d14a32a3610f17948811d4a9c71b61fcf7b72a4769e4f03066b35b1e\": rpc error: code = NotFound desc = could not find container \"6f8330d1d14a32a3610f17948811d4a9c71b61fcf7b72a4769e4f03066b35b1e\": container with ID starting with 6f8330d1d14a32a3610f17948811d4a9c71b61fcf7b72a4769e4f03066b35b1e not found: ID does not exist" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.738558 4962 scope.go:117] "RemoveContainer" containerID="ecce7d5cc120360c76c90c0a94a6162a452698a47d63fb49fa2ed866e4ad8917" Feb 20 10:18:20 crc kubenswrapper[4962]: E0220 10:18:20.738961 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ecce7d5cc120360c76c90c0a94a6162a452698a47d63fb49fa2ed866e4ad8917\": container with ID starting with ecce7d5cc120360c76c90c0a94a6162a452698a47d63fb49fa2ed866e4ad8917 not found: ID does not exist" containerID="ecce7d5cc120360c76c90c0a94a6162a452698a47d63fb49fa2ed866e4ad8917" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.738990 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ecce7d5cc120360c76c90c0a94a6162a452698a47d63fb49fa2ed866e4ad8917"} err="failed to get container status \"ecce7d5cc120360c76c90c0a94a6162a452698a47d63fb49fa2ed866e4ad8917\": rpc error: code = NotFound desc = could not find container \"ecce7d5cc120360c76c90c0a94a6162a452698a47d63fb49fa2ed866e4ad8917\": container with ID starting with ecce7d5cc120360c76c90c0a94a6162a452698a47d63fb49fa2ed866e4ad8917 not found: ID does not exist" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.739003 4962 scope.go:117] "RemoveContainer" containerID="cdb5b15ea05e323a5f856da44e27bde808d02494e9d53ffb4bc777be963ee11a" Feb 20 10:18:20 crc kubenswrapper[4962]: E0220 10:18:20.739389 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cdb5b15ea05e323a5f856da44e27bde808d02494e9d53ffb4bc777be963ee11a\": container with ID starting with cdb5b15ea05e323a5f856da44e27bde808d02494e9d53ffb4bc777be963ee11a not found: ID does not exist" containerID="cdb5b15ea05e323a5f856da44e27bde808d02494e9d53ffb4bc777be963ee11a" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.739411 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cdb5b15ea05e323a5f856da44e27bde808d02494e9d53ffb4bc777be963ee11a"} err="failed to get container status \"cdb5b15ea05e323a5f856da44e27bde808d02494e9d53ffb4bc777be963ee11a\": rpc error: code = NotFound desc = could not find container \"cdb5b15ea05e323a5f856da44e27bde808d02494e9d53ffb4bc777be963ee11a\": container with ID starting with cdb5b15ea05e323a5f856da44e27bde808d02494e9d53ffb4bc777be963ee11a not found: ID does not exist" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.739428 4962 scope.go:117] "RemoveContainer" containerID="f6ebb23a577e121e067e03133802f0cd7183161a54f98c2902a217045cadf308" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.750039 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d203fc44-5252-4dd2-98ae-66f9c139b5f5-credential-keys\") pod \"d203fc44-5252-4dd2-98ae-66f9c139b5f5\" (UID: \"d203fc44-5252-4dd2-98ae-66f9c139b5f5\") " Feb 20 10:18:20 crc 
kubenswrapper[4962]: I0220 10:18:20.750096 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/815f0ef8-a30a-4467-bb56-ff8499a4be44-combined-ca-bundle\") pod \"815f0ef8-a30a-4467-bb56-ff8499a4be44\" (UID: \"815f0ef8-a30a-4467-bb56-ff8499a4be44\") " Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.750147 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/815f0ef8-a30a-4467-bb56-ff8499a4be44-config-data\") pod \"815f0ef8-a30a-4467-bb56-ff8499a4be44\" (UID: \"815f0ef8-a30a-4467-bb56-ff8499a4be44\") " Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.750215 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d203fc44-5252-4dd2-98ae-66f9c139b5f5-fernet-keys\") pod \"d203fc44-5252-4dd2-98ae-66f9c139b5f5\" (UID: \"d203fc44-5252-4dd2-98ae-66f9c139b5f5\") " Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.750258 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d203fc44-5252-4dd2-98ae-66f9c139b5f5-config-data\") pod \"d203fc44-5252-4dd2-98ae-66f9c139b5f5\" (UID: \"d203fc44-5252-4dd2-98ae-66f9c139b5f5\") " Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.750289 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d203fc44-5252-4dd2-98ae-66f9c139b5f5-combined-ca-bundle\") pod \"d203fc44-5252-4dd2-98ae-66f9c139b5f5\" (UID: \"d203fc44-5252-4dd2-98ae-66f9c139b5f5\") " Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.750322 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hql7h\" (UniqueName: \"kubernetes.io/projected/815f0ef8-a30a-4467-bb56-ff8499a4be44-kube-api-access-hql7h\") pod \"815f0ef8-a30a-4467-bb56-ff8499a4be44\" (UID: \"815f0ef8-a30a-4467-bb56-ff8499a4be44\") " Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.750348 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d203fc44-5252-4dd2-98ae-66f9c139b5f5-internal-tls-certs\") pod \"d203fc44-5252-4dd2-98ae-66f9c139b5f5\" (UID: \"d203fc44-5252-4dd2-98ae-66f9c139b5f5\") " Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.750377 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d203fc44-5252-4dd2-98ae-66f9c139b5f5-scripts\") pod \"d203fc44-5252-4dd2-98ae-66f9c139b5f5\" (UID: \"d203fc44-5252-4dd2-98ae-66f9c139b5f5\") " Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.750540 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d203fc44-5252-4dd2-98ae-66f9c139b5f5-public-tls-certs\") pod \"d203fc44-5252-4dd2-98ae-66f9c139b5f5\" (UID: \"d203fc44-5252-4dd2-98ae-66f9c139b5f5\") " Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.750610 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzvw9\" (UniqueName: \"kubernetes.io/projected/d203fc44-5252-4dd2-98ae-66f9c139b5f5-kube-api-access-gzvw9\") pod \"d203fc44-5252-4dd2-98ae-66f9c139b5f5\" (UID: \"d203fc44-5252-4dd2-98ae-66f9c139b5f5\") " Feb 20 10:18:20 crc 
kubenswrapper[4962]: I0220 10:18:20.750906 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcd02115-2eb9-4090-8225-108c3a8cad20-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.750924 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcd02115-2eb9-4090-8225-108c3a8cad20-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.750934 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qvxk6\" (UniqueName: \"kubernetes.io/projected/dcd02115-2eb9-4090-8225-108c3a8cad20-kube-api-access-qvxk6\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.756113 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d203fc44-5252-4dd2-98ae-66f9c139b5f5-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "d203fc44-5252-4dd2-98ae-66f9c139b5f5" (UID: "d203fc44-5252-4dd2-98ae-66f9c139b5f5"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.756168 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d203fc44-5252-4dd2-98ae-66f9c139b5f5-kube-api-access-gzvw9" (OuterVolumeSpecName: "kube-api-access-gzvw9") pod "d203fc44-5252-4dd2-98ae-66f9c139b5f5" (UID: "d203fc44-5252-4dd2-98ae-66f9c139b5f5"). InnerVolumeSpecName "kube-api-access-gzvw9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.756182 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d203fc44-5252-4dd2-98ae-66f9c139b5f5-scripts" (OuterVolumeSpecName: "scripts") pod "d203fc44-5252-4dd2-98ae-66f9c139b5f5" (UID: "d203fc44-5252-4dd2-98ae-66f9c139b5f5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.756208 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/815f0ef8-a30a-4467-bb56-ff8499a4be44-kube-api-access-hql7h" (OuterVolumeSpecName: "kube-api-access-hql7h") pod "815f0ef8-a30a-4467-bb56-ff8499a4be44" (UID: "815f0ef8-a30a-4467-bb56-ff8499a4be44"). InnerVolumeSpecName "kube-api-access-hql7h". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.756234 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d203fc44-5252-4dd2-98ae-66f9c139b5f5-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "d203fc44-5252-4dd2-98ae-66f9c139b5f5" (UID: "d203fc44-5252-4dd2-98ae-66f9c139b5f5"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.765209 4962 scope.go:117] "RemoveContainer" containerID="1dd7b2604194fcf6002518bb647f90f19a0a23390f083313c0f1248bafe3c51e" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.772450 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/815f0ef8-a30a-4467-bb56-ff8499a4be44-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "815f0ef8-a30a-4467-bb56-ff8499a4be44" (UID: "815f0ef8-a30a-4467-bb56-ff8499a4be44"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.773072 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d203fc44-5252-4dd2-98ae-66f9c139b5f5-config-data" (OuterVolumeSpecName: "config-data") pod "d203fc44-5252-4dd2-98ae-66f9c139b5f5" (UID: "d203fc44-5252-4dd2-98ae-66f9c139b5f5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.778349 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/815f0ef8-a30a-4467-bb56-ff8499a4be44-config-data" (OuterVolumeSpecName: "config-data") pod "815f0ef8-a30a-4467-bb56-ff8499a4be44" (UID: "815f0ef8-a30a-4467-bb56-ff8499a4be44"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.783907 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d203fc44-5252-4dd2-98ae-66f9c139b5f5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d203fc44-5252-4dd2-98ae-66f9c139b5f5" (UID: "d203fc44-5252-4dd2-98ae-66f9c139b5f5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.787925 4962 scope.go:117] "RemoveContainer" containerID="f6ebb23a577e121e067e03133802f0cd7183161a54f98c2902a217045cadf308" Feb 20 10:18:20 crc kubenswrapper[4962]: E0220 10:18:20.788502 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6ebb23a577e121e067e03133802f0cd7183161a54f98c2902a217045cadf308\": container with ID starting with f6ebb23a577e121e067e03133802f0cd7183161a54f98c2902a217045cadf308 not found: ID does not exist" containerID="f6ebb23a577e121e067e03133802f0cd7183161a54f98c2902a217045cadf308" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.788547 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6ebb23a577e121e067e03133802f0cd7183161a54f98c2902a217045cadf308"} err="failed to get container status \"f6ebb23a577e121e067e03133802f0cd7183161a54f98c2902a217045cadf308\": rpc error: code = NotFound desc = could not find container \"f6ebb23a577e121e067e03133802f0cd7183161a54f98c2902a217045cadf308\": container with ID starting with f6ebb23a577e121e067e03133802f0cd7183161a54f98c2902a217045cadf308 not found: ID does not exist" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.788606 4962 scope.go:117] "RemoveContainer" containerID="1dd7b2604194fcf6002518bb647f90f19a0a23390f083313c0f1248bafe3c51e" Feb 20 10:18:20 crc kubenswrapper[4962]: E0220 10:18:20.789050 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1dd7b2604194fcf6002518bb647f90f19a0a23390f083313c0f1248bafe3c51e\": container with ID starting with 1dd7b2604194fcf6002518bb647f90f19a0a23390f083313c0f1248bafe3c51e not found: ID does not exist" containerID="1dd7b2604194fcf6002518bb647f90f19a0a23390f083313c0f1248bafe3c51e" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.789079 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1dd7b2604194fcf6002518bb647f90f19a0a23390f083313c0f1248bafe3c51e"} err="failed to get container status 
\"1dd7b2604194fcf6002518bb647f90f19a0a23390f083313c0f1248bafe3c51e\": rpc error: code = NotFound desc = could not find container \"1dd7b2604194fcf6002518bb647f90f19a0a23390f083313c0f1248bafe3c51e\": container with ID starting with 1dd7b2604194fcf6002518bb647f90f19a0a23390f083313c0f1248bafe3c51e not found: ID does not exist" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.795820 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d203fc44-5252-4dd2-98ae-66f9c139b5f5-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "d203fc44-5252-4dd2-98ae-66f9c139b5f5" (UID: "d203fc44-5252-4dd2-98ae-66f9c139b5f5"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.802497 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d203fc44-5252-4dd2-98ae-66f9c139b5f5-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "d203fc44-5252-4dd2-98ae-66f9c139b5f5" (UID: "d203fc44-5252-4dd2-98ae-66f9c139b5f5"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.853128 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d203fc44-5252-4dd2-98ae-66f9c139b5f5-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.853245 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d203fc44-5252-4dd2-98ae-66f9c139b5f5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.853298 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hql7h\" (UniqueName: \"kubernetes.io/projected/815f0ef8-a30a-4467-bb56-ff8499a4be44-kube-api-access-hql7h\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.853310 4962 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d203fc44-5252-4dd2-98ae-66f9c139b5f5-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.853320 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d203fc44-5252-4dd2-98ae-66f9c139b5f5-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.853328 4962 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d203fc44-5252-4dd2-98ae-66f9c139b5f5-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.853339 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gzvw9\" (UniqueName: \"kubernetes.io/projected/d203fc44-5252-4dd2-98ae-66f9c139b5f5-kube-api-access-gzvw9\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.853349 4962 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d203fc44-5252-4dd2-98ae-66f9c139b5f5-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.853360 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/815f0ef8-a30a-4467-bb56-ff8499a4be44-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.853368 4962 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/815f0ef8-a30a-4467-bb56-ff8499a4be44-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:20 crc kubenswrapper[4962]: I0220 10:18:20.853378 4962 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d203fc44-5252-4dd2-98ae-66f9c139b5f5-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:21 crc kubenswrapper[4962]: I0220 10:18:21.161445 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0991ff2f-16e5-4891-a38d-8cb9e4b016ec" path="/var/lib/kubelet/pods/0991ff2f-16e5-4891-a38d-8cb9e4b016ec/volumes" Feb 20 10:18:21 crc kubenswrapper[4962]: I0220 10:18:21.162544 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a8d652d-aea8-4a83-b33e-0d2522af0be8" path="/var/lib/kubelet/pods/2a8d652d-aea8-4a83-b33e-0d2522af0be8/volumes" Feb 20 10:18:21 crc kubenswrapper[4962]: I0220 10:18:21.164285 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56a77dd3-ef10-46a6-a00d-ab38af0d4338" path="/var/lib/kubelet/pods/56a77dd3-ef10-46a6-a00d-ab38af0d4338/volumes" Feb 20 10:18:21 crc kubenswrapper[4962]: I0220 10:18:21.166917 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e766bfd-869d-43ca-bf11-cf4ec9fa253a" path="/var/lib/kubelet/pods/6e766bfd-869d-43ca-bf11-cf4ec9fa253a/volumes" Feb 20 10:18:21 crc kubenswrapper[4962]: I0220 10:18:21.168495 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="812fea74-e4e5-4550-8a20-8fe04752a016" path="/var/lib/kubelet/pods/812fea74-e4e5-4550-8a20-8fe04752a016/volumes" Feb 20 10:18:21 crc kubenswrapper[4962]: I0220 10:18:21.169964 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fae69c76-754d-4125-a405-23a3938e90a9" path="/var/lib/kubelet/pods/fae69c76-754d-4125-a405-23a3938e90a9/volumes" Feb 20 10:18:21 crc kubenswrapper[4962]: I0220 10:18:21.462202 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6b4c54c5d9-pqd8r" event={"ID":"d203fc44-5252-4dd2-98ae-66f9c139b5f5","Type":"ContainerDied","Data":"cf5b12fd788026ff0304070e27b4ebd505b31b0d0e831a4ccd6e51bd8bb0b383"} Feb 20 10:18:21 crc kubenswrapper[4962]: I0220 10:18:21.462469 4962 scope.go:117] "RemoveContainer" containerID="57e3b54a0aaa3e8886ac13c31c98adf640a3207944f14271a7e3dbd0e513db14" Feb 20 10:18:21 crc kubenswrapper[4962]: I0220 10:18:21.462222 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6b4c54c5d9-pqd8r" Feb 20 10:18:21 crc kubenswrapper[4962]: I0220 10:18:21.474528 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"815f0ef8-a30a-4467-bb56-ff8499a4be44","Type":"ContainerDied","Data":"5925bb54309b7a0a7036656c54ac3f8deef63680ce4f7825beb5965502489453"} Feb 20 10:18:21 crc kubenswrapper[4962]: I0220 10:18:21.474627 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 20 10:18:21 crc kubenswrapper[4962]: I0220 10:18:21.478989 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"dcd02115-2eb9-4090-8225-108c3a8cad20","Type":"ContainerDied","Data":"a6be5e2b469a4dd84e09bc3f569eccb10479b9448520269901b4d42cca661dde"} Feb 20 10:18:21 crc kubenswrapper[4962]: I0220 10:18:21.479082 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 20 10:18:21 crc kubenswrapper[4962]: I0220 10:18:21.495126 4962 scope.go:117] "RemoveContainer" containerID="5986cb792b03a6e15f31fe7f4e91ccaa3ff2a4c360820798809c00e91587dc69" Feb 20 10:18:21 crc kubenswrapper[4962]: I0220 10:18:21.508391 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-6b4c54c5d9-pqd8r"] Feb 20 10:18:21 crc kubenswrapper[4962]: I0220 10:18:21.524379 4962 scope.go:117] "RemoveContainer" containerID="24e611c94f3db833be2f4d2218a68d358affbfa3d1fc3a15c508caceb7974666" Feb 20 10:18:21 crc kubenswrapper[4962]: I0220 10:18:21.529790 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-6b4c54c5d9-pqd8r"] Feb 20 10:18:21 crc kubenswrapper[4962]: I0220 10:18:21.554666 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 20 10:18:21 crc kubenswrapper[4962]: I0220 10:18:21.569653 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 20 10:18:21 crc kubenswrapper[4962]: I0220 10:18:21.576668 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 10:18:21 crc kubenswrapper[4962]: I0220 10:18:21.579084 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 10:18:23 crc kubenswrapper[4962]: I0220 10:18:23.178902 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="815f0ef8-a30a-4467-bb56-ff8499a4be44" path="/var/lib/kubelet/pods/815f0ef8-a30a-4467-bb56-ff8499a4be44/volumes" Feb 20 10:18:23 crc kubenswrapper[4962]: I0220 10:18:23.182938 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d203fc44-5252-4dd2-98ae-66f9c139b5f5" path="/var/lib/kubelet/pods/d203fc44-5252-4dd2-98ae-66f9c139b5f5/volumes" Feb 20 10:18:23 crc kubenswrapper[4962]: I0220 10:18:23.184437 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dcd02115-2eb9-4090-8225-108c3a8cad20" path="/var/lib/kubelet/pods/dcd02115-2eb9-4090-8225-108c3a8cad20/volumes" Feb 20 10:18:24 crc kubenswrapper[4962]: E0220 10:18:24.257916 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2 is running failed: container process not found" containerID="0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 20 10:18:24 crc kubenswrapper[4962]: E0220 10:18:24.260081 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2 is running failed: container process not found" containerID="0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 20 
10:18:24 crc kubenswrapper[4962]: E0220 10:18:24.260272 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fbca6026ebd221992e1ebc24844b7bb1692f49e72896c063a823730a2cadaf38" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 20 10:18:24 crc kubenswrapper[4962]: E0220 10:18:24.262066 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2 is running failed: container process not found" containerID="0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 20 10:18:24 crc kubenswrapper[4962]: E0220 10:18:24.262153 4962 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-r7g9h" podUID="8e8425d5-32be-4726-915a-3de5c70f0f62" containerName="ovsdb-server" Feb 20 10:18:24 crc kubenswrapper[4962]: E0220 10:18:24.266159 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fbca6026ebd221992e1ebc24844b7bb1692f49e72896c063a823730a2cadaf38" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 20 10:18:24 crc kubenswrapper[4962]: E0220 10:18:24.268085 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fbca6026ebd221992e1ebc24844b7bb1692f49e72896c063a823730a2cadaf38" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 20 10:18:24 crc kubenswrapper[4962]: E0220 10:18:24.268138 4962 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-r7g9h" podUID="8e8425d5-32be-4726-915a-3de5c70f0f62" containerName="ovs-vswitchd" Feb 20 10:18:29 crc kubenswrapper[4962]: E0220 10:18:29.258277 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2 is running failed: container process not found" containerID="0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 20 10:18:29 crc kubenswrapper[4962]: E0220 10:18:29.259572 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2 is running failed: container process not found" containerID="0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 20 10:18:29 crc kubenswrapper[4962]: E0220 10:18:29.260069 4962 log.go:32] "ExecSync cmd from 
runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2 is running failed: container process not found" containerID="0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 20 10:18:29 crc kubenswrapper[4962]: E0220 10:18:29.260173 4962 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-r7g9h" podUID="8e8425d5-32be-4726-915a-3de5c70f0f62" containerName="ovsdb-server" Feb 20 10:18:29 crc kubenswrapper[4962]: E0220 10:18:29.262424 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fbca6026ebd221992e1ebc24844b7bb1692f49e72896c063a823730a2cadaf38" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 20 10:18:29 crc kubenswrapper[4962]: E0220 10:18:29.268881 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fbca6026ebd221992e1ebc24844b7bb1692f49e72896c063a823730a2cadaf38" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 20 10:18:29 crc kubenswrapper[4962]: E0220 10:18:29.276743 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fbca6026ebd221992e1ebc24844b7bb1692f49e72896c063a823730a2cadaf38" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 20 10:18:29 crc kubenswrapper[4962]: E0220 10:18:29.276818 4962 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-r7g9h" podUID="8e8425d5-32be-4726-915a-3de5c70f0f62" containerName="ovs-vswitchd" Feb 20 10:18:31 crc kubenswrapper[4962]: I0220 10:18:31.599705 4962 generic.go:334] "Generic (PLEG): container finished" podID="a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3" containerID="45e715a9f15469232fd9eda659480065c452b6d474e0d50459f16eb16fcf18e3" exitCode=0 Feb 20 10:18:31 crc kubenswrapper[4962]: I0220 10:18:31.599822 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5dfd6b5f7f-dkfsl" event={"ID":"a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3","Type":"ContainerDied","Data":"45e715a9f15469232fd9eda659480065c452b6d474e0d50459f16eb16fcf18e3"} Feb 20 10:18:32 crc kubenswrapper[4962]: I0220 10:18:32.162580 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5dfd6b5f7f-dkfsl" Feb 20 10:18:32 crc kubenswrapper[4962]: I0220 10:18:32.301259 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3-ovndb-tls-certs\") pod \"a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3\" (UID: \"a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3\") " Feb 20 10:18:32 crc kubenswrapper[4962]: I0220 10:18:32.301300 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3-internal-tls-certs\") pod \"a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3\" (UID: \"a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3\") " Feb 20 10:18:32 crc kubenswrapper[4962]: I0220 10:18:32.301328 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6kdsn\" (UniqueName: \"kubernetes.io/projected/a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3-kube-api-access-6kdsn\") pod \"a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3\" (UID: \"a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3\") " Feb 20 10:18:32 crc kubenswrapper[4962]: I0220 10:18:32.301360 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3-combined-ca-bundle\") pod \"a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3\" (UID: \"a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3\") " Feb 20 10:18:32 crc kubenswrapper[4962]: I0220 10:18:32.301390 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3-httpd-config\") pod \"a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3\" (UID: \"a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3\") " Feb 20 10:18:32 crc kubenswrapper[4962]: I0220 10:18:32.301474 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3-public-tls-certs\") pod \"a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3\" (UID: \"a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3\") " Feb 20 10:18:32 crc kubenswrapper[4962]: I0220 10:18:32.301512 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3-config\") pod \"a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3\" (UID: \"a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3\") " Feb 20 10:18:32 crc kubenswrapper[4962]: I0220 10:18:32.316771 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3" (UID: "a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:32 crc kubenswrapper[4962]: I0220 10:18:32.317359 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3-kube-api-access-6kdsn" (OuterVolumeSpecName: "kube-api-access-6kdsn") pod "a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3" (UID: "a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3"). InnerVolumeSpecName "kube-api-access-6kdsn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:18:32 crc kubenswrapper[4962]: I0220 10:18:32.342173 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3-config" (OuterVolumeSpecName: "config") pod "a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3" (UID: "a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:32 crc kubenswrapper[4962]: I0220 10:18:32.342468 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3" (UID: "a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:32 crc kubenswrapper[4962]: I0220 10:18:32.354314 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3" (UID: "a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:32 crc kubenswrapper[4962]: I0220 10:18:32.375747 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3" (UID: "a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:32 crc kubenswrapper[4962]: I0220 10:18:32.385819 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3" (UID: "a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:32 crc kubenswrapper[4962]: I0220 10:18:32.403432 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3-config\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:32 crc kubenswrapper[4962]: I0220 10:18:32.403462 4962 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:32 crc kubenswrapper[4962]: I0220 10:18:32.403472 4962 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:32 crc kubenswrapper[4962]: I0220 10:18:32.403481 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6kdsn\" (UniqueName: \"kubernetes.io/projected/a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3-kube-api-access-6kdsn\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:32 crc kubenswrapper[4962]: I0220 10:18:32.403491 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:32 crc kubenswrapper[4962]: I0220 10:18:32.403500 4962 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:32 crc kubenswrapper[4962]: I0220 10:18:32.403508 4962 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:32 crc kubenswrapper[4962]: I0220 10:18:32.617526 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5dfd6b5f7f-dkfsl" event={"ID":"a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3","Type":"ContainerDied","Data":"4d1717c6f2d95b6886c02fd175b76f1f1a5915a4672d75c1da15401a0d992411"} Feb 20 10:18:32 crc kubenswrapper[4962]: I0220 10:18:32.617652 4962 scope.go:117] "RemoveContainer" containerID="731c2e1dae94781e12c80ac05ffd0b3634529739ec574c2b3459d53ff4dd175f" Feb 20 10:18:32 crc kubenswrapper[4962]: I0220 10:18:32.617668 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5dfd6b5f7f-dkfsl" Feb 20 10:18:32 crc kubenswrapper[4962]: I0220 10:18:32.713082 4962 scope.go:117] "RemoveContainer" containerID="45e715a9f15469232fd9eda659480065c452b6d474e0d50459f16eb16fcf18e3" Feb 20 10:18:32 crc kubenswrapper[4962]: I0220 10:18:32.717119 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5dfd6b5f7f-dkfsl"] Feb 20 10:18:32 crc kubenswrapper[4962]: I0220 10:18:32.727100 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5dfd6b5f7f-dkfsl"] Feb 20 10:18:33 crc kubenswrapper[4962]: I0220 10:18:33.169681 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3" path="/var/lib/kubelet/pods/a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3/volumes" Feb 20 10:18:34 crc kubenswrapper[4962]: E0220 10:18:34.258010 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2 is running failed: container process not found" containerID="0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 20 10:18:34 crc kubenswrapper[4962]: E0220 10:18:34.259299 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2 is running failed: container process not found" containerID="0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 20 10:18:34 crc kubenswrapper[4962]: E0220 10:18:34.260120 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2 is running failed: container process not found" containerID="0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 20 10:18:34 crc kubenswrapper[4962]: E0220 10:18:34.260202 4962 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-r7g9h" podUID="8e8425d5-32be-4726-915a-3de5c70f0f62" containerName="ovsdb-server" Feb 20 10:18:34 crc kubenswrapper[4962]: E0220 10:18:34.260431 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fbca6026ebd221992e1ebc24844b7bb1692f49e72896c063a823730a2cadaf38" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 20 10:18:34 crc kubenswrapper[4962]: E0220 10:18:34.262427 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fbca6026ebd221992e1ebc24844b7bb1692f49e72896c063a823730a2cadaf38" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 20 10:18:34 crc kubenswrapper[4962]: E0220 
10:18:34.264713 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fbca6026ebd221992e1ebc24844b7bb1692f49e72896c063a823730a2cadaf38" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 20 10:18:34 crc kubenswrapper[4962]: E0220 10:18:34.264768 4962 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-r7g9h" podUID="8e8425d5-32be-4726-915a-3de5c70f0f62" containerName="ovs-vswitchd" Feb 20 10:18:39 crc kubenswrapper[4962]: E0220 10:18:39.257997 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2 is running failed: container process not found" containerID="0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 20 10:18:39 crc kubenswrapper[4962]: E0220 10:18:39.259091 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2 is running failed: container process not found" containerID="0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 20 10:18:39 crc kubenswrapper[4962]: E0220 10:18:39.260004 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2 is running failed: container process not found" containerID="0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 20 10:18:39 crc kubenswrapper[4962]: E0220 10:18:39.260054 4962 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-r7g9h" podUID="8e8425d5-32be-4726-915a-3de5c70f0f62" containerName="ovsdb-server" Feb 20 10:18:39 crc kubenswrapper[4962]: E0220 10:18:39.261094 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fbca6026ebd221992e1ebc24844b7bb1692f49e72896c063a823730a2cadaf38" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 20 10:18:39 crc kubenswrapper[4962]: E0220 10:18:39.263495 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fbca6026ebd221992e1ebc24844b7bb1692f49e72896c063a823730a2cadaf38" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 20 10:18:39 crc kubenswrapper[4962]: E0220 10:18:39.265840 4962 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code 
= Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fbca6026ebd221992e1ebc24844b7bb1692f49e72896c063a823730a2cadaf38" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 20 10:18:39 crc kubenswrapper[4962]: E0220 10:18:39.265959 4962 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-r7g9h" podUID="8e8425d5-32be-4726-915a-3de5c70f0f62" containerName="ovs-vswitchd" Feb 20 10:18:40 crc kubenswrapper[4962]: I0220 10:18:40.739965 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-r7g9h_8e8425d5-32be-4726-915a-3de5c70f0f62/ovs-vswitchd/0.log" Feb 20 10:18:40 crc kubenswrapper[4962]: I0220 10:18:40.742322 4962 generic.go:334] "Generic (PLEG): container finished" podID="8e8425d5-32be-4726-915a-3de5c70f0f62" containerID="fbca6026ebd221992e1ebc24844b7bb1692f49e72896c063a823730a2cadaf38" exitCode=137 Feb 20 10:18:40 crc kubenswrapper[4962]: I0220 10:18:40.742370 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-r7g9h" event={"ID":"8e8425d5-32be-4726-915a-3de5c70f0f62","Type":"ContainerDied","Data":"fbca6026ebd221992e1ebc24844b7bb1692f49e72896c063a823730a2cadaf38"} Feb 20 10:18:41 crc kubenswrapper[4962]: I0220 10:18:41.125832 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-r7g9h_8e8425d5-32be-4726-915a-3de5c70f0f62/ovs-vswitchd/0.log" Feb 20 10:18:41 crc kubenswrapper[4962]: I0220 10:18:41.127274 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-r7g9h" Feb 20 10:18:41 crc kubenswrapper[4962]: I0220 10:18:41.274227 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/8e8425d5-32be-4726-915a-3de5c70f0f62-var-log\") pod \"8e8425d5-32be-4726-915a-3de5c70f0f62\" (UID: \"8e8425d5-32be-4726-915a-3de5c70f0f62\") " Feb 20 10:18:41 crc kubenswrapper[4962]: I0220 10:18:41.274353 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8e8425d5-32be-4726-915a-3de5c70f0f62-var-run\") pod \"8e8425d5-32be-4726-915a-3de5c70f0f62\" (UID: \"8e8425d5-32be-4726-915a-3de5c70f0f62\") " Feb 20 10:18:41 crc kubenswrapper[4962]: I0220 10:18:41.274391 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/8e8425d5-32be-4726-915a-3de5c70f0f62-etc-ovs\") pod \"8e8425d5-32be-4726-915a-3de5c70f0f62\" (UID: \"8e8425d5-32be-4726-915a-3de5c70f0f62\") " Feb 20 10:18:41 crc kubenswrapper[4962]: I0220 10:18:41.274439 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8e8425d5-32be-4726-915a-3de5c70f0f62-scripts\") pod \"8e8425d5-32be-4726-915a-3de5c70f0f62\" (UID: \"8e8425d5-32be-4726-915a-3de5c70f0f62\") " Feb 20 10:18:41 crc kubenswrapper[4962]: I0220 10:18:41.274486 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z44jc\" (UniqueName: \"kubernetes.io/projected/8e8425d5-32be-4726-915a-3de5c70f0f62-kube-api-access-z44jc\") pod \"8e8425d5-32be-4726-915a-3de5c70f0f62\" (UID: \"8e8425d5-32be-4726-915a-3de5c70f0f62\") " 
Feb 20 10:18:41 crc kubenswrapper[4962]: I0220 10:18:41.274459 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e8425d5-32be-4726-915a-3de5c70f0f62-var-log" (OuterVolumeSpecName: "var-log") pod "8e8425d5-32be-4726-915a-3de5c70f0f62" (UID: "8e8425d5-32be-4726-915a-3de5c70f0f62"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 10:18:41 crc kubenswrapper[4962]: I0220 10:18:41.274546 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/8e8425d5-32be-4726-915a-3de5c70f0f62-var-lib\") pod \"8e8425d5-32be-4726-915a-3de5c70f0f62\" (UID: \"8e8425d5-32be-4726-915a-3de5c70f0f62\") " Feb 20 10:18:41 crc kubenswrapper[4962]: I0220 10:18:41.274586 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e8425d5-32be-4726-915a-3de5c70f0f62-etc-ovs" (OuterVolumeSpecName: "etc-ovs") pod "8e8425d5-32be-4726-915a-3de5c70f0f62" (UID: "8e8425d5-32be-4726-915a-3de5c70f0f62"). InnerVolumeSpecName "etc-ovs". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 10:18:41 crc kubenswrapper[4962]: I0220 10:18:41.274681 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e8425d5-32be-4726-915a-3de5c70f0f62-var-run" (OuterVolumeSpecName: "var-run") pod "8e8425d5-32be-4726-915a-3de5c70f0f62" (UID: "8e8425d5-32be-4726-915a-3de5c70f0f62"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 10:18:41 crc kubenswrapper[4962]: I0220 10:18:41.274812 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e8425d5-32be-4726-915a-3de5c70f0f62-var-lib" (OuterVolumeSpecName: "var-lib") pod "8e8425d5-32be-4726-915a-3de5c70f0f62" (UID: "8e8425d5-32be-4726-915a-3de5c70f0f62"). InnerVolumeSpecName "var-lib". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 10:18:41 crc kubenswrapper[4962]: I0220 10:18:41.275003 4962 reconciler_common.go:293] "Volume detached for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/8e8425d5-32be-4726-915a-3de5c70f0f62-var-lib\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:41 crc kubenswrapper[4962]: I0220 10:18:41.275021 4962 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/8e8425d5-32be-4726-915a-3de5c70f0f62-var-log\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:41 crc kubenswrapper[4962]: I0220 10:18:41.275030 4962 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8e8425d5-32be-4726-915a-3de5c70f0f62-var-run\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:41 crc kubenswrapper[4962]: I0220 10:18:41.275041 4962 reconciler_common.go:293] "Volume detached for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/8e8425d5-32be-4726-915a-3de5c70f0f62-etc-ovs\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:41 crc kubenswrapper[4962]: I0220 10:18:41.276428 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e8425d5-32be-4726-915a-3de5c70f0f62-scripts" (OuterVolumeSpecName: "scripts") pod "8e8425d5-32be-4726-915a-3de5c70f0f62" (UID: "8e8425d5-32be-4726-915a-3de5c70f0f62"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:18:41 crc kubenswrapper[4962]: I0220 10:18:41.283498 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e8425d5-32be-4726-915a-3de5c70f0f62-kube-api-access-z44jc" (OuterVolumeSpecName: "kube-api-access-z44jc") pod "8e8425d5-32be-4726-915a-3de5c70f0f62" (UID: "8e8425d5-32be-4726-915a-3de5c70f0f62"). InnerVolumeSpecName "kube-api-access-z44jc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:18:41 crc kubenswrapper[4962]: I0220 10:18:41.383046 4962 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8e8425d5-32be-4726-915a-3de5c70f0f62-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:41 crc kubenswrapper[4962]: I0220 10:18:41.383106 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z44jc\" (UniqueName: \"kubernetes.io/projected/8e8425d5-32be-4726-915a-3de5c70f0f62-kube-api-access-z44jc\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:41 crc kubenswrapper[4962]: I0220 10:18:41.765244 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-r7g9h_8e8425d5-32be-4726-915a-3de5c70f0f62/ovs-vswitchd/0.log" Feb 20 10:18:41 crc kubenswrapper[4962]: I0220 10:18:41.767094 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-r7g9h" event={"ID":"8e8425d5-32be-4726-915a-3de5c70f0f62","Type":"ContainerDied","Data":"b4d03ac8272f687d64246b8c3c40efcac57552a3657ef2ee1db4c3625f47035c"} Feb 20 10:18:41 crc kubenswrapper[4962]: I0220 10:18:41.767185 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-r7g9h" Feb 20 10:18:41 crc kubenswrapper[4962]: I0220 10:18:41.767193 4962 scope.go:117] "RemoveContainer" containerID="fbca6026ebd221992e1ebc24844b7bb1692f49e72896c063a823730a2cadaf38" Feb 20 10:18:41 crc kubenswrapper[4962]: I0220 10:18:41.778957 4962 generic.go:334] "Generic (PLEG): container finished" podID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerID="63c4d35ae203bd5ac342fa6d490352730d135f847a680bbe15aae0fe53059141" exitCode=137 Feb 20 10:18:41 crc kubenswrapper[4962]: I0220 10:18:41.779024 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f4fb3b99-0e02-4c5c-9704-884ea3f0605d","Type":"ContainerDied","Data":"63c4d35ae203bd5ac342fa6d490352730d135f847a680bbe15aae0fe53059141"} Feb 20 10:18:41 crc kubenswrapper[4962]: I0220 10:18:41.800753 4962 scope.go:117] "RemoveContainer" containerID="0720b3e23c471cc00067da0dcc3ef5606323f92f8c20556ce1e295ff4f90dae2" Feb 20 10:18:41 crc kubenswrapper[4962]: I0220 10:18:41.833104 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-r7g9h"] Feb 20 10:18:41 crc kubenswrapper[4962]: I0220 10:18:41.842222 4962 scope.go:117] "RemoveContainer" containerID="b6626b3616a8427737e8c790adcc57ad3f4d0385df8b472ffc49fd4bd021b003" Feb 20 10:18:41 crc kubenswrapper[4962]: I0220 10:18:41.849582 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-ovs-r7g9h"] Feb 20 10:18:41 crc kubenswrapper[4962]: I0220 10:18:41.988853 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Feb 20 10:18:42 crc kubenswrapper[4962]: I0220 10:18:42.092903 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57kkn\" (UniqueName: \"kubernetes.io/projected/f4fb3b99-0e02-4c5c-9704-884ea3f0605d-kube-api-access-57kkn\") pod \"f4fb3b99-0e02-4c5c-9704-884ea3f0605d\" (UID: \"f4fb3b99-0e02-4c5c-9704-884ea3f0605d\") " Feb 20 10:18:42 crc kubenswrapper[4962]: I0220 10:18:42.092984 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4fb3b99-0e02-4c5c-9704-884ea3f0605d-combined-ca-bundle\") pod \"f4fb3b99-0e02-4c5c-9704-884ea3f0605d\" (UID: \"f4fb3b99-0e02-4c5c-9704-884ea3f0605d\") " Feb 20 10:18:42 crc kubenswrapper[4962]: I0220 10:18:42.093060 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/f4fb3b99-0e02-4c5c-9704-884ea3f0605d-lock\") pod \"f4fb3b99-0e02-4c5c-9704-884ea3f0605d\" (UID: \"f4fb3b99-0e02-4c5c-9704-884ea3f0605d\") " Feb 20 10:18:42 crc kubenswrapper[4962]: I0220 10:18:42.093154 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"f4fb3b99-0e02-4c5c-9704-884ea3f0605d\" (UID: \"f4fb3b99-0e02-4c5c-9704-884ea3f0605d\") " Feb 20 10:18:42 crc kubenswrapper[4962]: I0220 10:18:42.093224 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/f4fb3b99-0e02-4c5c-9704-884ea3f0605d-cache\") pod \"f4fb3b99-0e02-4c5c-9704-884ea3f0605d\" (UID: \"f4fb3b99-0e02-4c5c-9704-884ea3f0605d\") " Feb 20 10:18:42 crc kubenswrapper[4962]: I0220 10:18:42.093290 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f4fb3b99-0e02-4c5c-9704-884ea3f0605d-etc-swift\") pod \"f4fb3b99-0e02-4c5c-9704-884ea3f0605d\" (UID: \"f4fb3b99-0e02-4c5c-9704-884ea3f0605d\") " Feb 20 10:18:42 crc kubenswrapper[4962]: I0220 10:18:42.094517 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4fb3b99-0e02-4c5c-9704-884ea3f0605d-lock" (OuterVolumeSpecName: "lock") pod "f4fb3b99-0e02-4c5c-9704-884ea3f0605d" (UID: "f4fb3b99-0e02-4c5c-9704-884ea3f0605d"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:18:42 crc kubenswrapper[4962]: I0220 10:18:42.095117 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4fb3b99-0e02-4c5c-9704-884ea3f0605d-cache" (OuterVolumeSpecName: "cache") pod "f4fb3b99-0e02-4c5c-9704-884ea3f0605d" (UID: "f4fb3b99-0e02-4c5c-9704-884ea3f0605d"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:18:42 crc kubenswrapper[4962]: I0220 10:18:42.100832 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "swift") pod "f4fb3b99-0e02-4c5c-9704-884ea3f0605d" (UID: "f4fb3b99-0e02-4c5c-9704-884ea3f0605d"). InnerVolumeSpecName "local-storage07-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 20 10:18:42 crc kubenswrapper[4962]: I0220 10:18:42.101446 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4fb3b99-0e02-4c5c-9704-884ea3f0605d-kube-api-access-57kkn" (OuterVolumeSpecName: "kube-api-access-57kkn") pod "f4fb3b99-0e02-4c5c-9704-884ea3f0605d" (UID: "f4fb3b99-0e02-4c5c-9704-884ea3f0605d"). InnerVolumeSpecName "kube-api-access-57kkn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:18:42 crc kubenswrapper[4962]: I0220 10:18:42.101898 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4fb3b99-0e02-4c5c-9704-884ea3f0605d-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "f4fb3b99-0e02-4c5c-9704-884ea3f0605d" (UID: "f4fb3b99-0e02-4c5c-9704-884ea3f0605d"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:18:42 crc kubenswrapper[4962]: I0220 10:18:42.195433 4962 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/f4fb3b99-0e02-4c5c-9704-884ea3f0605d-cache\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:42 crc kubenswrapper[4962]: I0220 10:18:42.196530 4962 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f4fb3b99-0e02-4c5c-9704-884ea3f0605d-etc-swift\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:42 crc kubenswrapper[4962]: I0220 10:18:42.196568 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-57kkn\" (UniqueName: \"kubernetes.io/projected/f4fb3b99-0e02-4c5c-9704-884ea3f0605d-kube-api-access-57kkn\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:42 crc kubenswrapper[4962]: I0220 10:18:42.196590 4962 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/f4fb3b99-0e02-4c5c-9704-884ea3f0605d-lock\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:42 crc kubenswrapper[4962]: I0220 10:18:42.196666 4962 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Feb 20 10:18:42 crc kubenswrapper[4962]: I0220 10:18:42.225223 4962 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Feb 20 10:18:42 crc kubenswrapper[4962]: I0220 10:18:42.299077 4962 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:42 crc kubenswrapper[4962]: I0220 10:18:42.457301 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4fb3b99-0e02-4c5c-9704-884ea3f0605d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f4fb3b99-0e02-4c5c-9704-884ea3f0605d" (UID: "f4fb3b99-0e02-4c5c-9704-884ea3f0605d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:18:42 crc kubenswrapper[4962]: I0220 10:18:42.502749 4962 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4fb3b99-0e02-4c5c-9704-884ea3f0605d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:42 crc kubenswrapper[4962]: I0220 10:18:42.798247 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f4fb3b99-0e02-4c5c-9704-884ea3f0605d","Type":"ContainerDied","Data":"7ba9f2cadbb43f65e2484ea2a7184348cefa8eeb550b59455e6840526a2111e5"} Feb 20 10:18:42 crc kubenswrapper[4962]: I0220 10:18:42.798338 4962 scope.go:117] "RemoveContainer" containerID="63c4d35ae203bd5ac342fa6d490352730d135f847a680bbe15aae0fe53059141" Feb 20 10:18:42 crc kubenswrapper[4962]: I0220 10:18:42.798415 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Feb 20 10:18:42 crc kubenswrapper[4962]: I0220 10:18:42.820117 4962 scope.go:117] "RemoveContainer" containerID="3c297c5e3426f0b38076ba12a36de8e42599c1ec9b371d1d4ac3dc87d286fdac" Feb 20 10:18:42 crc kubenswrapper[4962]: I0220 10:18:42.843201 4962 scope.go:117] "RemoveContainer" containerID="6460038d74df47b4bd5e8f877737b675fdcc51257f17732080e42ee0a1e7dfa6" Feb 20 10:18:42 crc kubenswrapper[4962]: I0220 10:18:42.856659 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Feb 20 10:18:42 crc kubenswrapper[4962]: I0220 10:18:42.861461 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-storage-0"] Feb 20 10:18:42 crc kubenswrapper[4962]: I0220 10:18:42.943822 4962 scope.go:117] "RemoveContainer" containerID="3108da3bf591571013cc25e1b8f1de0c827e10b04d9686bc5e1fb47bc9778731" Feb 20 10:18:42 crc kubenswrapper[4962]: I0220 10:18:42.967861 4962 scope.go:117] "RemoveContainer" containerID="3f89270dd151567356dcd4569c268792d8ce043f1e81df07ebe5f55f65531bca" Feb 20 10:18:42 crc kubenswrapper[4962]: I0220 10:18:42.991578 4962 scope.go:117] "RemoveContainer" containerID="05aae6f36e27022f7b4fa526f1265b47aeb3c166ab95c682c5b8f4ac82205eff" Feb 20 10:18:43 crc kubenswrapper[4962]: I0220 10:18:43.016913 4962 scope.go:117] "RemoveContainer" containerID="b6ead0e1bdda64a7399139dd6191cc696b570349bf204a2ab46ce0d182cc49a9" Feb 20 10:18:43 crc kubenswrapper[4962]: I0220 10:18:43.048341 4962 scope.go:117] "RemoveContainer" containerID="87c786369d8da7650fca3be3c67f9a8decb0d8fd88429ab357e31f9e7c19f3e0" Feb 20 10:18:43 crc kubenswrapper[4962]: I0220 10:18:43.075587 4962 scope.go:117] "RemoveContainer" containerID="4bc06842128d6fdcb6b37354d4c5aad1c3642acbd05e513b28a95e6f19bab1ca" Feb 20 10:18:43 crc kubenswrapper[4962]: I0220 10:18:43.101811 4962 scope.go:117] "RemoveContainer" containerID="8395eb871539c46360c6d66fb96850aeed91819306e7873acf83b98b89a956d8" Feb 20 10:18:43 crc kubenswrapper[4962]: I0220 10:18:43.144628 4962 scope.go:117] "RemoveContainer" containerID="6727a65f145335bf540a7898aeabecb549d8d22b6c9a1c79a91620a5e8e3e3f8" Feb 20 10:18:43 crc kubenswrapper[4962]: I0220 10:18:43.154236 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e8425d5-32be-4726-915a-3de5c70f0f62" path="/var/lib/kubelet/pods/8e8425d5-32be-4726-915a-3de5c70f0f62/volumes" Feb 20 10:18:43 crc kubenswrapper[4962]: I0220 10:18:43.155658 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" 
path="/var/lib/kubelet/pods/f4fb3b99-0e02-4c5c-9704-884ea3f0605d/volumes" Feb 20 10:18:43 crc kubenswrapper[4962]: I0220 10:18:43.172294 4962 scope.go:117] "RemoveContainer" containerID="1b0e56a8482d960b0917a1f3004c6a015099a8313a0f5c4fbb4d166f9d4ea11c" Feb 20 10:18:43 crc kubenswrapper[4962]: I0220 10:18:43.203907 4962 scope.go:117] "RemoveContainer" containerID="5d9d68ccd50ca26ce3191d56dc735011eb169a68e6eedc3144c97564be0ff601" Feb 20 10:18:43 crc kubenswrapper[4962]: I0220 10:18:43.233269 4962 scope.go:117] "RemoveContainer" containerID="066dce8eb5ee2a5ee4696fbdc5642875edc121ec4465ea32468ecf8aba5fbe36" Feb 20 10:18:43 crc kubenswrapper[4962]: I0220 10:18:43.258459 4962 scope.go:117] "RemoveContainer" containerID="138e05b5e05f4d5ae28d62c69c931e5b6907fd9792450f37e652add9de1e83a1" Feb 20 10:19:11 crc kubenswrapper[4962]: I0220 10:19:11.508894 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 10:19:11 crc kubenswrapper[4962]: I0220 10:19:11.509432 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 10:19:12 crc kubenswrapper[4962]: I0220 10:19:12.522480 4962 scope.go:117] "RemoveContainer" containerID="109a3b4f30138b426060ee3960875f54b8e50460794fa326f4252e9233232cac" Feb 20 10:19:12 crc kubenswrapper[4962]: I0220 10:19:12.558686 4962 scope.go:117] "RemoveContainer" containerID="8e1e57cd49c915d1862d936053074b6280af762ac9dd3bf4c1c80c561fca009f" Feb 20 10:19:12 crc kubenswrapper[4962]: I0220 10:19:12.600957 4962 scope.go:117] "RemoveContainer" containerID="a637cdafdb841809ec5f95151c668e1d8c78d29aabd8a60383f137a82dcb2009" Feb 20 10:19:12 crc kubenswrapper[4962]: I0220 10:19:12.644545 4962 scope.go:117] "RemoveContainer" containerID="1b60442fa3cb970cd1e3424fd12f2f5e98e959daa205ca4e27a8e01da1487e66" Feb 20 10:19:12 crc kubenswrapper[4962]: I0220 10:19:12.690382 4962 scope.go:117] "RemoveContainer" containerID="1dbe5f8319feef22f1ef43626823510cfe8e71d6a8d49cafca70087ce33b1b60" Feb 20 10:19:41 crc kubenswrapper[4962]: I0220 10:19:41.508451 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 10:19:41 crc kubenswrapper[4962]: I0220 10:19:41.509244 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 10:20:11 crc kubenswrapper[4962]: I0220 10:20:11.508418 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 10:20:11 crc 
kubenswrapper[4962]: I0220 10:20:11.509132 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 10:20:11 crc kubenswrapper[4962]: I0220 10:20:11.509208 4962 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" Feb 20 10:20:11 crc kubenswrapper[4962]: I0220 10:20:11.510376 4962 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"571edd66d03ca20f4984454288fe0f239b478c2bf695a245264962e04c56853e"} pod="openshift-machine-config-operator/machine-config-daemon-m9d46" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 10:20:11 crc kubenswrapper[4962]: I0220 10:20:11.510476 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" containerID="cri-o://571edd66d03ca20f4984454288fe0f239b478c2bf695a245264962e04c56853e" gracePeriod=600 Feb 20 10:20:11 crc kubenswrapper[4962]: E0220 10:20:11.663686 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:20:11 crc kubenswrapper[4962]: I0220 10:20:11.903240 4962 generic.go:334] "Generic (PLEG): container finished" podID="751d5e0b-919c-4777-8475-ed7214f7647f" containerID="571edd66d03ca20f4984454288fe0f239b478c2bf695a245264962e04c56853e" exitCode=0 Feb 20 10:20:11 crc kubenswrapper[4962]: I0220 10:20:11.903315 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" event={"ID":"751d5e0b-919c-4777-8475-ed7214f7647f","Type":"ContainerDied","Data":"571edd66d03ca20f4984454288fe0f239b478c2bf695a245264962e04c56853e"} Feb 20 10:20:11 crc kubenswrapper[4962]: I0220 10:20:11.903370 4962 scope.go:117] "RemoveContainer" containerID="90048224d02357c3a2b79884d1830677ace1a55bff8576575bc2ae41bdccb716" Feb 20 10:20:11 crc kubenswrapper[4962]: I0220 10:20:11.904723 4962 scope.go:117] "RemoveContainer" containerID="571edd66d03ca20f4984454288fe0f239b478c2bf695a245264962e04c56853e" Feb 20 10:20:11 crc kubenswrapper[4962]: E0220 10:20:11.907942 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:20:13 crc kubenswrapper[4962]: I0220 10:20:13.570177 4962 scope.go:117] "RemoveContainer" containerID="06ae0aace60b853c3274af8b59ad6fe8fb46d990b1106c40d7696cbaaa47e13b" Feb 20 10:20:13 crc 
kubenswrapper[4962]: I0220 10:20:13.641560 4962 scope.go:117] "RemoveContainer" containerID="03ab33469ea979640d7188e1c0dc68dd1548a99d601929f7b4e160bee72396f3" Feb 20 10:20:13 crc kubenswrapper[4962]: I0220 10:20:13.677402 4962 scope.go:117] "RemoveContainer" containerID="aa045e6922dfe4d5b86be77916d3a6f56d92ad5d8849a14be83a3fc1d37883cc" Feb 20 10:20:13 crc kubenswrapper[4962]: I0220 10:20:13.704037 4962 scope.go:117] "RemoveContainer" containerID="1255947ebb2d1ff7325c767c453081290e37fc7eec685e64c813cb21e269d2c8" Feb 20 10:20:13 crc kubenswrapper[4962]: I0220 10:20:13.776706 4962 scope.go:117] "RemoveContainer" containerID="ae355b88f320e93105b216772d0d1821b9792d4ee89d86649fd430b7ae19d59e" Feb 20 10:20:13 crc kubenswrapper[4962]: I0220 10:20:13.836113 4962 scope.go:117] "RemoveContainer" containerID="0be6bfc0db94e6c57e1c0a4856d3600b1ea4d12d42a32685b52156cacc1224a0" Feb 20 10:20:13 crc kubenswrapper[4962]: I0220 10:20:13.870863 4962 scope.go:117] "RemoveContainer" containerID="53831e942d8d69707dcfe40655e43c5762a4d492f07b1c79ed7f413953ec5f61" Feb 20 10:20:13 crc kubenswrapper[4962]: I0220 10:20:13.905665 4962 scope.go:117] "RemoveContainer" containerID="1933a4410cc57079acebbf3cca845c0c1a3c75df94daefc5b4a3cc61d913faab" Feb 20 10:20:13 crc kubenswrapper[4962]: I0220 10:20:13.935382 4962 scope.go:117] "RemoveContainer" containerID="5a9782006ca96cee05b8576db8cf67f09117b6ff20027f1e9a751d12df45c5f2" Feb 20 10:20:13 crc kubenswrapper[4962]: I0220 10:20:13.960568 4962 scope.go:117] "RemoveContainer" containerID="a4453e5e140badbab6aa97996c8ab339f8ab22881b41594395bdb84a3005b466" Feb 20 10:20:14 crc kubenswrapper[4962]: I0220 10:20:14.007580 4962 scope.go:117] "RemoveContainer" containerID="1374063c1227f074525aeab9310be0405d817c53450d0331b45011e3f7fb82f7" Feb 20 10:20:14 crc kubenswrapper[4962]: I0220 10:20:14.043575 4962 scope.go:117] "RemoveContainer" containerID="c4884098169c655124365602e35fd187fa28c946c8e4d3fb080909fa29ad7ae0" Feb 20 10:20:14 crc kubenswrapper[4962]: I0220 10:20:14.095869 4962 scope.go:117] "RemoveContainer" containerID="d8f2683b2b57472d95ac6a22ba161803aef705799c500b173956e9aa04929fde" Feb 20 10:20:14 crc kubenswrapper[4962]: I0220 10:20:14.128268 4962 scope.go:117] "RemoveContainer" containerID="7fd77ce11ed465ec4237e46a1c362e414960d4a8e3a2e89e44d3a98f1d109ea9" Feb 20 10:20:14 crc kubenswrapper[4962]: I0220 10:20:14.153182 4962 scope.go:117] "RemoveContainer" containerID="2c027b22cf0ba460d458ecf5143a855bd6cabc995b34bcff27678d1a95ac71b9" Feb 20 10:20:24 crc kubenswrapper[4962]: I0220 10:20:24.138856 4962 scope.go:117] "RemoveContainer" containerID="571edd66d03ca20f4984454288fe0f239b478c2bf695a245264962e04c56853e" Feb 20 10:20:24 crc kubenswrapper[4962]: E0220 10:20:24.139822 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:20:39 crc kubenswrapper[4962]: I0220 10:20:39.146825 4962 scope.go:117] "RemoveContainer" containerID="571edd66d03ca20f4984454288fe0f239b478c2bf695a245264962e04c56853e" Feb 20 10:20:39 crc kubenswrapper[4962]: E0220 10:20:39.150266 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:20:52 crc kubenswrapper[4962]: I0220 10:20:52.139007 4962 scope.go:117] "RemoveContainer" containerID="571edd66d03ca20f4984454288fe0f239b478c2bf695a245264962e04c56853e" Feb 20 10:20:52 crc kubenswrapper[4962]: E0220 10:20:52.141656 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:21:07 crc kubenswrapper[4962]: I0220 10:21:07.139500 4962 scope.go:117] "RemoveContainer" containerID="571edd66d03ca20f4984454288fe0f239b478c2bf695a245264962e04c56853e" Feb 20 10:21:07 crc kubenswrapper[4962]: E0220 10:21:07.140720 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:21:14 crc kubenswrapper[4962]: I0220 10:21:14.498227 4962 scope.go:117] "RemoveContainer" containerID="6f635f1f56319fca1af13c4d65bb4a7c7d012f95348309539e66bb9bc3885680" Feb 20 10:21:14 crc kubenswrapper[4962]: I0220 10:21:14.551317 4962 scope.go:117] "RemoveContainer" containerID="a45c4081d1cfd44304d7f3d8b40910079cb39e233843a73a0bea91a01d00d686" Feb 20 10:21:14 crc kubenswrapper[4962]: I0220 10:21:14.592562 4962 scope.go:117] "RemoveContainer" containerID="c4560e14774e3c9741c91f46ea630363e7cc5935a06c720a5d083bca786e716f" Feb 20 10:21:14 crc kubenswrapper[4962]: I0220 10:21:14.626742 4962 scope.go:117] "RemoveContainer" containerID="f9cb69ce2f5869e2d5aa8f13c96033f3ed4a62ca0344285f07875e14d0de4351" Feb 20 10:21:14 crc kubenswrapper[4962]: I0220 10:21:14.697725 4962 scope.go:117] "RemoveContainer" containerID="bfe2a2311075991b6e26f61913d5319a6a3da98a5127862535ec8779ac2e9fce" Feb 20 10:21:14 crc kubenswrapper[4962]: I0220 10:21:14.725902 4962 scope.go:117] "RemoveContainer" containerID="ce2c059ffddb8a4bd817e0bdef157eb8b02fa711cf3898a972dc3c9f08da8952" Feb 20 10:21:19 crc kubenswrapper[4962]: I0220 10:21:19.146865 4962 scope.go:117] "RemoveContainer" containerID="571edd66d03ca20f4984454288fe0f239b478c2bf695a245264962e04c56853e" Feb 20 10:21:19 crc kubenswrapper[4962]: E0220 10:21:19.149495 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:21:30 crc kubenswrapper[4962]: I0220 10:21:30.139431 4962 scope.go:117] "RemoveContainer" containerID="571edd66d03ca20f4984454288fe0f239b478c2bf695a245264962e04c56853e" Feb 20 
10:21:30 crc kubenswrapper[4962]: E0220 10:21:30.140641 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:21:43 crc kubenswrapper[4962]: I0220 10:21:43.139535 4962 scope.go:117] "RemoveContainer" containerID="571edd66d03ca20f4984454288fe0f239b478c2bf695a245264962e04c56853e" Feb 20 10:21:43 crc kubenswrapper[4962]: E0220 10:21:43.142434 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:21:58 crc kubenswrapper[4962]: I0220 10:21:58.139170 4962 scope.go:117] "RemoveContainer" containerID="571edd66d03ca20f4984454288fe0f239b478c2bf695a245264962e04c56853e" Feb 20 10:21:58 crc kubenswrapper[4962]: E0220 10:21:58.139783 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:22:09 crc kubenswrapper[4962]: I0220 10:22:09.147309 4962 scope.go:117] "RemoveContainer" containerID="571edd66d03ca20f4984454288fe0f239b478c2bf695a245264962e04c56853e" Feb 20 10:22:09 crc kubenswrapper[4962]: E0220 10:22:09.148337 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:22:14 crc kubenswrapper[4962]: I0220 10:22:14.873505 4962 scope.go:117] "RemoveContainer" containerID="bc2106ad3cc4af20a5e2c1213babb01b766f8accacbdaf4870b68d6cbc722d49" Feb 20 10:22:14 crc kubenswrapper[4962]: I0220 10:22:14.909469 4962 scope.go:117] "RemoveContainer" containerID="d5939f243f85e996e6d1902bb72680f0a5c1df9ab42c709cd744434161fb2db0" Feb 20 10:22:14 crc kubenswrapper[4962]: I0220 10:22:14.938490 4962 scope.go:117] "RemoveContainer" containerID="e14d4499aad39130d8942043e6328de4fbc415b007670a0434fde9be884215b2" Feb 20 10:22:14 crc kubenswrapper[4962]: I0220 10:22:14.975867 4962 scope.go:117] "RemoveContainer" containerID="e1788ed30c723d96dcb6e0f9484b28a97145a65cc9e3bff73edd5bbbf2ff0b13" Feb 20 10:22:15 crc kubenswrapper[4962]: I0220 10:22:15.032705 4962 scope.go:117] "RemoveContainer" containerID="3cc79122882da35c12762f52d1de73bf1a9ef430f240e775b44faf92fe147dab" Feb 20 10:22:15 crc kubenswrapper[4962]: I0220 10:22:15.089532 4962 scope.go:117] "RemoveContainer" 
containerID="b8af66136e35fc73b3d51b7e67a7d05bebe4ecd8a5ad20c914388c6152b5d470" Feb 20 10:22:15 crc kubenswrapper[4962]: I0220 10:22:15.114317 4962 scope.go:117] "RemoveContainer" containerID="e3155d74dd6282e1fc794d27b2b712bbcb47529b5f2fbb3e8f768bc271110d45" Feb 20 10:22:15 crc kubenswrapper[4962]: I0220 10:22:15.136861 4962 scope.go:117] "RemoveContainer" containerID="793db344d89e9339466a1f19a2e137b204724f58c385b41b9c74536f0d99e12b" Feb 20 10:22:15 crc kubenswrapper[4962]: I0220 10:22:15.179166 4962 scope.go:117] "RemoveContainer" containerID="0c7306cb64431bbfbfccbec9d4784b736bd29c8703a60d357986ec36fd19a276" Feb 20 10:22:15 crc kubenswrapper[4962]: I0220 10:22:15.203077 4962 scope.go:117] "RemoveContainer" containerID="7230277cc1eb3909a3d3342c6f5ba88bcf14bbf39fe46da73616efba87702b09" Feb 20 10:22:15 crc kubenswrapper[4962]: I0220 10:22:15.229523 4962 scope.go:117] "RemoveContainer" containerID="28df8a32fe5a1bd334afa755bb83b0ac292979f42bd8a6975cdf978af2b8b6b7" Feb 20 10:22:24 crc kubenswrapper[4962]: I0220 10:22:24.138910 4962 scope.go:117] "RemoveContainer" containerID="571edd66d03ca20f4984454288fe0f239b478c2bf695a245264962e04c56853e" Feb 20 10:22:24 crc kubenswrapper[4962]: E0220 10:22:24.139773 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:22:39 crc kubenswrapper[4962]: I0220 10:22:39.146311 4962 scope.go:117] "RemoveContainer" containerID="571edd66d03ca20f4984454288fe0f239b478c2bf695a245264962e04c56853e" Feb 20 10:22:39 crc kubenswrapper[4962]: E0220 10:22:39.147417 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:22:51 crc kubenswrapper[4962]: I0220 10:22:51.139059 4962 scope.go:117] "RemoveContainer" containerID="571edd66d03ca20f4984454288fe0f239b478c2bf695a245264962e04c56853e" Feb 20 10:22:51 crc kubenswrapper[4962]: E0220 10:22:51.140119 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:23:05 crc kubenswrapper[4962]: I0220 10:23:05.139138 4962 scope.go:117] "RemoveContainer" containerID="571edd66d03ca20f4984454288fe0f239b478c2bf695a245264962e04c56853e" Feb 20 10:23:05 crc kubenswrapper[4962]: E0220 10:23:05.140313 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:23:15 crc kubenswrapper[4962]: I0220 10:23:15.403870 4962 scope.go:117] "RemoveContainer" containerID="1eb9947e80af1012b6145dccb54cd11c0689239b2a15c94816fdca73015d8cfe" Feb 20 10:23:15 crc kubenswrapper[4962]: I0220 10:23:15.464453 4962 scope.go:117] "RemoveContainer" containerID="c43d694b0ea8172a2db698ac63ac57a6cb364529c6c87ffb777fc946029b6b2f" Feb 20 10:23:16 crc kubenswrapper[4962]: I0220 10:23:16.139356 4962 scope.go:117] "RemoveContainer" containerID="571edd66d03ca20f4984454288fe0f239b478c2bf695a245264962e04c56853e" Feb 20 10:23:16 crc kubenswrapper[4962]: E0220 10:23:16.139781 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:23:27 crc kubenswrapper[4962]: I0220 10:23:27.139060 4962 scope.go:117] "RemoveContainer" containerID="571edd66d03ca20f4984454288fe0f239b478c2bf695a245264962e04c56853e" Feb 20 10:23:27 crc kubenswrapper[4962]: E0220 10:23:27.139709 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:23:41 crc kubenswrapper[4962]: I0220 10:23:41.140931 4962 scope.go:117] "RemoveContainer" containerID="571edd66d03ca20f4984454288fe0f239b478c2bf695a245264962e04c56853e" Feb 20 10:23:41 crc kubenswrapper[4962]: E0220 10:23:41.142617 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:23:52 crc kubenswrapper[4962]: I0220 10:23:52.139075 4962 scope.go:117] "RemoveContainer" containerID="571edd66d03ca20f4984454288fe0f239b478c2bf695a245264962e04c56853e" Feb 20 10:23:52 crc kubenswrapper[4962]: E0220 10:23:52.140216 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:24:03 crc kubenswrapper[4962]: I0220 10:24:03.139810 4962 scope.go:117] "RemoveContainer" containerID="571edd66d03ca20f4984454288fe0f239b478c2bf695a245264962e04c56853e" Feb 20 10:24:03 crc kubenswrapper[4962]: E0220 10:24:03.140917 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:24:15 crc kubenswrapper[4962]: I0220 10:24:15.139522 4962 scope.go:117] "RemoveContainer" containerID="571edd66d03ca20f4984454288fe0f239b478c2bf695a245264962e04c56853e" Feb 20 10:24:15 crc kubenswrapper[4962]: E0220 10:24:15.140692 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:24:15 crc kubenswrapper[4962]: I0220 10:24:15.557744 4962 scope.go:117] "RemoveContainer" containerID="d6bf8640027e8b75225f36e2b4a5d790818a0e4259c4c5012d627c79a493efb3" Feb 20 10:24:15 crc kubenswrapper[4962]: I0220 10:24:15.586825 4962 scope.go:117] "RemoveContainer" containerID="42f33c3ac4e84257c4f38d060186abe1300d7dfb20f8894c1b519bb38d1529c9" Feb 20 10:24:15 crc kubenswrapper[4962]: I0220 10:24:15.621416 4962 scope.go:117] "RemoveContainer" containerID="f9e4860c3043e0b48490e065c36e81c4bc365aa4ac0725e20676491c7054e577" Feb 20 10:24:26 crc kubenswrapper[4962]: I0220 10:24:26.139541 4962 scope.go:117] "RemoveContainer" containerID="571edd66d03ca20f4984454288fe0f239b478c2bf695a245264962e04c56853e" Feb 20 10:24:26 crc kubenswrapper[4962]: E0220 10:24:26.140664 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:24:38 crc kubenswrapper[4962]: I0220 10:24:38.139209 4962 scope.go:117] "RemoveContainer" containerID="571edd66d03ca20f4984454288fe0f239b478c2bf695a245264962e04c56853e" Feb 20 10:24:38 crc kubenswrapper[4962]: E0220 10:24:38.139962 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:24:50 crc kubenswrapper[4962]: I0220 10:24:50.140172 4962 scope.go:117] "RemoveContainer" containerID="571edd66d03ca20f4984454288fe0f239b478c2bf695a245264962e04c56853e" Feb 20 10:24:50 crc kubenswrapper[4962]: E0220 10:24:50.141502 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:25:04 crc kubenswrapper[4962]: 
I0220 10:25:04.139937 4962 scope.go:117] "RemoveContainer" containerID="571edd66d03ca20f4984454288fe0f239b478c2bf695a245264962e04c56853e" Feb 20 10:25:04 crc kubenswrapper[4962]: E0220 10:25:04.141113 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:25:15 crc kubenswrapper[4962]: I0220 10:25:15.138673 4962 scope.go:117] "RemoveContainer" containerID="571edd66d03ca20f4984454288fe0f239b478c2bf695a245264962e04c56853e" Feb 20 10:25:16 crc kubenswrapper[4962]: I0220 10:25:16.079333 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" event={"ID":"751d5e0b-919c-4777-8475-ed7214f7647f","Type":"ContainerStarted","Data":"c5a3ed5d43365534c80fd6638118a5ca99f999ebea8670342afd3d7c63212fde"} Feb 20 10:27:41 crc kubenswrapper[4962]: I0220 10:27:41.508122 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 10:27:41 crc kubenswrapper[4962]: I0220 10:27:41.509153 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 10:28:11 crc kubenswrapper[4962]: I0220 10:28:11.508042 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 10:28:11 crc kubenswrapper[4962]: I0220 10:28:11.508991 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.065564 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-twkms"] Feb 20 10:28:19 crc kubenswrapper[4962]: E0220 10:28:19.066679 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcd02115-2eb9-4090-8225-108c3a8cad20" containerName="nova-scheduler-scheduler" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.066694 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcd02115-2eb9-4090-8225-108c3a8cad20" containerName="nova-scheduler-scheduler" Feb 20 10:28:19 crc kubenswrapper[4962]: E0220 10:28:19.066706 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a8d652d-aea8-4a83-b33e-0d2522af0be8" containerName="setup-container" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.066712 4962 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="2a8d652d-aea8-4a83-b33e-0d2522af0be8" containerName="setup-container" Feb 20 10:28:19 crc kubenswrapper[4962]: E0220 10:28:19.066720 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="object-server" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.066731 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="object-server" Feb 20 10:28:19 crc kubenswrapper[4962]: E0220 10:28:19.066746 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10c1a487-1a74-4994-9b39-f05cbe0fa5c7" containerName="barbican-api-log" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.066755 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="10c1a487-1a74-4994-9b39-f05cbe0fa5c7" containerName="barbican-api-log" Feb 20 10:28:19 crc kubenswrapper[4962]: E0220 10:28:19.066770 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10c1a487-1a74-4994-9b39-f05cbe0fa5c7" containerName="barbican-api" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.066776 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="10c1a487-1a74-4994-9b39-f05cbe0fa5c7" containerName="barbican-api" Feb 20 10:28:19 crc kubenswrapper[4962]: E0220 10:28:19.066797 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="object-replicator" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.066805 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="object-replicator" Feb 20 10:28:19 crc kubenswrapper[4962]: E0220 10:28:19.066827 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56a77dd3-ef10-46a6-a00d-ab38af0d4338" containerName="setup-container" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.066834 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="56a77dd3-ef10-46a6-a00d-ab38af0d4338" containerName="setup-container" Feb 20 10:28:19 crc kubenswrapper[4962]: E0220 10:28:19.066841 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cffca43e-3e19-4430-8fe2-ca7cfe6229b0" containerName="kube-state-metrics" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.066848 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="cffca43e-3e19-4430-8fe2-ca7cfe6229b0" containerName="kube-state-metrics" Feb 20 10:28:19 crc kubenswrapper[4962]: E0220 10:28:19.066866 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce62af15-166f-4f74-a244-2de5147a4b2f" containerName="nova-cell1-conductor-conductor" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.066873 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce62af15-166f-4f74-a244-2de5147a4b2f" containerName="nova-cell1-conductor-conductor" Feb 20 10:28:19 crc kubenswrapper[4962]: E0220 10:28:19.066892 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e766bfd-869d-43ca-bf11-cf4ec9fa253a" containerName="mysql-bootstrap" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.066898 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e766bfd-869d-43ca-bf11-cf4ec9fa253a" containerName="mysql-bootstrap" Feb 20 10:28:19 crc kubenswrapper[4962]: E0220 10:28:19.066914 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3" containerName="neutron-api" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.066920 4962 
state_mem.go:107] "Deleted CPUSet assignment" podUID="a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3" containerName="neutron-api" Feb 20 10:28:19 crc kubenswrapper[4962]: E0220 10:28:19.066939 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fae69c76-754d-4125-a405-23a3938e90a9" containerName="ceilometer-central-agent" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.066945 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="fae69c76-754d-4125-a405-23a3938e90a9" containerName="ceilometer-central-agent" Feb 20 10:28:19 crc kubenswrapper[4962]: E0220 10:28:19.066956 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e8425d5-32be-4726-915a-3de5c70f0f62" containerName="ovsdb-server" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.066963 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e8425d5-32be-4726-915a-3de5c70f0f62" containerName="ovsdb-server" Feb 20 10:28:19 crc kubenswrapper[4962]: E0220 10:28:19.066970 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b22a9e86-ccdf-4505-8116-21b0230943fc" containerName="memcached" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.066978 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="b22a9e86-ccdf-4505-8116-21b0230943fc" containerName="memcached" Feb 20 10:28:19 crc kubenswrapper[4962]: E0220 10:28:19.066991 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="account-reaper" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.066997 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="account-reaper" Feb 20 10:28:19 crc kubenswrapper[4962]: E0220 10:28:19.067020 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d203fc44-5252-4dd2-98ae-66f9c139b5f5" containerName="keystone-api" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.067025 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="d203fc44-5252-4dd2-98ae-66f9c139b5f5" containerName="keystone-api" Feb 20 10:28:19 crc kubenswrapper[4962]: E0220 10:28:19.067033 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="rsync" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.067039 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="rsync" Feb 20 10:28:19 crc kubenswrapper[4962]: E0220 10:28:19.067049 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="241dc417-3176-4051-ad4e-d98f4f66ddc2" containerName="nova-api-api" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.067056 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="241dc417-3176-4051-ad4e-d98f4f66ddc2" containerName="nova-api-api" Feb 20 10:28:19 crc kubenswrapper[4962]: E0220 10:28:19.067069 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a8d652d-aea8-4a83-b33e-0d2522af0be8" containerName="rabbitmq" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.067075 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a8d652d-aea8-4a83-b33e-0d2522af0be8" containerName="rabbitmq" Feb 20 10:28:19 crc kubenswrapper[4962]: E0220 10:28:19.067087 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fae69c76-754d-4125-a405-23a3938e90a9" containerName="sg-core" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.067093 4962 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="fae69c76-754d-4125-a405-23a3938e90a9" containerName="sg-core" Feb 20 10:28:19 crc kubenswrapper[4962]: E0220 10:28:19.067108 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56a77dd3-ef10-46a6-a00d-ab38af0d4338" containerName="rabbitmq" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.067114 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="56a77dd3-ef10-46a6-a00d-ab38af0d4338" containerName="rabbitmq" Feb 20 10:28:19 crc kubenswrapper[4962]: E0220 10:28:19.067127 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3" containerName="neutron-httpd" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.067135 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3" containerName="neutron-httpd" Feb 20 10:28:19 crc kubenswrapper[4962]: E0220 10:28:19.067143 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fae69c76-754d-4125-a405-23a3938e90a9" containerName="ceilometer-notification-agent" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.067150 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="fae69c76-754d-4125-a405-23a3938e90a9" containerName="ceilometer-notification-agent" Feb 20 10:28:19 crc kubenswrapper[4962]: E0220 10:28:19.067169 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="815f0ef8-a30a-4467-bb56-ff8499a4be44" containerName="nova-cell0-conductor-conductor" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.067175 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="815f0ef8-a30a-4467-bb56-ff8499a4be44" containerName="nova-cell0-conductor-conductor" Feb 20 10:28:19 crc kubenswrapper[4962]: E0220 10:28:19.067192 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="account-replicator" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.067198 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="account-replicator" Feb 20 10:28:19 crc kubenswrapper[4962]: E0220 10:28:19.067209 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="account-auditor" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.067215 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="account-auditor" Feb 20 10:28:19 crc kubenswrapper[4962]: E0220 10:28:19.067228 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f4a409a-4230-42ca-bfcc-f014064cbc6c" containerName="glance-httpd" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.067234 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f4a409a-4230-42ca-bfcc-f014064cbc6c" containerName="glance-httpd" Feb 20 10:28:19 crc kubenswrapper[4962]: E0220 10:28:19.067245 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33d73a04-08b2-4944-861f-749a63c2565d" containerName="ovn-northd" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.067251 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="33d73a04-08b2-4944-861f-749a63c2565d" containerName="ovn-northd" Feb 20 10:28:19 crc kubenswrapper[4962]: E0220 10:28:19.067264 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="object-updater" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.067270 
4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="object-updater" Feb 20 10:28:19 crc kubenswrapper[4962]: E0220 10:28:19.067279 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="object-expirer" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.067287 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="object-expirer" Feb 20 10:28:19 crc kubenswrapper[4962]: E0220 10:28:19.067294 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="241dc417-3176-4051-ad4e-d98f4f66ddc2" containerName="nova-api-log" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.067299 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="241dc417-3176-4051-ad4e-d98f4f66ddc2" containerName="nova-api-log" Feb 20 10:28:19 crc kubenswrapper[4962]: E0220 10:28:19.067310 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e8425d5-32be-4726-915a-3de5c70f0f62" containerName="ovsdb-server-init" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.067316 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e8425d5-32be-4726-915a-3de5c70f0f62" containerName="ovsdb-server-init" Feb 20 10:28:19 crc kubenswrapper[4962]: E0220 10:28:19.067331 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="swift-recon-cron" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.067337 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="swift-recon-cron" Feb 20 10:28:19 crc kubenswrapper[4962]: E0220 10:28:19.067345 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="812fea74-e4e5-4550-8a20-8fe04752a016" containerName="mariadb-account-create-update" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.067351 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="812fea74-e4e5-4550-8a20-8fe04752a016" containerName="mariadb-account-create-update" Feb 20 10:28:19 crc kubenswrapper[4962]: E0220 10:28:19.067368 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e8425d5-32be-4726-915a-3de5c70f0f62" containerName="ovs-vswitchd" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.067374 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e8425d5-32be-4726-915a-3de5c70f0f62" containerName="ovs-vswitchd" Feb 20 10:28:19 crc kubenswrapper[4962]: E0220 10:28:19.067382 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="account-server" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.067388 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="account-server" Feb 20 10:28:19 crc kubenswrapper[4962]: E0220 10:28:19.067406 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33d73a04-08b2-4944-861f-749a63c2565d" containerName="openstack-network-exporter" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.067413 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="33d73a04-08b2-4944-861f-749a63c2565d" containerName="openstack-network-exporter" Feb 20 10:28:19 crc kubenswrapper[4962]: E0220 10:28:19.067424 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fae69c76-754d-4125-a405-23a3938e90a9" containerName="proxy-httpd" Feb 20 
10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.067430 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="fae69c76-754d-4125-a405-23a3938e90a9" containerName="proxy-httpd" Feb 20 10:28:19 crc kubenswrapper[4962]: E0220 10:28:19.067441 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f4a409a-4230-42ca-bfcc-f014064cbc6c" containerName="glance-log" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.067447 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f4a409a-4230-42ca-bfcc-f014064cbc6c" containerName="glance-log" Feb 20 10:28:19 crc kubenswrapper[4962]: E0220 10:28:19.067458 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="container-auditor" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.067464 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="container-auditor" Feb 20 10:28:19 crc kubenswrapper[4962]: E0220 10:28:19.067476 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e766bfd-869d-43ca-bf11-cf4ec9fa253a" containerName="galera" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.067482 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e766bfd-869d-43ca-bf11-cf4ec9fa253a" containerName="galera" Feb 20 10:28:19 crc kubenswrapper[4962]: E0220 10:28:19.067495 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="container-server" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.067501 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="container-server" Feb 20 10:28:19 crc kubenswrapper[4962]: E0220 10:28:19.067516 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="container-replicator" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.067522 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="container-replicator" Feb 20 10:28:19 crc kubenswrapper[4962]: E0220 10:28:19.067530 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="812fea74-e4e5-4550-8a20-8fe04752a016" containerName="mariadb-account-create-update" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.067536 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="812fea74-e4e5-4550-8a20-8fe04752a016" containerName="mariadb-account-create-update" Feb 20 10:28:19 crc kubenswrapper[4962]: E0220 10:28:19.067547 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="container-updater" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.067553 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="container-updater" Feb 20 10:28:19 crc kubenswrapper[4962]: E0220 10:28:19.067564 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="object-auditor" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.067570 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="object-auditor" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.068227 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" 
containerName="container-auditor" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.068243 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="container-server" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.068256 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="10c1a487-1a74-4994-9b39-f05cbe0fa5c7" containerName="barbican-api-log" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.068275 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="swift-recon-cron" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.068286 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="object-auditor" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.068302 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f4a409a-4230-42ca-bfcc-f014064cbc6c" containerName="glance-log" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.068310 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcd02115-2eb9-4090-8225-108c3a8cad20" containerName="nova-scheduler-scheduler" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.068325 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="812fea74-e4e5-4550-8a20-8fe04752a016" containerName="mariadb-account-create-update" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.068333 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="10c1a487-1a74-4994-9b39-f05cbe0fa5c7" containerName="barbican-api" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.068349 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a8d652d-aea8-4a83-b33e-0d2522af0be8" containerName="rabbitmq" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.068361 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3" containerName="neutron-api" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.068373 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f4a409a-4230-42ca-bfcc-f014064cbc6c" containerName="glance-httpd" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.068381 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="241dc417-3176-4051-ad4e-d98f4f66ddc2" containerName="nova-api-log" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.068392 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="container-replicator" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.068400 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="56a77dd3-ef10-46a6-a00d-ab38af0d4338" containerName="rabbitmq" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.068416 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="object-expirer" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.068426 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6ce3f9c-b8d2-4c53-a494-3aa01ec4f9b3" containerName="neutron-httpd" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.068437 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e8425d5-32be-4726-915a-3de5c70f0f62" containerName="ovsdb-server" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.068448 4962 
memory_manager.go:354] "RemoveStaleState removing state" podUID="6e766bfd-869d-43ca-bf11-cf4ec9fa253a" containerName="galera" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.068464 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e8425d5-32be-4726-915a-3de5c70f0f62" containerName="ovs-vswitchd" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.068479 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="fae69c76-754d-4125-a405-23a3938e90a9" containerName="ceilometer-central-agent" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.068493 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="33d73a04-08b2-4944-861f-749a63c2565d" containerName="openstack-network-exporter" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.068499 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="object-server" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.068514 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce62af15-166f-4f74-a244-2de5147a4b2f" containerName="nova-cell1-conductor-conductor" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.068527 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="container-updater" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.068541 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="815f0ef8-a30a-4467-bb56-ff8499a4be44" containerName="nova-cell0-conductor-conductor" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.068553 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="account-replicator" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.068564 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="fae69c76-754d-4125-a405-23a3938e90a9" containerName="sg-core" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.068575 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="account-auditor" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.068598 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="b22a9e86-ccdf-4505-8116-21b0230943fc" containerName="memcached" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.068620 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="object-replicator" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.068631 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="account-reaper" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.068642 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="cffca43e-3e19-4430-8fe2-ca7cfe6229b0" containerName="kube-state-metrics" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.068655 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="rsync" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.068666 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="account-server" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.068673 4962 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="33d73a04-08b2-4944-861f-749a63c2565d" containerName="ovn-northd" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.068682 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="812fea74-e4e5-4550-8a20-8fe04752a016" containerName="mariadb-account-create-update" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.068695 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="fae69c76-754d-4125-a405-23a3938e90a9" containerName="ceilometer-notification-agent" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.068710 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="241dc417-3176-4051-ad4e-d98f4f66ddc2" containerName="nova-api-api" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.068718 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4fb3b99-0e02-4c5c-9704-884ea3f0605d" containerName="object-updater" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.068730 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="d203fc44-5252-4dd2-98ae-66f9c139b5f5" containerName="keystone-api" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.068743 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="fae69c76-754d-4125-a405-23a3938e90a9" containerName="proxy-httpd" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.072342 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-twkms" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.091549 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-twkms"] Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.170211 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/319aafc1-a34d-45d0-9b00-67b3c80f3f04-catalog-content\") pod \"redhat-operators-twkms\" (UID: \"319aafc1-a34d-45d0-9b00-67b3c80f3f04\") " pod="openshift-marketplace/redhat-operators-twkms" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.170312 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/319aafc1-a34d-45d0-9b00-67b3c80f3f04-utilities\") pod \"redhat-operators-twkms\" (UID: \"319aafc1-a34d-45d0-9b00-67b3c80f3f04\") " pod="openshift-marketplace/redhat-operators-twkms" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.170441 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpzrd\" (UniqueName: \"kubernetes.io/projected/319aafc1-a34d-45d0-9b00-67b3c80f3f04-kube-api-access-dpzrd\") pod \"redhat-operators-twkms\" (UID: \"319aafc1-a34d-45d0-9b00-67b3c80f3f04\") " pod="openshift-marketplace/redhat-operators-twkms" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.272067 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpzrd\" (UniqueName: \"kubernetes.io/projected/319aafc1-a34d-45d0-9b00-67b3c80f3f04-kube-api-access-dpzrd\") pod \"redhat-operators-twkms\" (UID: \"319aafc1-a34d-45d0-9b00-67b3c80f3f04\") " pod="openshift-marketplace/redhat-operators-twkms" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.272174 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/319aafc1-a34d-45d0-9b00-67b3c80f3f04-catalog-content\") pod \"redhat-operators-twkms\" (UID: \"319aafc1-a34d-45d0-9b00-67b3c80f3f04\") " pod="openshift-marketplace/redhat-operators-twkms" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.272223 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/319aafc1-a34d-45d0-9b00-67b3c80f3f04-utilities\") pod \"redhat-operators-twkms\" (UID: \"319aafc1-a34d-45d0-9b00-67b3c80f3f04\") " pod="openshift-marketplace/redhat-operators-twkms" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.272806 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/319aafc1-a34d-45d0-9b00-67b3c80f3f04-catalog-content\") pod \"redhat-operators-twkms\" (UID: \"319aafc1-a34d-45d0-9b00-67b3c80f3f04\") " pod="openshift-marketplace/redhat-operators-twkms" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.272984 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/319aafc1-a34d-45d0-9b00-67b3c80f3f04-utilities\") pod \"redhat-operators-twkms\" (UID: \"319aafc1-a34d-45d0-9b00-67b3c80f3f04\") " pod="openshift-marketplace/redhat-operators-twkms" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.299736 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpzrd\" (UniqueName: \"kubernetes.io/projected/319aafc1-a34d-45d0-9b00-67b3c80f3f04-kube-api-access-dpzrd\") pod \"redhat-operators-twkms\" (UID: \"319aafc1-a34d-45d0-9b00-67b3c80f3f04\") " pod="openshift-marketplace/redhat-operators-twkms" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.412219 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-twkms" Feb 20 10:28:19 crc kubenswrapper[4962]: I0220 10:28:19.864336 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-twkms"] Feb 20 10:28:20 crc kubenswrapper[4962]: I0220 10:28:20.769766 4962 generic.go:334] "Generic (PLEG): container finished" podID="319aafc1-a34d-45d0-9b00-67b3c80f3f04" containerID="d6d52f8c1e877a34f2e6f955f1697d9dcfed3a3ee1234f298f77bd0e79de9a87" exitCode=0 Feb 20 10:28:20 crc kubenswrapper[4962]: I0220 10:28:20.769910 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-twkms" event={"ID":"319aafc1-a34d-45d0-9b00-67b3c80f3f04","Type":"ContainerDied","Data":"d6d52f8c1e877a34f2e6f955f1697d9dcfed3a3ee1234f298f77bd0e79de9a87"} Feb 20 10:28:20 crc kubenswrapper[4962]: I0220 10:28:20.770447 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-twkms" event={"ID":"319aafc1-a34d-45d0-9b00-67b3c80f3f04","Type":"ContainerStarted","Data":"5fe37c57c2b2b7862de3baa3e03886ddbe7805805cfcf43d366e424522152507"} Feb 20 10:28:20 crc kubenswrapper[4962]: I0220 10:28:20.776493 4962 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 20 10:28:21 crc kubenswrapper[4962]: I0220 10:28:21.783985 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-twkms" event={"ID":"319aafc1-a34d-45d0-9b00-67b3c80f3f04","Type":"ContainerStarted","Data":"100c9796d2be64f60a2c1f1d82d57b779258944ef86aafff3ac54be5323fc38e"} Feb 20 10:28:22 crc kubenswrapper[4962]: I0220 10:28:22.797892 4962 generic.go:334] "Generic (PLEG): container finished" podID="319aafc1-a34d-45d0-9b00-67b3c80f3f04" containerID="100c9796d2be64f60a2c1f1d82d57b779258944ef86aafff3ac54be5323fc38e" exitCode=0 Feb 20 10:28:22 crc kubenswrapper[4962]: I0220 10:28:22.797970 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-twkms" event={"ID":"319aafc1-a34d-45d0-9b00-67b3c80f3f04","Type":"ContainerDied","Data":"100c9796d2be64f60a2c1f1d82d57b779258944ef86aafff3ac54be5323fc38e"} Feb 20 10:28:23 crc kubenswrapper[4962]: I0220 10:28:23.814117 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-twkms" event={"ID":"319aafc1-a34d-45d0-9b00-67b3c80f3f04","Type":"ContainerStarted","Data":"5b4e079ee955ccb06e5b852ef84d03dff18daa42ab6870c59fb2c8f5b8b81800"} Feb 20 10:28:23 crc kubenswrapper[4962]: I0220 10:28:23.844800 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-twkms" podStartSLOduration=2.340427011 podStartE2EDuration="4.844775004s" podCreationTimestamp="2026-02-20 10:28:19 +0000 UTC" firstStartedPulling="2026-02-20 10:28:20.776195936 +0000 UTC m=+1992.358667792" lastFinishedPulling="2026-02-20 10:28:23.280543909 +0000 UTC m=+1994.863015785" observedRunningTime="2026-02-20 10:28:23.836912299 +0000 UTC m=+1995.419384185" watchObservedRunningTime="2026-02-20 10:28:23.844775004 +0000 UTC m=+1995.427246880" Feb 20 10:28:29 crc kubenswrapper[4962]: I0220 10:28:29.412397 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-twkms" Feb 20 10:28:29 crc kubenswrapper[4962]: I0220 10:28:29.412951 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-twkms" Feb 20 10:28:30 crc 
kubenswrapper[4962]: I0220 10:28:30.479680 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-twkms" podUID="319aafc1-a34d-45d0-9b00-67b3c80f3f04" containerName="registry-server" probeResult="failure" output=< Feb 20 10:28:30 crc kubenswrapper[4962]: timeout: failed to connect service ":50051" within 1s Feb 20 10:28:30 crc kubenswrapper[4962]: > Feb 20 10:28:39 crc kubenswrapper[4962]: I0220 10:28:39.498822 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-twkms" Feb 20 10:28:39 crc kubenswrapper[4962]: I0220 10:28:39.584206 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-twkms" Feb 20 10:28:39 crc kubenswrapper[4962]: I0220 10:28:39.760941 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-twkms"] Feb 20 10:28:40 crc kubenswrapper[4962]: I0220 10:28:40.986751 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-twkms" podUID="319aafc1-a34d-45d0-9b00-67b3c80f3f04" containerName="registry-server" containerID="cri-o://5b4e079ee955ccb06e5b852ef84d03dff18daa42ab6870c59fb2c8f5b8b81800" gracePeriod=2 Feb 20 10:28:41 crc kubenswrapper[4962]: I0220 10:28:41.484746 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-twkms" Feb 20 10:28:41 crc kubenswrapper[4962]: I0220 10:28:41.508383 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 10:28:41 crc kubenswrapper[4962]: I0220 10:28:41.508459 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 10:28:41 crc kubenswrapper[4962]: I0220 10:28:41.508514 4962 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" Feb 20 10:28:41 crc kubenswrapper[4962]: I0220 10:28:41.509270 4962 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c5a3ed5d43365534c80fd6638118a5ca99f999ebea8670342afd3d7c63212fde"} pod="openshift-machine-config-operator/machine-config-daemon-m9d46" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 10:28:41 crc kubenswrapper[4962]: I0220 10:28:41.509342 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" containerID="cri-o://c5a3ed5d43365534c80fd6638118a5ca99f999ebea8670342afd3d7c63212fde" gracePeriod=600 Feb 20 10:28:41 crc kubenswrapper[4962]: I0220 10:28:41.556423 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/319aafc1-a34d-45d0-9b00-67b3c80f3f04-utilities\") pod 
\"319aafc1-a34d-45d0-9b00-67b3c80f3f04\" (UID: \"319aafc1-a34d-45d0-9b00-67b3c80f3f04\") " Feb 20 10:28:41 crc kubenswrapper[4962]: I0220 10:28:41.556525 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dpzrd\" (UniqueName: \"kubernetes.io/projected/319aafc1-a34d-45d0-9b00-67b3c80f3f04-kube-api-access-dpzrd\") pod \"319aafc1-a34d-45d0-9b00-67b3c80f3f04\" (UID: \"319aafc1-a34d-45d0-9b00-67b3c80f3f04\") " Feb 20 10:28:41 crc kubenswrapper[4962]: I0220 10:28:41.556587 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/319aafc1-a34d-45d0-9b00-67b3c80f3f04-catalog-content\") pod \"319aafc1-a34d-45d0-9b00-67b3c80f3f04\" (UID: \"319aafc1-a34d-45d0-9b00-67b3c80f3f04\") " Feb 20 10:28:41 crc kubenswrapper[4962]: I0220 10:28:41.557487 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/319aafc1-a34d-45d0-9b00-67b3c80f3f04-utilities" (OuterVolumeSpecName: "utilities") pod "319aafc1-a34d-45d0-9b00-67b3c80f3f04" (UID: "319aafc1-a34d-45d0-9b00-67b3c80f3f04"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:28:41 crc kubenswrapper[4962]: I0220 10:28:41.564112 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/319aafc1-a34d-45d0-9b00-67b3c80f3f04-kube-api-access-dpzrd" (OuterVolumeSpecName: "kube-api-access-dpzrd") pod "319aafc1-a34d-45d0-9b00-67b3c80f3f04" (UID: "319aafc1-a34d-45d0-9b00-67b3c80f3f04"). InnerVolumeSpecName "kube-api-access-dpzrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:28:41 crc kubenswrapper[4962]: I0220 10:28:41.658794 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/319aafc1-a34d-45d0-9b00-67b3c80f3f04-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 10:28:41 crc kubenswrapper[4962]: I0220 10:28:41.658830 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dpzrd\" (UniqueName: \"kubernetes.io/projected/319aafc1-a34d-45d0-9b00-67b3c80f3f04-kube-api-access-dpzrd\") on node \"crc\" DevicePath \"\"" Feb 20 10:28:41 crc kubenswrapper[4962]: I0220 10:28:41.695034 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/319aafc1-a34d-45d0-9b00-67b3c80f3f04-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "319aafc1-a34d-45d0-9b00-67b3c80f3f04" (UID: "319aafc1-a34d-45d0-9b00-67b3c80f3f04"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:28:41 crc kubenswrapper[4962]: I0220 10:28:41.759949 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/319aafc1-a34d-45d0-9b00-67b3c80f3f04-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 10:28:42 crc kubenswrapper[4962]: I0220 10:28:42.001521 4962 generic.go:334] "Generic (PLEG): container finished" podID="319aafc1-a34d-45d0-9b00-67b3c80f3f04" containerID="5b4e079ee955ccb06e5b852ef84d03dff18daa42ab6870c59fb2c8f5b8b81800" exitCode=0 Feb 20 10:28:42 crc kubenswrapper[4962]: I0220 10:28:42.001644 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-twkms" Feb 20 10:28:42 crc kubenswrapper[4962]: I0220 10:28:42.001677 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-twkms" event={"ID":"319aafc1-a34d-45d0-9b00-67b3c80f3f04","Type":"ContainerDied","Data":"5b4e079ee955ccb06e5b852ef84d03dff18daa42ab6870c59fb2c8f5b8b81800"} Feb 20 10:28:42 crc kubenswrapper[4962]: I0220 10:28:42.001777 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-twkms" event={"ID":"319aafc1-a34d-45d0-9b00-67b3c80f3f04","Type":"ContainerDied","Data":"5fe37c57c2b2b7862de3baa3e03886ddbe7805805cfcf43d366e424522152507"} Feb 20 10:28:42 crc kubenswrapper[4962]: I0220 10:28:42.001817 4962 scope.go:117] "RemoveContainer" containerID="5b4e079ee955ccb06e5b852ef84d03dff18daa42ab6870c59fb2c8f5b8b81800" Feb 20 10:28:42 crc kubenswrapper[4962]: I0220 10:28:42.015225 4962 generic.go:334] "Generic (PLEG): container finished" podID="751d5e0b-919c-4777-8475-ed7214f7647f" containerID="c5a3ed5d43365534c80fd6638118a5ca99f999ebea8670342afd3d7c63212fde" exitCode=0 Feb 20 10:28:42 crc kubenswrapper[4962]: I0220 10:28:42.015324 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" event={"ID":"751d5e0b-919c-4777-8475-ed7214f7647f","Type":"ContainerDied","Data":"c5a3ed5d43365534c80fd6638118a5ca99f999ebea8670342afd3d7c63212fde"} Feb 20 10:28:42 crc kubenswrapper[4962]: I0220 10:28:42.015389 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" event={"ID":"751d5e0b-919c-4777-8475-ed7214f7647f","Type":"ContainerStarted","Data":"bd64da6639184a23e03b94b79a7e7b45218fcf22df547dc77c2a997cc3799a1e"} Feb 20 10:28:42 crc kubenswrapper[4962]: I0220 10:28:42.050034 4962 scope.go:117] "RemoveContainer" containerID="100c9796d2be64f60a2c1f1d82d57b779258944ef86aafff3ac54be5323fc38e" Feb 20 10:28:42 crc kubenswrapper[4962]: I0220 10:28:42.073669 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-twkms"] Feb 20 10:28:42 crc kubenswrapper[4962]: I0220 10:28:42.081094 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-twkms"] Feb 20 10:28:42 crc kubenswrapper[4962]: I0220 10:28:42.097925 4962 scope.go:117] "RemoveContainer" containerID="d6d52f8c1e877a34f2e6f955f1697d9dcfed3a3ee1234f298f77bd0e79de9a87" Feb 20 10:28:42 crc kubenswrapper[4962]: I0220 10:28:42.131891 4962 scope.go:117] "RemoveContainer" containerID="5b4e079ee955ccb06e5b852ef84d03dff18daa42ab6870c59fb2c8f5b8b81800" Feb 20 10:28:42 crc kubenswrapper[4962]: E0220 10:28:42.132526 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b4e079ee955ccb06e5b852ef84d03dff18daa42ab6870c59fb2c8f5b8b81800\": container with ID starting with 5b4e079ee955ccb06e5b852ef84d03dff18daa42ab6870c59fb2c8f5b8b81800 not found: ID does not exist" containerID="5b4e079ee955ccb06e5b852ef84d03dff18daa42ab6870c59fb2c8f5b8b81800" Feb 20 10:28:42 crc kubenswrapper[4962]: I0220 10:28:42.132569 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b4e079ee955ccb06e5b852ef84d03dff18daa42ab6870c59fb2c8f5b8b81800"} err="failed to get container status \"5b4e079ee955ccb06e5b852ef84d03dff18daa42ab6870c59fb2c8f5b8b81800\": rpc error: code = NotFound desc = could not find 
container \"5b4e079ee955ccb06e5b852ef84d03dff18daa42ab6870c59fb2c8f5b8b81800\": container with ID starting with 5b4e079ee955ccb06e5b852ef84d03dff18daa42ab6870c59fb2c8f5b8b81800 not found: ID does not exist" Feb 20 10:28:42 crc kubenswrapper[4962]: I0220 10:28:42.132617 4962 scope.go:117] "RemoveContainer" containerID="100c9796d2be64f60a2c1f1d82d57b779258944ef86aafff3ac54be5323fc38e" Feb 20 10:28:42 crc kubenswrapper[4962]: E0220 10:28:42.133120 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"100c9796d2be64f60a2c1f1d82d57b779258944ef86aafff3ac54be5323fc38e\": container with ID starting with 100c9796d2be64f60a2c1f1d82d57b779258944ef86aafff3ac54be5323fc38e not found: ID does not exist" containerID="100c9796d2be64f60a2c1f1d82d57b779258944ef86aafff3ac54be5323fc38e" Feb 20 10:28:42 crc kubenswrapper[4962]: I0220 10:28:42.133181 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"100c9796d2be64f60a2c1f1d82d57b779258944ef86aafff3ac54be5323fc38e"} err="failed to get container status \"100c9796d2be64f60a2c1f1d82d57b779258944ef86aafff3ac54be5323fc38e\": rpc error: code = NotFound desc = could not find container \"100c9796d2be64f60a2c1f1d82d57b779258944ef86aafff3ac54be5323fc38e\": container with ID starting with 100c9796d2be64f60a2c1f1d82d57b779258944ef86aafff3ac54be5323fc38e not found: ID does not exist" Feb 20 10:28:42 crc kubenswrapper[4962]: I0220 10:28:42.133225 4962 scope.go:117] "RemoveContainer" containerID="d6d52f8c1e877a34f2e6f955f1697d9dcfed3a3ee1234f298f77bd0e79de9a87" Feb 20 10:28:42 crc kubenswrapper[4962]: E0220 10:28:42.133618 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6d52f8c1e877a34f2e6f955f1697d9dcfed3a3ee1234f298f77bd0e79de9a87\": container with ID starting with d6d52f8c1e877a34f2e6f955f1697d9dcfed3a3ee1234f298f77bd0e79de9a87 not found: ID does not exist" containerID="d6d52f8c1e877a34f2e6f955f1697d9dcfed3a3ee1234f298f77bd0e79de9a87" Feb 20 10:28:42 crc kubenswrapper[4962]: I0220 10:28:42.133655 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6d52f8c1e877a34f2e6f955f1697d9dcfed3a3ee1234f298f77bd0e79de9a87"} err="failed to get container status \"d6d52f8c1e877a34f2e6f955f1697d9dcfed3a3ee1234f298f77bd0e79de9a87\": rpc error: code = NotFound desc = could not find container \"d6d52f8c1e877a34f2e6f955f1697d9dcfed3a3ee1234f298f77bd0e79de9a87\": container with ID starting with d6d52f8c1e877a34f2e6f955f1697d9dcfed3a3ee1234f298f77bd0e79de9a87 not found: ID does not exist" Feb 20 10:28:42 crc kubenswrapper[4962]: I0220 10:28:42.133682 4962 scope.go:117] "RemoveContainer" containerID="571edd66d03ca20f4984454288fe0f239b478c2bf695a245264962e04c56853e" Feb 20 10:28:43 crc kubenswrapper[4962]: I0220 10:28:43.159200 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="319aafc1-a34d-45d0-9b00-67b3c80f3f04" path="/var/lib/kubelet/pods/319aafc1-a34d-45d0-9b00-67b3c80f3f04/volumes" Feb 20 10:28:57 crc kubenswrapper[4962]: I0220 10:28:57.791863 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-k5bgh"] Feb 20 10:28:57 crc kubenswrapper[4962]: E0220 10:28:57.792944 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="319aafc1-a34d-45d0-9b00-67b3c80f3f04" containerName="extract-utilities" Feb 20 10:28:57 crc kubenswrapper[4962]: I0220 10:28:57.792967 4962 
state_mem.go:107] "Deleted CPUSet assignment" podUID="319aafc1-a34d-45d0-9b00-67b3c80f3f04" containerName="extract-utilities" Feb 20 10:28:57 crc kubenswrapper[4962]: E0220 10:28:57.793010 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="319aafc1-a34d-45d0-9b00-67b3c80f3f04" containerName="extract-content" Feb 20 10:28:57 crc kubenswrapper[4962]: I0220 10:28:57.793023 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="319aafc1-a34d-45d0-9b00-67b3c80f3f04" containerName="extract-content" Feb 20 10:28:57 crc kubenswrapper[4962]: E0220 10:28:57.793058 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="319aafc1-a34d-45d0-9b00-67b3c80f3f04" containerName="registry-server" Feb 20 10:28:57 crc kubenswrapper[4962]: I0220 10:28:57.793071 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="319aafc1-a34d-45d0-9b00-67b3c80f3f04" containerName="registry-server" Feb 20 10:28:57 crc kubenswrapper[4962]: I0220 10:28:57.793320 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="319aafc1-a34d-45d0-9b00-67b3c80f3f04" containerName="registry-server" Feb 20 10:28:57 crc kubenswrapper[4962]: I0220 10:28:57.794957 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-k5bgh" Feb 20 10:28:57 crc kubenswrapper[4962]: I0220 10:28:57.810110 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-k5bgh"] Feb 20 10:28:57 crc kubenswrapper[4962]: I0220 10:28:57.816978 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe8d972b-b855-4f15-b7ad-10530b2b31ed-catalog-content\") pod \"certified-operators-k5bgh\" (UID: \"fe8d972b-b855-4f15-b7ad-10530b2b31ed\") " pod="openshift-marketplace/certified-operators-k5bgh" Feb 20 10:28:57 crc kubenswrapper[4962]: I0220 10:28:57.817170 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9279p\" (UniqueName: \"kubernetes.io/projected/fe8d972b-b855-4f15-b7ad-10530b2b31ed-kube-api-access-9279p\") pod \"certified-operators-k5bgh\" (UID: \"fe8d972b-b855-4f15-b7ad-10530b2b31ed\") " pod="openshift-marketplace/certified-operators-k5bgh" Feb 20 10:28:57 crc kubenswrapper[4962]: I0220 10:28:57.817292 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe8d972b-b855-4f15-b7ad-10530b2b31ed-utilities\") pod \"certified-operators-k5bgh\" (UID: \"fe8d972b-b855-4f15-b7ad-10530b2b31ed\") " pod="openshift-marketplace/certified-operators-k5bgh" Feb 20 10:28:57 crc kubenswrapper[4962]: I0220 10:28:57.921171 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe8d972b-b855-4f15-b7ad-10530b2b31ed-catalog-content\") pod \"certified-operators-k5bgh\" (UID: \"fe8d972b-b855-4f15-b7ad-10530b2b31ed\") " pod="openshift-marketplace/certified-operators-k5bgh" Feb 20 10:28:57 crc kubenswrapper[4962]: I0220 10:28:57.921245 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9279p\" (UniqueName: \"kubernetes.io/projected/fe8d972b-b855-4f15-b7ad-10530b2b31ed-kube-api-access-9279p\") pod \"certified-operators-k5bgh\" (UID: \"fe8d972b-b855-4f15-b7ad-10530b2b31ed\") " pod="openshift-marketplace/certified-operators-k5bgh" Feb 20 
10:28:57 crc kubenswrapper[4962]: I0220 10:28:57.921290 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe8d972b-b855-4f15-b7ad-10530b2b31ed-utilities\") pod \"certified-operators-k5bgh\" (UID: \"fe8d972b-b855-4f15-b7ad-10530b2b31ed\") " pod="openshift-marketplace/certified-operators-k5bgh" Feb 20 10:28:57 crc kubenswrapper[4962]: I0220 10:28:57.921946 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe8d972b-b855-4f15-b7ad-10530b2b31ed-utilities\") pod \"certified-operators-k5bgh\" (UID: \"fe8d972b-b855-4f15-b7ad-10530b2b31ed\") " pod="openshift-marketplace/certified-operators-k5bgh" Feb 20 10:28:57 crc kubenswrapper[4962]: I0220 10:28:57.922556 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe8d972b-b855-4f15-b7ad-10530b2b31ed-catalog-content\") pod \"certified-operators-k5bgh\" (UID: \"fe8d972b-b855-4f15-b7ad-10530b2b31ed\") " pod="openshift-marketplace/certified-operators-k5bgh" Feb 20 10:28:57 crc kubenswrapper[4962]: I0220 10:28:57.960552 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9279p\" (UniqueName: \"kubernetes.io/projected/fe8d972b-b855-4f15-b7ad-10530b2b31ed-kube-api-access-9279p\") pod \"certified-operators-k5bgh\" (UID: \"fe8d972b-b855-4f15-b7ad-10530b2b31ed\") " pod="openshift-marketplace/certified-operators-k5bgh" Feb 20 10:28:58 crc kubenswrapper[4962]: I0220 10:28:58.142073 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-k5bgh" Feb 20 10:28:58 crc kubenswrapper[4962]: I0220 10:28:58.672233 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-k5bgh"] Feb 20 10:28:59 crc kubenswrapper[4962]: I0220 10:28:59.221093 4962 generic.go:334] "Generic (PLEG): container finished" podID="fe8d972b-b855-4f15-b7ad-10530b2b31ed" containerID="ba830112dfbb46386f795bf9c8f766dbff593db0dd17e9fef4db3b3bebe42597" exitCode=0 Feb 20 10:28:59 crc kubenswrapper[4962]: I0220 10:28:59.221160 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k5bgh" event={"ID":"fe8d972b-b855-4f15-b7ad-10530b2b31ed","Type":"ContainerDied","Data":"ba830112dfbb46386f795bf9c8f766dbff593db0dd17e9fef4db3b3bebe42597"} Feb 20 10:28:59 crc kubenswrapper[4962]: I0220 10:28:59.221556 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k5bgh" event={"ID":"fe8d972b-b855-4f15-b7ad-10530b2b31ed","Type":"ContainerStarted","Data":"de425b43216f19572591d858be95124b141a268bfc57571deba58215fe7f8d3f"} Feb 20 10:28:59 crc kubenswrapper[4962]: I0220 10:28:59.783804 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-f9bxk"] Feb 20 10:28:59 crc kubenswrapper[4962]: I0220 10:28:59.786479 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f9bxk" Feb 20 10:28:59 crc kubenswrapper[4962]: I0220 10:28:59.823397 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f9bxk"] Feb 20 10:28:59 crc kubenswrapper[4962]: I0220 10:28:59.955300 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhpt6\" (UniqueName: \"kubernetes.io/projected/128802f0-1918-4aaf-bff0-fd43fe96a1ac-kube-api-access-nhpt6\") pod \"community-operators-f9bxk\" (UID: \"128802f0-1918-4aaf-bff0-fd43fe96a1ac\") " pod="openshift-marketplace/community-operators-f9bxk" Feb 20 10:28:59 crc kubenswrapper[4962]: I0220 10:28:59.955357 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/128802f0-1918-4aaf-bff0-fd43fe96a1ac-catalog-content\") pod \"community-operators-f9bxk\" (UID: \"128802f0-1918-4aaf-bff0-fd43fe96a1ac\") " pod="openshift-marketplace/community-operators-f9bxk" Feb 20 10:28:59 crc kubenswrapper[4962]: I0220 10:28:59.955528 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/128802f0-1918-4aaf-bff0-fd43fe96a1ac-utilities\") pod \"community-operators-f9bxk\" (UID: \"128802f0-1918-4aaf-bff0-fd43fe96a1ac\") " pod="openshift-marketplace/community-operators-f9bxk" Feb 20 10:29:00 crc kubenswrapper[4962]: I0220 10:29:00.056663 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/128802f0-1918-4aaf-bff0-fd43fe96a1ac-utilities\") pod \"community-operators-f9bxk\" (UID: \"128802f0-1918-4aaf-bff0-fd43fe96a1ac\") " pod="openshift-marketplace/community-operators-f9bxk" Feb 20 10:29:00 crc kubenswrapper[4962]: I0220 10:29:00.056837 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhpt6\" (UniqueName: \"kubernetes.io/projected/128802f0-1918-4aaf-bff0-fd43fe96a1ac-kube-api-access-nhpt6\") pod \"community-operators-f9bxk\" (UID: \"128802f0-1918-4aaf-bff0-fd43fe96a1ac\") " pod="openshift-marketplace/community-operators-f9bxk" Feb 20 10:29:00 crc kubenswrapper[4962]: I0220 10:29:00.056882 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/128802f0-1918-4aaf-bff0-fd43fe96a1ac-catalog-content\") pod \"community-operators-f9bxk\" (UID: \"128802f0-1918-4aaf-bff0-fd43fe96a1ac\") " pod="openshift-marketplace/community-operators-f9bxk" Feb 20 10:29:00 crc kubenswrapper[4962]: I0220 10:29:00.057195 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/128802f0-1918-4aaf-bff0-fd43fe96a1ac-utilities\") pod \"community-operators-f9bxk\" (UID: \"128802f0-1918-4aaf-bff0-fd43fe96a1ac\") " pod="openshift-marketplace/community-operators-f9bxk" Feb 20 10:29:00 crc kubenswrapper[4962]: I0220 10:29:00.057350 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/128802f0-1918-4aaf-bff0-fd43fe96a1ac-catalog-content\") pod \"community-operators-f9bxk\" (UID: \"128802f0-1918-4aaf-bff0-fd43fe96a1ac\") " pod="openshift-marketplace/community-operators-f9bxk" Feb 20 10:29:00 crc kubenswrapper[4962]: I0220 10:29:00.082778 4962 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-nhpt6\" (UniqueName: \"kubernetes.io/projected/128802f0-1918-4aaf-bff0-fd43fe96a1ac-kube-api-access-nhpt6\") pod \"community-operators-f9bxk\" (UID: \"128802f0-1918-4aaf-bff0-fd43fe96a1ac\") " pod="openshift-marketplace/community-operators-f9bxk" Feb 20 10:29:00 crc kubenswrapper[4962]: I0220 10:29:00.115704 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f9bxk" Feb 20 10:29:00 crc kubenswrapper[4962]: I0220 10:29:00.239094 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k5bgh" event={"ID":"fe8d972b-b855-4f15-b7ad-10530b2b31ed","Type":"ContainerStarted","Data":"8292db70a0fb7ff1771aff36e2f05b918952395a507e66a9122380a220a3b5b5"} Feb 20 10:29:00 crc kubenswrapper[4962]: I0220 10:29:00.401285 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f9bxk"] Feb 20 10:29:00 crc kubenswrapper[4962]: I0220 10:29:00.776428 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-72m89"] Feb 20 10:29:00 crc kubenswrapper[4962]: I0220 10:29:00.778970 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-72m89" Feb 20 10:29:00 crc kubenswrapper[4962]: I0220 10:29:00.791471 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-72m89"] Feb 20 10:29:00 crc kubenswrapper[4962]: I0220 10:29:00.970811 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8cbf1964-98e2-467d-ba01-7724a5f1a71c-catalog-content\") pod \"redhat-marketplace-72m89\" (UID: \"8cbf1964-98e2-467d-ba01-7724a5f1a71c\") " pod="openshift-marketplace/redhat-marketplace-72m89" Feb 20 10:29:00 crc kubenswrapper[4962]: I0220 10:29:00.970951 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8cbf1964-98e2-467d-ba01-7724a5f1a71c-utilities\") pod \"redhat-marketplace-72m89\" (UID: \"8cbf1964-98e2-467d-ba01-7724a5f1a71c\") " pod="openshift-marketplace/redhat-marketplace-72m89" Feb 20 10:29:00 crc kubenswrapper[4962]: I0220 10:29:00.971008 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nk7z6\" (UniqueName: \"kubernetes.io/projected/8cbf1964-98e2-467d-ba01-7724a5f1a71c-kube-api-access-nk7z6\") pod \"redhat-marketplace-72m89\" (UID: \"8cbf1964-98e2-467d-ba01-7724a5f1a71c\") " pod="openshift-marketplace/redhat-marketplace-72m89" Feb 20 10:29:01 crc kubenswrapper[4962]: I0220 10:29:01.072491 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8cbf1964-98e2-467d-ba01-7724a5f1a71c-utilities\") pod \"redhat-marketplace-72m89\" (UID: \"8cbf1964-98e2-467d-ba01-7724a5f1a71c\") " pod="openshift-marketplace/redhat-marketplace-72m89" Feb 20 10:29:01 crc kubenswrapper[4962]: I0220 10:29:01.072563 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nk7z6\" (UniqueName: \"kubernetes.io/projected/8cbf1964-98e2-467d-ba01-7724a5f1a71c-kube-api-access-nk7z6\") pod \"redhat-marketplace-72m89\" (UID: \"8cbf1964-98e2-467d-ba01-7724a5f1a71c\") " 
pod="openshift-marketplace/redhat-marketplace-72m89" Feb 20 10:29:01 crc kubenswrapper[4962]: I0220 10:29:01.072709 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8cbf1964-98e2-467d-ba01-7724a5f1a71c-catalog-content\") pod \"redhat-marketplace-72m89\" (UID: \"8cbf1964-98e2-467d-ba01-7724a5f1a71c\") " pod="openshift-marketplace/redhat-marketplace-72m89" Feb 20 10:29:01 crc kubenswrapper[4962]: I0220 10:29:01.073133 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8cbf1964-98e2-467d-ba01-7724a5f1a71c-utilities\") pod \"redhat-marketplace-72m89\" (UID: \"8cbf1964-98e2-467d-ba01-7724a5f1a71c\") " pod="openshift-marketplace/redhat-marketplace-72m89" Feb 20 10:29:01 crc kubenswrapper[4962]: I0220 10:29:01.073203 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8cbf1964-98e2-467d-ba01-7724a5f1a71c-catalog-content\") pod \"redhat-marketplace-72m89\" (UID: \"8cbf1964-98e2-467d-ba01-7724a5f1a71c\") " pod="openshift-marketplace/redhat-marketplace-72m89" Feb 20 10:29:01 crc kubenswrapper[4962]: I0220 10:29:01.094864 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nk7z6\" (UniqueName: \"kubernetes.io/projected/8cbf1964-98e2-467d-ba01-7724a5f1a71c-kube-api-access-nk7z6\") pod \"redhat-marketplace-72m89\" (UID: \"8cbf1964-98e2-467d-ba01-7724a5f1a71c\") " pod="openshift-marketplace/redhat-marketplace-72m89" Feb 20 10:29:01 crc kubenswrapper[4962]: I0220 10:29:01.100083 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-72m89" Feb 20 10:29:01 crc kubenswrapper[4962]: I0220 10:29:01.256679 4962 generic.go:334] "Generic (PLEG): container finished" podID="128802f0-1918-4aaf-bff0-fd43fe96a1ac" containerID="f12f6f4d3aa0a970eb509a848c17fccf8c8c17393bddb4ecb2de0621d5401ba9" exitCode=0 Feb 20 10:29:01 crc kubenswrapper[4962]: I0220 10:29:01.256788 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f9bxk" event={"ID":"128802f0-1918-4aaf-bff0-fd43fe96a1ac","Type":"ContainerDied","Data":"f12f6f4d3aa0a970eb509a848c17fccf8c8c17393bddb4ecb2de0621d5401ba9"} Feb 20 10:29:01 crc kubenswrapper[4962]: I0220 10:29:01.256823 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f9bxk" event={"ID":"128802f0-1918-4aaf-bff0-fd43fe96a1ac","Type":"ContainerStarted","Data":"19dcaa5f1fe9bc0e1a14e304278ba9fabd067499af81e2b79740a27bccc7d78d"} Feb 20 10:29:01 crc kubenswrapper[4962]: I0220 10:29:01.260964 4962 generic.go:334] "Generic (PLEG): container finished" podID="fe8d972b-b855-4f15-b7ad-10530b2b31ed" containerID="8292db70a0fb7ff1771aff36e2f05b918952395a507e66a9122380a220a3b5b5" exitCode=0 Feb 20 10:29:01 crc kubenswrapper[4962]: I0220 10:29:01.261031 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k5bgh" event={"ID":"fe8d972b-b855-4f15-b7ad-10530b2b31ed","Type":"ContainerDied","Data":"8292db70a0fb7ff1771aff36e2f05b918952395a507e66a9122380a220a3b5b5"} Feb 20 10:29:01 crc kubenswrapper[4962]: I0220 10:29:01.368693 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-72m89"] Feb 20 10:29:02 crc kubenswrapper[4962]: I0220 10:29:02.271020 4962 generic.go:334] "Generic (PLEG): 
container finished" podID="128802f0-1918-4aaf-bff0-fd43fe96a1ac" containerID="18d5e842781bf2bfe8a591276d9dd3e8ec903d1098b4619e5823ff722db4511c" exitCode=0 Feb 20 10:29:02 crc kubenswrapper[4962]: I0220 10:29:02.271254 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f9bxk" event={"ID":"128802f0-1918-4aaf-bff0-fd43fe96a1ac","Type":"ContainerDied","Data":"18d5e842781bf2bfe8a591276d9dd3e8ec903d1098b4619e5823ff722db4511c"} Feb 20 10:29:02 crc kubenswrapper[4962]: I0220 10:29:02.276507 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k5bgh" event={"ID":"fe8d972b-b855-4f15-b7ad-10530b2b31ed","Type":"ContainerStarted","Data":"1aaf871d746903f293b5cfb06ce4404d68e4327388a3f7920a13533a4b903fd1"} Feb 20 10:29:02 crc kubenswrapper[4962]: I0220 10:29:02.278853 4962 generic.go:334] "Generic (PLEG): container finished" podID="8cbf1964-98e2-467d-ba01-7724a5f1a71c" containerID="8d7f890d054f6d5409cde2d5c9434e40eac49af437c61756664a066f04e14ba0" exitCode=0 Feb 20 10:29:02 crc kubenswrapper[4962]: I0220 10:29:02.278922 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-72m89" event={"ID":"8cbf1964-98e2-467d-ba01-7724a5f1a71c","Type":"ContainerDied","Data":"8d7f890d054f6d5409cde2d5c9434e40eac49af437c61756664a066f04e14ba0"} Feb 20 10:29:02 crc kubenswrapper[4962]: I0220 10:29:02.278967 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-72m89" event={"ID":"8cbf1964-98e2-467d-ba01-7724a5f1a71c","Type":"ContainerStarted","Data":"cb0f58c8e15de5b78552c1a1def14285bd1745ff1b57dce0feda8de2b6f2a54a"} Feb 20 10:29:02 crc kubenswrapper[4962]: I0220 10:29:02.375358 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-k5bgh" podStartSLOduration=2.941778397 podStartE2EDuration="5.375326751s" podCreationTimestamp="2026-02-20 10:28:57 +0000 UTC" firstStartedPulling="2026-02-20 10:28:59.223112417 +0000 UTC m=+2030.805584273" lastFinishedPulling="2026-02-20 10:29:01.656660741 +0000 UTC m=+2033.239132627" observedRunningTime="2026-02-20 10:29:02.368209209 +0000 UTC m=+2033.950681105" watchObservedRunningTime="2026-02-20 10:29:02.375326751 +0000 UTC m=+2033.957798637" Feb 20 10:29:03 crc kubenswrapper[4962]: I0220 10:29:03.290894 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f9bxk" event={"ID":"128802f0-1918-4aaf-bff0-fd43fe96a1ac","Type":"ContainerStarted","Data":"2b7f5af61d53fba383d87e09cd361faae473332a317e69b9acfd272d022b4187"} Feb 20 10:29:03 crc kubenswrapper[4962]: I0220 10:29:03.293586 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-72m89" event={"ID":"8cbf1964-98e2-467d-ba01-7724a5f1a71c","Type":"ContainerStarted","Data":"2058209fd20e1c69394beb34e568f7d56883a5c3cbbf4697f7cd16797e6d872e"} Feb 20 10:29:03 crc kubenswrapper[4962]: I0220 10:29:03.307866 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-f9bxk" podStartSLOduration=2.870389904 podStartE2EDuration="4.307844367s" podCreationTimestamp="2026-02-20 10:28:59 +0000 UTC" firstStartedPulling="2026-02-20 10:29:01.259473105 +0000 UTC m=+2032.841944951" lastFinishedPulling="2026-02-20 10:29:02.696927538 +0000 UTC m=+2034.279399414" observedRunningTime="2026-02-20 10:29:03.304294437 +0000 UTC m=+2034.886766293" 
watchObservedRunningTime="2026-02-20 10:29:03.307844367 +0000 UTC m=+2034.890316223" Feb 20 10:29:04 crc kubenswrapper[4962]: I0220 10:29:04.305654 4962 generic.go:334] "Generic (PLEG): container finished" podID="8cbf1964-98e2-467d-ba01-7724a5f1a71c" containerID="2058209fd20e1c69394beb34e568f7d56883a5c3cbbf4697f7cd16797e6d872e" exitCode=0 Feb 20 10:29:04 crc kubenswrapper[4962]: I0220 10:29:04.305768 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-72m89" event={"ID":"8cbf1964-98e2-467d-ba01-7724a5f1a71c","Type":"ContainerDied","Data":"2058209fd20e1c69394beb34e568f7d56883a5c3cbbf4697f7cd16797e6d872e"} Feb 20 10:29:05 crc kubenswrapper[4962]: I0220 10:29:05.316826 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-72m89" event={"ID":"8cbf1964-98e2-467d-ba01-7724a5f1a71c","Type":"ContainerStarted","Data":"ce5c9d1e37fbf38d5c95312bfd939b8a4d56294d04e93ed68e4ea416cc4ca0ec"} Feb 20 10:29:05 crc kubenswrapper[4962]: I0220 10:29:05.351718 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-72m89" podStartSLOduration=2.914972255 podStartE2EDuration="5.351694609s" podCreationTimestamp="2026-02-20 10:29:00 +0000 UTC" firstStartedPulling="2026-02-20 10:29:02.281989309 +0000 UTC m=+2033.864461195" lastFinishedPulling="2026-02-20 10:29:04.718711703 +0000 UTC m=+2036.301183549" observedRunningTime="2026-02-20 10:29:05.34690308 +0000 UTC m=+2036.929374936" watchObservedRunningTime="2026-02-20 10:29:05.351694609 +0000 UTC m=+2036.934166465" Feb 20 10:29:08 crc kubenswrapper[4962]: I0220 10:29:08.143074 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-k5bgh" Feb 20 10:29:08 crc kubenswrapper[4962]: I0220 10:29:08.143559 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-k5bgh" Feb 20 10:29:08 crc kubenswrapper[4962]: I0220 10:29:08.223710 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-k5bgh" Feb 20 10:29:08 crc kubenswrapper[4962]: I0220 10:29:08.421814 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-k5bgh" Feb 20 10:29:10 crc kubenswrapper[4962]: I0220 10:29:10.116873 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-f9bxk" Feb 20 10:29:10 crc kubenswrapper[4962]: I0220 10:29:10.117322 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-f9bxk" Feb 20 10:29:10 crc kubenswrapper[4962]: I0220 10:29:10.197053 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-f9bxk" Feb 20 10:29:10 crc kubenswrapper[4962]: I0220 10:29:10.438093 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-f9bxk" Feb 20 10:29:11 crc kubenswrapper[4962]: I0220 10:29:11.100660 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-72m89" Feb 20 10:29:11 crc kubenswrapper[4962]: I0220 10:29:11.100739 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-72m89" Feb 20 10:29:11 crc 
kubenswrapper[4962]: I0220 10:29:11.174999 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-72m89" Feb 20 10:29:11 crc kubenswrapper[4962]: I0220 10:29:11.416065 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-72m89" Feb 20 10:29:12 crc kubenswrapper[4962]: I0220 10:29:12.973913 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-k5bgh"] Feb 20 10:29:12 crc kubenswrapper[4962]: I0220 10:29:12.974271 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-k5bgh" podUID="fe8d972b-b855-4f15-b7ad-10530b2b31ed" containerName="registry-server" containerID="cri-o://1aaf871d746903f293b5cfb06ce4404d68e4327388a3f7920a13533a4b903fd1" gracePeriod=2 Feb 20 10:29:13 crc kubenswrapper[4962]: I0220 10:29:13.398520 4962 generic.go:334] "Generic (PLEG): container finished" podID="fe8d972b-b855-4f15-b7ad-10530b2b31ed" containerID="1aaf871d746903f293b5cfb06ce4404d68e4327388a3f7920a13533a4b903fd1" exitCode=0 Feb 20 10:29:13 crc kubenswrapper[4962]: I0220 10:29:13.398573 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k5bgh" event={"ID":"fe8d972b-b855-4f15-b7ad-10530b2b31ed","Type":"ContainerDied","Data":"1aaf871d746903f293b5cfb06ce4404d68e4327388a3f7920a13533a4b903fd1"} Feb 20 10:29:13 crc kubenswrapper[4962]: I0220 10:29:13.523756 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-k5bgh" Feb 20 10:29:13 crc kubenswrapper[4962]: I0220 10:29:13.578931 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-72m89"] Feb 20 10:29:13 crc kubenswrapper[4962]: I0220 10:29:13.579391 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-72m89" podUID="8cbf1964-98e2-467d-ba01-7724a5f1a71c" containerName="registry-server" containerID="cri-o://ce5c9d1e37fbf38d5c95312bfd939b8a4d56294d04e93ed68e4ea416cc4ca0ec" gracePeriod=2 Feb 20 10:29:13 crc kubenswrapper[4962]: I0220 10:29:13.717168 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe8d972b-b855-4f15-b7ad-10530b2b31ed-utilities\") pod \"fe8d972b-b855-4f15-b7ad-10530b2b31ed\" (UID: \"fe8d972b-b855-4f15-b7ad-10530b2b31ed\") " Feb 20 10:29:13 crc kubenswrapper[4962]: I0220 10:29:13.717435 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9279p\" (UniqueName: \"kubernetes.io/projected/fe8d972b-b855-4f15-b7ad-10530b2b31ed-kube-api-access-9279p\") pod \"fe8d972b-b855-4f15-b7ad-10530b2b31ed\" (UID: \"fe8d972b-b855-4f15-b7ad-10530b2b31ed\") " Feb 20 10:29:13 crc kubenswrapper[4962]: I0220 10:29:13.717487 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe8d972b-b855-4f15-b7ad-10530b2b31ed-catalog-content\") pod \"fe8d972b-b855-4f15-b7ad-10530b2b31ed\" (UID: \"fe8d972b-b855-4f15-b7ad-10530b2b31ed\") " Feb 20 10:29:13 crc kubenswrapper[4962]: I0220 10:29:13.718154 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe8d972b-b855-4f15-b7ad-10530b2b31ed-utilities" (OuterVolumeSpecName: "utilities") pod 
"fe8d972b-b855-4f15-b7ad-10530b2b31ed" (UID: "fe8d972b-b855-4f15-b7ad-10530b2b31ed"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:29:13 crc kubenswrapper[4962]: I0220 10:29:13.726206 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe8d972b-b855-4f15-b7ad-10530b2b31ed-kube-api-access-9279p" (OuterVolumeSpecName: "kube-api-access-9279p") pod "fe8d972b-b855-4f15-b7ad-10530b2b31ed" (UID: "fe8d972b-b855-4f15-b7ad-10530b2b31ed"). InnerVolumeSpecName "kube-api-access-9279p". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:29:13 crc kubenswrapper[4962]: I0220 10:29:13.773772 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe8d972b-b855-4f15-b7ad-10530b2b31ed-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fe8d972b-b855-4f15-b7ad-10530b2b31ed" (UID: "fe8d972b-b855-4f15-b7ad-10530b2b31ed"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:29:13 crc kubenswrapper[4962]: I0220 10:29:13.819349 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9279p\" (UniqueName: \"kubernetes.io/projected/fe8d972b-b855-4f15-b7ad-10530b2b31ed-kube-api-access-9279p\") on node \"crc\" DevicePath \"\"" Feb 20 10:29:13 crc kubenswrapper[4962]: I0220 10:29:13.819379 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe8d972b-b855-4f15-b7ad-10530b2b31ed-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 10:29:13 crc kubenswrapper[4962]: I0220 10:29:13.819388 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe8d972b-b855-4f15-b7ad-10530b2b31ed-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 10:29:14 crc kubenswrapper[4962]: I0220 10:29:14.050979 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-72m89" Feb 20 10:29:14 crc kubenswrapper[4962]: I0220 10:29:14.224424 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8cbf1964-98e2-467d-ba01-7724a5f1a71c-utilities\") pod \"8cbf1964-98e2-467d-ba01-7724a5f1a71c\" (UID: \"8cbf1964-98e2-467d-ba01-7724a5f1a71c\") " Feb 20 10:29:14 crc kubenswrapper[4962]: I0220 10:29:14.224523 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nk7z6\" (UniqueName: \"kubernetes.io/projected/8cbf1964-98e2-467d-ba01-7724a5f1a71c-kube-api-access-nk7z6\") pod \"8cbf1964-98e2-467d-ba01-7724a5f1a71c\" (UID: \"8cbf1964-98e2-467d-ba01-7724a5f1a71c\") " Feb 20 10:29:14 crc kubenswrapper[4962]: I0220 10:29:14.224920 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8cbf1964-98e2-467d-ba01-7724a5f1a71c-catalog-content\") pod \"8cbf1964-98e2-467d-ba01-7724a5f1a71c\" (UID: \"8cbf1964-98e2-467d-ba01-7724a5f1a71c\") " Feb 20 10:29:14 crc kubenswrapper[4962]: I0220 10:29:14.225791 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8cbf1964-98e2-467d-ba01-7724a5f1a71c-utilities" (OuterVolumeSpecName: "utilities") pod "8cbf1964-98e2-467d-ba01-7724a5f1a71c" (UID: "8cbf1964-98e2-467d-ba01-7724a5f1a71c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:29:14 crc kubenswrapper[4962]: I0220 10:29:14.231106 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cbf1964-98e2-467d-ba01-7724a5f1a71c-kube-api-access-nk7z6" (OuterVolumeSpecName: "kube-api-access-nk7z6") pod "8cbf1964-98e2-467d-ba01-7724a5f1a71c" (UID: "8cbf1964-98e2-467d-ba01-7724a5f1a71c"). InnerVolumeSpecName "kube-api-access-nk7z6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:29:14 crc kubenswrapper[4962]: I0220 10:29:14.273646 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8cbf1964-98e2-467d-ba01-7724a5f1a71c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8cbf1964-98e2-467d-ba01-7724a5f1a71c" (UID: "8cbf1964-98e2-467d-ba01-7724a5f1a71c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:29:14 crc kubenswrapper[4962]: I0220 10:29:14.327181 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8cbf1964-98e2-467d-ba01-7724a5f1a71c-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 10:29:14 crc kubenswrapper[4962]: I0220 10:29:14.327225 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8cbf1964-98e2-467d-ba01-7724a5f1a71c-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 10:29:14 crc kubenswrapper[4962]: I0220 10:29:14.327246 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nk7z6\" (UniqueName: \"kubernetes.io/projected/8cbf1964-98e2-467d-ba01-7724a5f1a71c-kube-api-access-nk7z6\") on node \"crc\" DevicePath \"\"" Feb 20 10:29:14 crc kubenswrapper[4962]: I0220 10:29:14.412721 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-k5bgh" Feb 20 10:29:14 crc kubenswrapper[4962]: I0220 10:29:14.412723 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k5bgh" event={"ID":"fe8d972b-b855-4f15-b7ad-10530b2b31ed","Type":"ContainerDied","Data":"de425b43216f19572591d858be95124b141a268bfc57571deba58215fe7f8d3f"} Feb 20 10:29:14 crc kubenswrapper[4962]: I0220 10:29:14.412951 4962 scope.go:117] "RemoveContainer" containerID="1aaf871d746903f293b5cfb06ce4404d68e4327388a3f7920a13533a4b903fd1" Feb 20 10:29:14 crc kubenswrapper[4962]: I0220 10:29:14.417859 4962 generic.go:334] "Generic (PLEG): container finished" podID="8cbf1964-98e2-467d-ba01-7724a5f1a71c" containerID="ce5c9d1e37fbf38d5c95312bfd939b8a4d56294d04e93ed68e4ea416cc4ca0ec" exitCode=0 Feb 20 10:29:14 crc kubenswrapper[4962]: I0220 10:29:14.417902 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-72m89" event={"ID":"8cbf1964-98e2-467d-ba01-7724a5f1a71c","Type":"ContainerDied","Data":"ce5c9d1e37fbf38d5c95312bfd939b8a4d56294d04e93ed68e4ea416cc4ca0ec"} Feb 20 10:29:14 crc kubenswrapper[4962]: I0220 10:29:14.417958 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-72m89" event={"ID":"8cbf1964-98e2-467d-ba01-7724a5f1a71c","Type":"ContainerDied","Data":"cb0f58c8e15de5b78552c1a1def14285bd1745ff1b57dce0feda8de2b6f2a54a"} Feb 20 10:29:14 crc kubenswrapper[4962]: I0220 10:29:14.418244 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-72m89" Feb 20 10:29:14 crc kubenswrapper[4962]: I0220 10:29:14.451989 4962 scope.go:117] "RemoveContainer" containerID="8292db70a0fb7ff1771aff36e2f05b918952395a507e66a9122380a220a3b5b5" Feb 20 10:29:14 crc kubenswrapper[4962]: I0220 10:29:14.471384 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-k5bgh"] Feb 20 10:29:14 crc kubenswrapper[4962]: I0220 10:29:14.483728 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-k5bgh"] Feb 20 10:29:14 crc kubenswrapper[4962]: I0220 10:29:14.492976 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-72m89"] Feb 20 10:29:14 crc kubenswrapper[4962]: I0220 10:29:14.495067 4962 scope.go:117] "RemoveContainer" containerID="ba830112dfbb46386f795bf9c8f766dbff593db0dd17e9fef4db3b3bebe42597" Feb 20 10:29:14 crc kubenswrapper[4962]: I0220 10:29:14.499050 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-72m89"] Feb 20 10:29:14 crc kubenswrapper[4962]: I0220 10:29:14.530369 4962 scope.go:117] "RemoveContainer" containerID="ce5c9d1e37fbf38d5c95312bfd939b8a4d56294d04e93ed68e4ea416cc4ca0ec" Feb 20 10:29:14 crc kubenswrapper[4962]: I0220 10:29:14.567237 4962 scope.go:117] "RemoveContainer" containerID="2058209fd20e1c69394beb34e568f7d56883a5c3cbbf4697f7cd16797e6d872e" Feb 20 10:29:14 crc kubenswrapper[4962]: I0220 10:29:14.591581 4962 scope.go:117] "RemoveContainer" containerID="8d7f890d054f6d5409cde2d5c9434e40eac49af437c61756664a066f04e14ba0" Feb 20 10:29:14 crc kubenswrapper[4962]: I0220 10:29:14.638274 4962 scope.go:117] "RemoveContainer" containerID="ce5c9d1e37fbf38d5c95312bfd939b8a4d56294d04e93ed68e4ea416cc4ca0ec" Feb 20 10:29:14 crc kubenswrapper[4962]: E0220 10:29:14.638876 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce5c9d1e37fbf38d5c95312bfd939b8a4d56294d04e93ed68e4ea416cc4ca0ec\": container with ID starting with ce5c9d1e37fbf38d5c95312bfd939b8a4d56294d04e93ed68e4ea416cc4ca0ec not found: ID does not exist" containerID="ce5c9d1e37fbf38d5c95312bfd939b8a4d56294d04e93ed68e4ea416cc4ca0ec" Feb 20 10:29:14 crc kubenswrapper[4962]: I0220 10:29:14.638921 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce5c9d1e37fbf38d5c95312bfd939b8a4d56294d04e93ed68e4ea416cc4ca0ec"} err="failed to get container status \"ce5c9d1e37fbf38d5c95312bfd939b8a4d56294d04e93ed68e4ea416cc4ca0ec\": rpc error: code = NotFound desc = could not find container \"ce5c9d1e37fbf38d5c95312bfd939b8a4d56294d04e93ed68e4ea416cc4ca0ec\": container with ID starting with ce5c9d1e37fbf38d5c95312bfd939b8a4d56294d04e93ed68e4ea416cc4ca0ec not found: ID does not exist" Feb 20 10:29:14 crc kubenswrapper[4962]: I0220 10:29:14.638951 4962 scope.go:117] "RemoveContainer" containerID="2058209fd20e1c69394beb34e568f7d56883a5c3cbbf4697f7cd16797e6d872e" Feb 20 10:29:14 crc kubenswrapper[4962]: E0220 10:29:14.639500 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2058209fd20e1c69394beb34e568f7d56883a5c3cbbf4697f7cd16797e6d872e\": container with ID starting with 2058209fd20e1c69394beb34e568f7d56883a5c3cbbf4697f7cd16797e6d872e not found: ID does not exist" containerID="2058209fd20e1c69394beb34e568f7d56883a5c3cbbf4697f7cd16797e6d872e" Feb 20 
10:29:14 crc kubenswrapper[4962]: I0220 10:29:14.639561 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2058209fd20e1c69394beb34e568f7d56883a5c3cbbf4697f7cd16797e6d872e"} err="failed to get container status \"2058209fd20e1c69394beb34e568f7d56883a5c3cbbf4697f7cd16797e6d872e\": rpc error: code = NotFound desc = could not find container \"2058209fd20e1c69394beb34e568f7d56883a5c3cbbf4697f7cd16797e6d872e\": container with ID starting with 2058209fd20e1c69394beb34e568f7d56883a5c3cbbf4697f7cd16797e6d872e not found: ID does not exist" Feb 20 10:29:14 crc kubenswrapper[4962]: I0220 10:29:14.639635 4962 scope.go:117] "RemoveContainer" containerID="8d7f890d054f6d5409cde2d5c9434e40eac49af437c61756664a066f04e14ba0" Feb 20 10:29:14 crc kubenswrapper[4962]: E0220 10:29:14.640169 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d7f890d054f6d5409cde2d5c9434e40eac49af437c61756664a066f04e14ba0\": container with ID starting with 8d7f890d054f6d5409cde2d5c9434e40eac49af437c61756664a066f04e14ba0 not found: ID does not exist" containerID="8d7f890d054f6d5409cde2d5c9434e40eac49af437c61756664a066f04e14ba0" Feb 20 10:29:14 crc kubenswrapper[4962]: I0220 10:29:14.640347 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d7f890d054f6d5409cde2d5c9434e40eac49af437c61756664a066f04e14ba0"} err="failed to get container status \"8d7f890d054f6d5409cde2d5c9434e40eac49af437c61756664a066f04e14ba0\": rpc error: code = NotFound desc = could not find container \"8d7f890d054f6d5409cde2d5c9434e40eac49af437c61756664a066f04e14ba0\": container with ID starting with 8d7f890d054f6d5409cde2d5c9434e40eac49af437c61756664a066f04e14ba0 not found: ID does not exist" Feb 20 10:29:15 crc kubenswrapper[4962]: I0220 10:29:15.153492 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cbf1964-98e2-467d-ba01-7724a5f1a71c" path="/var/lib/kubelet/pods/8cbf1964-98e2-467d-ba01-7724a5f1a71c/volumes" Feb 20 10:29:15 crc kubenswrapper[4962]: I0220 10:29:15.156658 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe8d972b-b855-4f15-b7ad-10530b2b31ed" path="/var/lib/kubelet/pods/fe8d972b-b855-4f15-b7ad-10530b2b31ed/volumes" Feb 20 10:29:17 crc kubenswrapper[4962]: I0220 10:29:17.582382 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-f9bxk"] Feb 20 10:29:17 crc kubenswrapper[4962]: I0220 10:29:17.583248 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-f9bxk" podUID="128802f0-1918-4aaf-bff0-fd43fe96a1ac" containerName="registry-server" containerID="cri-o://2b7f5af61d53fba383d87e09cd361faae473332a317e69b9acfd272d022b4187" gracePeriod=2 Feb 20 10:29:18 crc kubenswrapper[4962]: I0220 10:29:18.157895 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f9bxk" Feb 20 10:29:18 crc kubenswrapper[4962]: I0220 10:29:18.295550 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/128802f0-1918-4aaf-bff0-fd43fe96a1ac-utilities\") pod \"128802f0-1918-4aaf-bff0-fd43fe96a1ac\" (UID: \"128802f0-1918-4aaf-bff0-fd43fe96a1ac\") " Feb 20 10:29:18 crc kubenswrapper[4962]: I0220 10:29:18.295798 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/128802f0-1918-4aaf-bff0-fd43fe96a1ac-catalog-content\") pod \"128802f0-1918-4aaf-bff0-fd43fe96a1ac\" (UID: \"128802f0-1918-4aaf-bff0-fd43fe96a1ac\") " Feb 20 10:29:18 crc kubenswrapper[4962]: I0220 10:29:18.295899 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nhpt6\" (UniqueName: \"kubernetes.io/projected/128802f0-1918-4aaf-bff0-fd43fe96a1ac-kube-api-access-nhpt6\") pod \"128802f0-1918-4aaf-bff0-fd43fe96a1ac\" (UID: \"128802f0-1918-4aaf-bff0-fd43fe96a1ac\") " Feb 20 10:29:18 crc kubenswrapper[4962]: I0220 10:29:18.297004 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/128802f0-1918-4aaf-bff0-fd43fe96a1ac-utilities" (OuterVolumeSpecName: "utilities") pod "128802f0-1918-4aaf-bff0-fd43fe96a1ac" (UID: "128802f0-1918-4aaf-bff0-fd43fe96a1ac"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:29:18 crc kubenswrapper[4962]: I0220 10:29:18.304923 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/128802f0-1918-4aaf-bff0-fd43fe96a1ac-kube-api-access-nhpt6" (OuterVolumeSpecName: "kube-api-access-nhpt6") pod "128802f0-1918-4aaf-bff0-fd43fe96a1ac" (UID: "128802f0-1918-4aaf-bff0-fd43fe96a1ac"). InnerVolumeSpecName "kube-api-access-nhpt6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:29:18 crc kubenswrapper[4962]: I0220 10:29:18.380657 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/128802f0-1918-4aaf-bff0-fd43fe96a1ac-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "128802f0-1918-4aaf-bff0-fd43fe96a1ac" (UID: "128802f0-1918-4aaf-bff0-fd43fe96a1ac"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:29:18 crc kubenswrapper[4962]: I0220 10:29:18.397772 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/128802f0-1918-4aaf-bff0-fd43fe96a1ac-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 10:29:18 crc kubenswrapper[4962]: I0220 10:29:18.397806 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nhpt6\" (UniqueName: \"kubernetes.io/projected/128802f0-1918-4aaf-bff0-fd43fe96a1ac-kube-api-access-nhpt6\") on node \"crc\" DevicePath \"\"" Feb 20 10:29:18 crc kubenswrapper[4962]: I0220 10:29:18.397822 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/128802f0-1918-4aaf-bff0-fd43fe96a1ac-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 10:29:18 crc kubenswrapper[4962]: I0220 10:29:18.463812 4962 generic.go:334] "Generic (PLEG): container finished" podID="128802f0-1918-4aaf-bff0-fd43fe96a1ac" containerID="2b7f5af61d53fba383d87e09cd361faae473332a317e69b9acfd272d022b4187" exitCode=0 Feb 20 10:29:18 crc kubenswrapper[4962]: I0220 10:29:18.463871 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f9bxk" event={"ID":"128802f0-1918-4aaf-bff0-fd43fe96a1ac","Type":"ContainerDied","Data":"2b7f5af61d53fba383d87e09cd361faae473332a317e69b9acfd272d022b4187"} Feb 20 10:29:18 crc kubenswrapper[4962]: I0220 10:29:18.463911 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f9bxk" event={"ID":"128802f0-1918-4aaf-bff0-fd43fe96a1ac","Type":"ContainerDied","Data":"19dcaa5f1fe9bc0e1a14e304278ba9fabd067499af81e2b79740a27bccc7d78d"} Feb 20 10:29:18 crc kubenswrapper[4962]: I0220 10:29:18.463945 4962 scope.go:117] "RemoveContainer" containerID="2b7f5af61d53fba383d87e09cd361faae473332a317e69b9acfd272d022b4187" Feb 20 10:29:18 crc kubenswrapper[4962]: I0220 10:29:18.464103 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f9bxk" Feb 20 10:29:18 crc kubenswrapper[4962]: I0220 10:29:18.505926 4962 scope.go:117] "RemoveContainer" containerID="18d5e842781bf2bfe8a591276d9dd3e8ec903d1098b4619e5823ff722db4511c" Feb 20 10:29:18 crc kubenswrapper[4962]: I0220 10:29:18.518710 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-f9bxk"] Feb 20 10:29:18 crc kubenswrapper[4962]: I0220 10:29:18.528447 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-f9bxk"] Feb 20 10:29:18 crc kubenswrapper[4962]: I0220 10:29:18.541683 4962 scope.go:117] "RemoveContainer" containerID="f12f6f4d3aa0a970eb509a848c17fccf8c8c17393bddb4ecb2de0621d5401ba9" Feb 20 10:29:18 crc kubenswrapper[4962]: I0220 10:29:18.578969 4962 scope.go:117] "RemoveContainer" containerID="2b7f5af61d53fba383d87e09cd361faae473332a317e69b9acfd272d022b4187" Feb 20 10:29:18 crc kubenswrapper[4962]: E0220 10:29:18.579624 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b7f5af61d53fba383d87e09cd361faae473332a317e69b9acfd272d022b4187\": container with ID starting with 2b7f5af61d53fba383d87e09cd361faae473332a317e69b9acfd272d022b4187 not found: ID does not exist" containerID="2b7f5af61d53fba383d87e09cd361faae473332a317e69b9acfd272d022b4187" Feb 20 10:29:18 crc kubenswrapper[4962]: I0220 10:29:18.579724 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b7f5af61d53fba383d87e09cd361faae473332a317e69b9acfd272d022b4187"} err="failed to get container status \"2b7f5af61d53fba383d87e09cd361faae473332a317e69b9acfd272d022b4187\": rpc error: code = NotFound desc = could not find container \"2b7f5af61d53fba383d87e09cd361faae473332a317e69b9acfd272d022b4187\": container with ID starting with 2b7f5af61d53fba383d87e09cd361faae473332a317e69b9acfd272d022b4187 not found: ID does not exist" Feb 20 10:29:18 crc kubenswrapper[4962]: I0220 10:29:18.579776 4962 scope.go:117] "RemoveContainer" containerID="18d5e842781bf2bfe8a591276d9dd3e8ec903d1098b4619e5823ff722db4511c" Feb 20 10:29:18 crc kubenswrapper[4962]: E0220 10:29:18.580907 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18d5e842781bf2bfe8a591276d9dd3e8ec903d1098b4619e5823ff722db4511c\": container with ID starting with 18d5e842781bf2bfe8a591276d9dd3e8ec903d1098b4619e5823ff722db4511c not found: ID does not exist" containerID="18d5e842781bf2bfe8a591276d9dd3e8ec903d1098b4619e5823ff722db4511c" Feb 20 10:29:18 crc kubenswrapper[4962]: I0220 10:29:18.581032 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18d5e842781bf2bfe8a591276d9dd3e8ec903d1098b4619e5823ff722db4511c"} err="failed to get container status \"18d5e842781bf2bfe8a591276d9dd3e8ec903d1098b4619e5823ff722db4511c\": rpc error: code = NotFound desc = could not find container \"18d5e842781bf2bfe8a591276d9dd3e8ec903d1098b4619e5823ff722db4511c\": container with ID starting with 18d5e842781bf2bfe8a591276d9dd3e8ec903d1098b4619e5823ff722db4511c not found: ID does not exist" Feb 20 10:29:18 crc kubenswrapper[4962]: I0220 10:29:18.581079 4962 scope.go:117] "RemoveContainer" containerID="f12f6f4d3aa0a970eb509a848c17fccf8c8c17393bddb4ecb2de0621d5401ba9" Feb 20 10:29:18 crc kubenswrapper[4962]: E0220 10:29:18.581879 4962 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"f12f6f4d3aa0a970eb509a848c17fccf8c8c17393bddb4ecb2de0621d5401ba9\": container with ID starting with f12f6f4d3aa0a970eb509a848c17fccf8c8c17393bddb4ecb2de0621d5401ba9 not found: ID does not exist" containerID="f12f6f4d3aa0a970eb509a848c17fccf8c8c17393bddb4ecb2de0621d5401ba9" Feb 20 10:29:18 crc kubenswrapper[4962]: I0220 10:29:18.581923 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f12f6f4d3aa0a970eb509a848c17fccf8c8c17393bddb4ecb2de0621d5401ba9"} err="failed to get container status \"f12f6f4d3aa0a970eb509a848c17fccf8c8c17393bddb4ecb2de0621d5401ba9\": rpc error: code = NotFound desc = could not find container \"f12f6f4d3aa0a970eb509a848c17fccf8c8c17393bddb4ecb2de0621d5401ba9\": container with ID starting with f12f6f4d3aa0a970eb509a848c17fccf8c8c17393bddb4ecb2de0621d5401ba9 not found: ID does not exist" Feb 20 10:29:19 crc kubenswrapper[4962]: I0220 10:29:19.150410 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="128802f0-1918-4aaf-bff0-fd43fe96a1ac" path="/var/lib/kubelet/pods/128802f0-1918-4aaf-bff0-fd43fe96a1ac/volumes" Feb 20 10:30:00 crc kubenswrapper[4962]: I0220 10:30:00.163903 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526390-2v7sg"] Feb 20 10:30:00 crc kubenswrapper[4962]: E0220 10:30:00.167264 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe8d972b-b855-4f15-b7ad-10530b2b31ed" containerName="registry-server" Feb 20 10:30:00 crc kubenswrapper[4962]: I0220 10:30:00.167298 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe8d972b-b855-4f15-b7ad-10530b2b31ed" containerName="registry-server" Feb 20 10:30:00 crc kubenswrapper[4962]: E0220 10:30:00.167328 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cbf1964-98e2-467d-ba01-7724a5f1a71c" containerName="registry-server" Feb 20 10:30:00 crc kubenswrapper[4962]: I0220 10:30:00.167346 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cbf1964-98e2-467d-ba01-7724a5f1a71c" containerName="registry-server" Feb 20 10:30:00 crc kubenswrapper[4962]: E0220 10:30:00.167385 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="128802f0-1918-4aaf-bff0-fd43fe96a1ac" containerName="extract-utilities" Feb 20 10:30:00 crc kubenswrapper[4962]: I0220 10:30:00.167407 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="128802f0-1918-4aaf-bff0-fd43fe96a1ac" containerName="extract-utilities" Feb 20 10:30:00 crc kubenswrapper[4962]: E0220 10:30:00.167429 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cbf1964-98e2-467d-ba01-7724a5f1a71c" containerName="extract-utilities" Feb 20 10:30:00 crc kubenswrapper[4962]: I0220 10:30:00.167446 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cbf1964-98e2-467d-ba01-7724a5f1a71c" containerName="extract-utilities" Feb 20 10:30:00 crc kubenswrapper[4962]: E0220 10:30:00.167500 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe8d972b-b855-4f15-b7ad-10530b2b31ed" containerName="extract-utilities" Feb 20 10:30:00 crc kubenswrapper[4962]: I0220 10:30:00.167518 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe8d972b-b855-4f15-b7ad-10530b2b31ed" containerName="extract-utilities" Feb 20 10:30:00 crc kubenswrapper[4962]: E0220 10:30:00.167541 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe8d972b-b855-4f15-b7ad-10530b2b31ed" 
containerName="extract-content" Feb 20 10:30:00 crc kubenswrapper[4962]: I0220 10:30:00.167557 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe8d972b-b855-4f15-b7ad-10530b2b31ed" containerName="extract-content" Feb 20 10:30:00 crc kubenswrapper[4962]: E0220 10:30:00.167582 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="128802f0-1918-4aaf-bff0-fd43fe96a1ac" containerName="extract-content" Feb 20 10:30:00 crc kubenswrapper[4962]: I0220 10:30:00.167631 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="128802f0-1918-4aaf-bff0-fd43fe96a1ac" containerName="extract-content" Feb 20 10:30:00 crc kubenswrapper[4962]: E0220 10:30:00.167662 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cbf1964-98e2-467d-ba01-7724a5f1a71c" containerName="extract-content" Feb 20 10:30:00 crc kubenswrapper[4962]: I0220 10:30:00.167679 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cbf1964-98e2-467d-ba01-7724a5f1a71c" containerName="extract-content" Feb 20 10:30:00 crc kubenswrapper[4962]: E0220 10:30:00.167720 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="128802f0-1918-4aaf-bff0-fd43fe96a1ac" containerName="registry-server" Feb 20 10:30:00 crc kubenswrapper[4962]: I0220 10:30:00.167737 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="128802f0-1918-4aaf-bff0-fd43fe96a1ac" containerName="registry-server" Feb 20 10:30:00 crc kubenswrapper[4962]: I0220 10:30:00.168062 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="128802f0-1918-4aaf-bff0-fd43fe96a1ac" containerName="registry-server" Feb 20 10:30:00 crc kubenswrapper[4962]: I0220 10:30:00.168095 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe8d972b-b855-4f15-b7ad-10530b2b31ed" containerName="registry-server" Feb 20 10:30:00 crc kubenswrapper[4962]: I0220 10:30:00.168116 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cbf1964-98e2-467d-ba01-7724a5f1a71c" containerName="registry-server" Feb 20 10:30:00 crc kubenswrapper[4962]: I0220 10:30:00.169125 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526390-2v7sg" Feb 20 10:30:00 crc kubenswrapper[4962]: I0220 10:30:00.174093 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 20 10:30:00 crc kubenswrapper[4962]: I0220 10:30:00.177792 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526390-2v7sg"] Feb 20 10:30:00 crc kubenswrapper[4962]: I0220 10:30:00.178666 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 20 10:30:00 crc kubenswrapper[4962]: I0220 10:30:00.276274 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6a57c99f-e682-43fc-85be-d6ca9b32dd2e-secret-volume\") pod \"collect-profiles-29526390-2v7sg\" (UID: \"6a57c99f-e682-43fc-85be-d6ca9b32dd2e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526390-2v7sg" Feb 20 10:30:00 crc kubenswrapper[4962]: I0220 10:30:00.276769 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6a57c99f-e682-43fc-85be-d6ca9b32dd2e-config-volume\") pod \"collect-profiles-29526390-2v7sg\" (UID: \"6a57c99f-e682-43fc-85be-d6ca9b32dd2e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526390-2v7sg" Feb 20 10:30:00 crc kubenswrapper[4962]: I0220 10:30:00.276865 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxz89\" (UniqueName: \"kubernetes.io/projected/6a57c99f-e682-43fc-85be-d6ca9b32dd2e-kube-api-access-rxz89\") pod \"collect-profiles-29526390-2v7sg\" (UID: \"6a57c99f-e682-43fc-85be-d6ca9b32dd2e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526390-2v7sg" Feb 20 10:30:00 crc kubenswrapper[4962]: I0220 10:30:00.378650 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6a57c99f-e682-43fc-85be-d6ca9b32dd2e-config-volume\") pod \"collect-profiles-29526390-2v7sg\" (UID: \"6a57c99f-e682-43fc-85be-d6ca9b32dd2e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526390-2v7sg" Feb 20 10:30:00 crc kubenswrapper[4962]: I0220 10:30:00.378782 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxz89\" (UniqueName: \"kubernetes.io/projected/6a57c99f-e682-43fc-85be-d6ca9b32dd2e-kube-api-access-rxz89\") pod \"collect-profiles-29526390-2v7sg\" (UID: \"6a57c99f-e682-43fc-85be-d6ca9b32dd2e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526390-2v7sg" Feb 20 10:30:00 crc kubenswrapper[4962]: I0220 10:30:00.378881 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6a57c99f-e682-43fc-85be-d6ca9b32dd2e-secret-volume\") pod \"collect-profiles-29526390-2v7sg\" (UID: \"6a57c99f-e682-43fc-85be-d6ca9b32dd2e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526390-2v7sg" Feb 20 10:30:00 crc kubenswrapper[4962]: I0220 10:30:00.380349 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6a57c99f-e682-43fc-85be-d6ca9b32dd2e-config-volume\") pod 
\"collect-profiles-29526390-2v7sg\" (UID: \"6a57c99f-e682-43fc-85be-d6ca9b32dd2e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526390-2v7sg" Feb 20 10:30:00 crc kubenswrapper[4962]: I0220 10:30:00.392315 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6a57c99f-e682-43fc-85be-d6ca9b32dd2e-secret-volume\") pod \"collect-profiles-29526390-2v7sg\" (UID: \"6a57c99f-e682-43fc-85be-d6ca9b32dd2e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526390-2v7sg" Feb 20 10:30:00 crc kubenswrapper[4962]: I0220 10:30:00.402568 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxz89\" (UniqueName: \"kubernetes.io/projected/6a57c99f-e682-43fc-85be-d6ca9b32dd2e-kube-api-access-rxz89\") pod \"collect-profiles-29526390-2v7sg\" (UID: \"6a57c99f-e682-43fc-85be-d6ca9b32dd2e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526390-2v7sg" Feb 20 10:30:00 crc kubenswrapper[4962]: I0220 10:30:00.503984 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526390-2v7sg" Feb 20 10:30:00 crc kubenswrapper[4962]: I0220 10:30:00.994450 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526390-2v7sg"] Feb 20 10:30:01 crc kubenswrapper[4962]: I0220 10:30:01.864870 4962 generic.go:334] "Generic (PLEG): container finished" podID="6a57c99f-e682-43fc-85be-d6ca9b32dd2e" containerID="672fb8e4a3790f1f70ac1c9ed16383d55019a5b81bdd7e7049f12caa51ab0535" exitCode=0 Feb 20 10:30:01 crc kubenswrapper[4962]: I0220 10:30:01.864971 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526390-2v7sg" event={"ID":"6a57c99f-e682-43fc-85be-d6ca9b32dd2e","Type":"ContainerDied","Data":"672fb8e4a3790f1f70ac1c9ed16383d55019a5b81bdd7e7049f12caa51ab0535"} Feb 20 10:30:01 crc kubenswrapper[4962]: I0220 10:30:01.865222 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526390-2v7sg" event={"ID":"6a57c99f-e682-43fc-85be-d6ca9b32dd2e","Type":"ContainerStarted","Data":"98d481bd65aecc0a3a37c338808380116b6fcb68eddb9e99a185f9fe8e8723cd"} Feb 20 10:30:03 crc kubenswrapper[4962]: I0220 10:30:03.296717 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526390-2v7sg" Feb 20 10:30:03 crc kubenswrapper[4962]: I0220 10:30:03.427252 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6a57c99f-e682-43fc-85be-d6ca9b32dd2e-config-volume\") pod \"6a57c99f-e682-43fc-85be-d6ca9b32dd2e\" (UID: \"6a57c99f-e682-43fc-85be-d6ca9b32dd2e\") " Feb 20 10:30:03 crc kubenswrapper[4962]: I0220 10:30:03.427388 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6a57c99f-e682-43fc-85be-d6ca9b32dd2e-secret-volume\") pod \"6a57c99f-e682-43fc-85be-d6ca9b32dd2e\" (UID: \"6a57c99f-e682-43fc-85be-d6ca9b32dd2e\") " Feb 20 10:30:03 crc kubenswrapper[4962]: I0220 10:30:03.427533 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rxz89\" (UniqueName: \"kubernetes.io/projected/6a57c99f-e682-43fc-85be-d6ca9b32dd2e-kube-api-access-rxz89\") pod \"6a57c99f-e682-43fc-85be-d6ca9b32dd2e\" (UID: \"6a57c99f-e682-43fc-85be-d6ca9b32dd2e\") " Feb 20 10:30:03 crc kubenswrapper[4962]: I0220 10:30:03.428434 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a57c99f-e682-43fc-85be-d6ca9b32dd2e-config-volume" (OuterVolumeSpecName: "config-volume") pod "6a57c99f-e682-43fc-85be-d6ca9b32dd2e" (UID: "6a57c99f-e682-43fc-85be-d6ca9b32dd2e"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:30:03 crc kubenswrapper[4962]: I0220 10:30:03.432652 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a57c99f-e682-43fc-85be-d6ca9b32dd2e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "6a57c99f-e682-43fc-85be-d6ca9b32dd2e" (UID: "6a57c99f-e682-43fc-85be-d6ca9b32dd2e"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:30:03 crc kubenswrapper[4962]: I0220 10:30:03.434557 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a57c99f-e682-43fc-85be-d6ca9b32dd2e-kube-api-access-rxz89" (OuterVolumeSpecName: "kube-api-access-rxz89") pod "6a57c99f-e682-43fc-85be-d6ca9b32dd2e" (UID: "6a57c99f-e682-43fc-85be-d6ca9b32dd2e"). InnerVolumeSpecName "kube-api-access-rxz89". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:30:03 crc kubenswrapper[4962]: I0220 10:30:03.529662 4962 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6a57c99f-e682-43fc-85be-d6ca9b32dd2e-config-volume\") on node \"crc\" DevicePath \"\"" Feb 20 10:30:03 crc kubenswrapper[4962]: I0220 10:30:03.529697 4962 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6a57c99f-e682-43fc-85be-d6ca9b32dd2e-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 20 10:30:03 crc kubenswrapper[4962]: I0220 10:30:03.529711 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rxz89\" (UniqueName: \"kubernetes.io/projected/6a57c99f-e682-43fc-85be-d6ca9b32dd2e-kube-api-access-rxz89\") on node \"crc\" DevicePath \"\"" Feb 20 10:30:03 crc kubenswrapper[4962]: I0220 10:30:03.887647 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526390-2v7sg" event={"ID":"6a57c99f-e682-43fc-85be-d6ca9b32dd2e","Type":"ContainerDied","Data":"98d481bd65aecc0a3a37c338808380116b6fcb68eddb9e99a185f9fe8e8723cd"} Feb 20 10:30:03 crc kubenswrapper[4962]: I0220 10:30:03.887709 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="98d481bd65aecc0a3a37c338808380116b6fcb68eddb9e99a185f9fe8e8723cd" Feb 20 10:30:03 crc kubenswrapper[4962]: I0220 10:30:03.887737 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526390-2v7sg" Feb 20 10:30:04 crc kubenswrapper[4962]: I0220 10:30:04.401068 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526345-4v6dw"] Feb 20 10:30:04 crc kubenswrapper[4962]: I0220 10:30:04.411313 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526345-4v6dw"] Feb 20 10:30:05 crc kubenswrapper[4962]: I0220 10:30:05.156655 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3652dbd-dae4-462b-be88-b8a782de8a1c" path="/var/lib/kubelet/pods/a3652dbd-dae4-462b-be88-b8a782de8a1c/volumes" Feb 20 10:30:15 crc kubenswrapper[4962]: I0220 10:30:15.880676 4962 scope.go:117] "RemoveContainer" containerID="e8531bc42f535f5fdb200b255f9b4197b26b17fb00311859ea4571ac343f8767" Feb 20 10:30:41 crc kubenswrapper[4962]: I0220 10:30:41.507855 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 10:30:41 crc kubenswrapper[4962]: I0220 10:30:41.508739 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 10:31:11 crc kubenswrapper[4962]: I0220 10:31:11.508078 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Feb 20 10:31:11 crc kubenswrapper[4962]: I0220 10:31:11.508758 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 10:31:41 crc kubenswrapper[4962]: I0220 10:31:41.508112 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 10:31:41 crc kubenswrapper[4962]: I0220 10:31:41.508828 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 10:31:41 crc kubenswrapper[4962]: I0220 10:31:41.508919 4962 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" Feb 20 10:31:41 crc kubenswrapper[4962]: I0220 10:31:41.509887 4962 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bd64da6639184a23e03b94b79a7e7b45218fcf22df547dc77c2a997cc3799a1e"} pod="openshift-machine-config-operator/machine-config-daemon-m9d46" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 10:31:41 crc kubenswrapper[4962]: I0220 10:31:41.509990 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" containerID="cri-o://bd64da6639184a23e03b94b79a7e7b45218fcf22df547dc77c2a997cc3799a1e" gracePeriod=600 Feb 20 10:31:41 crc kubenswrapper[4962]: E0220 10:31:41.641421 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:31:41 crc kubenswrapper[4962]: I0220 10:31:41.792759 4962 generic.go:334] "Generic (PLEG): container finished" podID="751d5e0b-919c-4777-8475-ed7214f7647f" containerID="bd64da6639184a23e03b94b79a7e7b45218fcf22df547dc77c2a997cc3799a1e" exitCode=0 Feb 20 10:31:41 crc kubenswrapper[4962]: I0220 10:31:41.792783 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" event={"ID":"751d5e0b-919c-4777-8475-ed7214f7647f","Type":"ContainerDied","Data":"bd64da6639184a23e03b94b79a7e7b45218fcf22df547dc77c2a997cc3799a1e"} Feb 20 10:31:41 crc kubenswrapper[4962]: I0220 10:31:41.792947 4962 scope.go:117] "RemoveContainer" containerID="c5a3ed5d43365534c80fd6638118a5ca99f999ebea8670342afd3d7c63212fde" Feb 20 10:31:41 crc kubenswrapper[4962]: I0220 10:31:41.794536 4962 scope.go:117] "RemoveContainer" 
containerID="bd64da6639184a23e03b94b79a7e7b45218fcf22df547dc77c2a997cc3799a1e" Feb 20 10:31:41 crc kubenswrapper[4962]: E0220 10:31:41.795776 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:31:53 crc kubenswrapper[4962]: I0220 10:31:53.139088 4962 scope.go:117] "RemoveContainer" containerID="bd64da6639184a23e03b94b79a7e7b45218fcf22df547dc77c2a997cc3799a1e" Feb 20 10:31:53 crc kubenswrapper[4962]: E0220 10:31:53.140143 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:32:06 crc kubenswrapper[4962]: I0220 10:32:06.139035 4962 scope.go:117] "RemoveContainer" containerID="bd64da6639184a23e03b94b79a7e7b45218fcf22df547dc77c2a997cc3799a1e" Feb 20 10:32:06 crc kubenswrapper[4962]: E0220 10:32:06.140247 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:32:18 crc kubenswrapper[4962]: I0220 10:32:18.138881 4962 scope.go:117] "RemoveContainer" containerID="bd64da6639184a23e03b94b79a7e7b45218fcf22df547dc77c2a997cc3799a1e" Feb 20 10:32:18 crc kubenswrapper[4962]: E0220 10:32:18.140219 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:32:30 crc kubenswrapper[4962]: I0220 10:32:30.141023 4962 scope.go:117] "RemoveContainer" containerID="bd64da6639184a23e03b94b79a7e7b45218fcf22df547dc77c2a997cc3799a1e" Feb 20 10:32:30 crc kubenswrapper[4962]: E0220 10:32:30.141907 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:32:43 crc kubenswrapper[4962]: I0220 10:32:43.139012 4962 scope.go:117] "RemoveContainer" containerID="bd64da6639184a23e03b94b79a7e7b45218fcf22df547dc77c2a997cc3799a1e" Feb 20 10:32:43 crc kubenswrapper[4962]: E0220 10:32:43.140293 4962 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:32:58 crc kubenswrapper[4962]: I0220 10:32:58.138994 4962 scope.go:117] "RemoveContainer" containerID="bd64da6639184a23e03b94b79a7e7b45218fcf22df547dc77c2a997cc3799a1e" Feb 20 10:32:58 crc kubenswrapper[4962]: E0220 10:32:58.142314 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:33:13 crc kubenswrapper[4962]: I0220 10:33:13.138907 4962 scope.go:117] "RemoveContainer" containerID="bd64da6639184a23e03b94b79a7e7b45218fcf22df547dc77c2a997cc3799a1e" Feb 20 10:33:13 crc kubenswrapper[4962]: E0220 10:33:13.139510 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:33:28 crc kubenswrapper[4962]: I0220 10:33:28.139223 4962 scope.go:117] "RemoveContainer" containerID="bd64da6639184a23e03b94b79a7e7b45218fcf22df547dc77c2a997cc3799a1e" Feb 20 10:33:28 crc kubenswrapper[4962]: E0220 10:33:28.140351 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:33:39 crc kubenswrapper[4962]: I0220 10:33:39.147952 4962 scope.go:117] "RemoveContainer" containerID="bd64da6639184a23e03b94b79a7e7b45218fcf22df547dc77c2a997cc3799a1e" Feb 20 10:33:39 crc kubenswrapper[4962]: E0220 10:33:39.149191 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:33:53 crc kubenswrapper[4962]: I0220 10:33:53.139175 4962 scope.go:117] "RemoveContainer" containerID="bd64da6639184a23e03b94b79a7e7b45218fcf22df547dc77c2a997cc3799a1e" Feb 20 10:33:53 crc kubenswrapper[4962]: E0220 10:33:53.140509 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:34:07 crc kubenswrapper[4962]: I0220 10:34:07.139640 4962 scope.go:117] "RemoveContainer" containerID="bd64da6639184a23e03b94b79a7e7b45218fcf22df547dc77c2a997cc3799a1e" Feb 20 10:34:07 crc kubenswrapper[4962]: E0220 10:34:07.140620 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:34:20 crc kubenswrapper[4962]: I0220 10:34:20.138944 4962 scope.go:117] "RemoveContainer" containerID="bd64da6639184a23e03b94b79a7e7b45218fcf22df547dc77c2a997cc3799a1e" Feb 20 10:34:20 crc kubenswrapper[4962]: E0220 10:34:20.139944 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:34:32 crc kubenswrapper[4962]: I0220 10:34:32.139517 4962 scope.go:117] "RemoveContainer" containerID="bd64da6639184a23e03b94b79a7e7b45218fcf22df547dc77c2a997cc3799a1e" Feb 20 10:34:32 crc kubenswrapper[4962]: E0220 10:34:32.140847 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:34:43 crc kubenswrapper[4962]: I0220 10:34:43.139574 4962 scope.go:117] "RemoveContainer" containerID="bd64da6639184a23e03b94b79a7e7b45218fcf22df547dc77c2a997cc3799a1e" Feb 20 10:34:43 crc kubenswrapper[4962]: E0220 10:34:43.140861 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:34:57 crc kubenswrapper[4962]: I0220 10:34:57.139662 4962 scope.go:117] "RemoveContainer" containerID="bd64da6639184a23e03b94b79a7e7b45218fcf22df547dc77c2a997cc3799a1e" Feb 20 10:34:57 crc kubenswrapper[4962]: E0220 10:34:57.141151 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" 
podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:35:10 crc kubenswrapper[4962]: I0220 10:35:10.139956 4962 scope.go:117] "RemoveContainer" containerID="bd64da6639184a23e03b94b79a7e7b45218fcf22df547dc77c2a997cc3799a1e" Feb 20 10:35:10 crc kubenswrapper[4962]: E0220 10:35:10.141277 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:35:25 crc kubenswrapper[4962]: I0220 10:35:25.139517 4962 scope.go:117] "RemoveContainer" containerID="bd64da6639184a23e03b94b79a7e7b45218fcf22df547dc77c2a997cc3799a1e" Feb 20 10:35:25 crc kubenswrapper[4962]: E0220 10:35:25.140718 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:35:37 crc kubenswrapper[4962]: I0220 10:35:37.141912 4962 scope.go:117] "RemoveContainer" containerID="bd64da6639184a23e03b94b79a7e7b45218fcf22df547dc77c2a997cc3799a1e" Feb 20 10:35:37 crc kubenswrapper[4962]: E0220 10:35:37.143024 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:35:49 crc kubenswrapper[4962]: I0220 10:35:49.146211 4962 scope.go:117] "RemoveContainer" containerID="bd64da6639184a23e03b94b79a7e7b45218fcf22df547dc77c2a997cc3799a1e" Feb 20 10:35:49 crc kubenswrapper[4962]: E0220 10:35:49.146945 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:36:03 crc kubenswrapper[4962]: I0220 10:36:03.140042 4962 scope.go:117] "RemoveContainer" containerID="bd64da6639184a23e03b94b79a7e7b45218fcf22df547dc77c2a997cc3799a1e" Feb 20 10:36:03 crc kubenswrapper[4962]: E0220 10:36:03.141153 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:36:18 crc kubenswrapper[4962]: I0220 10:36:18.139502 4962 scope.go:117] "RemoveContainer" 
containerID="bd64da6639184a23e03b94b79a7e7b45218fcf22df547dc77c2a997cc3799a1e" Feb 20 10:36:18 crc kubenswrapper[4962]: E0220 10:36:18.140470 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:36:31 crc kubenswrapper[4962]: I0220 10:36:31.140200 4962 scope.go:117] "RemoveContainer" containerID="bd64da6639184a23e03b94b79a7e7b45218fcf22df547dc77c2a997cc3799a1e" Feb 20 10:36:31 crc kubenswrapper[4962]: E0220 10:36:31.141559 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:36:44 crc kubenswrapper[4962]: I0220 10:36:44.138999 4962 scope.go:117] "RemoveContainer" containerID="bd64da6639184a23e03b94b79a7e7b45218fcf22df547dc77c2a997cc3799a1e" Feb 20 10:36:44 crc kubenswrapper[4962]: I0220 10:36:44.616481 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" event={"ID":"751d5e0b-919c-4777-8475-ed7214f7647f","Type":"ContainerStarted","Data":"42a27fff0d251c257e515176e7d37cbd4d1c37cf56ff04f11c04672d654f700f"} Feb 20 10:38:55 crc kubenswrapper[4962]: I0220 10:38:55.403409 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-nnkmg"] Feb 20 10:38:55 crc kubenswrapper[4962]: E0220 10:38:55.404434 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a57c99f-e682-43fc-85be-d6ca9b32dd2e" containerName="collect-profiles" Feb 20 10:38:55 crc kubenswrapper[4962]: I0220 10:38:55.404457 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a57c99f-e682-43fc-85be-d6ca9b32dd2e" containerName="collect-profiles" Feb 20 10:38:55 crc kubenswrapper[4962]: I0220 10:38:55.404649 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a57c99f-e682-43fc-85be-d6ca9b32dd2e" containerName="collect-profiles" Feb 20 10:38:55 crc kubenswrapper[4962]: I0220 10:38:55.405929 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nnkmg" Feb 20 10:38:55 crc kubenswrapper[4962]: I0220 10:38:55.410553 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a186f0cc-4ee5-4c45-9bf0-49f496ed709b-catalog-content\") pod \"redhat-operators-nnkmg\" (UID: \"a186f0cc-4ee5-4c45-9bf0-49f496ed709b\") " pod="openshift-marketplace/redhat-operators-nnkmg" Feb 20 10:38:55 crc kubenswrapper[4962]: I0220 10:38:55.410958 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a186f0cc-4ee5-4c45-9bf0-49f496ed709b-utilities\") pod \"redhat-operators-nnkmg\" (UID: \"a186f0cc-4ee5-4c45-9bf0-49f496ed709b\") " pod="openshift-marketplace/redhat-operators-nnkmg" Feb 20 10:38:55 crc kubenswrapper[4962]: I0220 10:38:55.411106 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rf299\" (UniqueName: \"kubernetes.io/projected/a186f0cc-4ee5-4c45-9bf0-49f496ed709b-kube-api-access-rf299\") pod \"redhat-operators-nnkmg\" (UID: \"a186f0cc-4ee5-4c45-9bf0-49f496ed709b\") " pod="openshift-marketplace/redhat-operators-nnkmg" Feb 20 10:38:55 crc kubenswrapper[4962]: I0220 10:38:55.418441 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nnkmg"] Feb 20 10:38:55 crc kubenswrapper[4962]: I0220 10:38:55.511857 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a186f0cc-4ee5-4c45-9bf0-49f496ed709b-utilities\") pod \"redhat-operators-nnkmg\" (UID: \"a186f0cc-4ee5-4c45-9bf0-49f496ed709b\") " pod="openshift-marketplace/redhat-operators-nnkmg" Feb 20 10:38:55 crc kubenswrapper[4962]: I0220 10:38:55.511917 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rf299\" (UniqueName: \"kubernetes.io/projected/a186f0cc-4ee5-4c45-9bf0-49f496ed709b-kube-api-access-rf299\") pod \"redhat-operators-nnkmg\" (UID: \"a186f0cc-4ee5-4c45-9bf0-49f496ed709b\") " pod="openshift-marketplace/redhat-operators-nnkmg" Feb 20 10:38:55 crc kubenswrapper[4962]: I0220 10:38:55.511966 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a186f0cc-4ee5-4c45-9bf0-49f496ed709b-catalog-content\") pod \"redhat-operators-nnkmg\" (UID: \"a186f0cc-4ee5-4c45-9bf0-49f496ed709b\") " pod="openshift-marketplace/redhat-operators-nnkmg" Feb 20 10:38:55 crc kubenswrapper[4962]: I0220 10:38:55.512438 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a186f0cc-4ee5-4c45-9bf0-49f496ed709b-utilities\") pod \"redhat-operators-nnkmg\" (UID: \"a186f0cc-4ee5-4c45-9bf0-49f496ed709b\") " pod="openshift-marketplace/redhat-operators-nnkmg" Feb 20 10:38:55 crc kubenswrapper[4962]: I0220 10:38:55.512522 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a186f0cc-4ee5-4c45-9bf0-49f496ed709b-catalog-content\") pod \"redhat-operators-nnkmg\" (UID: \"a186f0cc-4ee5-4c45-9bf0-49f496ed709b\") " pod="openshift-marketplace/redhat-operators-nnkmg" Feb 20 10:38:55 crc kubenswrapper[4962]: I0220 10:38:55.551432 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-rf299\" (UniqueName: \"kubernetes.io/projected/a186f0cc-4ee5-4c45-9bf0-49f496ed709b-kube-api-access-rf299\") pod \"redhat-operators-nnkmg\" (UID: \"a186f0cc-4ee5-4c45-9bf0-49f496ed709b\") " pod="openshift-marketplace/redhat-operators-nnkmg" Feb 20 10:38:55 crc kubenswrapper[4962]: I0220 10:38:55.735369 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nnkmg" Feb 20 10:38:56 crc kubenswrapper[4962]: I0220 10:38:56.276248 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nnkmg"] Feb 20 10:38:56 crc kubenswrapper[4962]: I0220 10:38:56.791411 4962 generic.go:334] "Generic (PLEG): container finished" podID="a186f0cc-4ee5-4c45-9bf0-49f496ed709b" containerID="ef5252e75a2fb2287ff95c4efdd2c350916894655588c1384a4f97f7b863fb82" exitCode=0 Feb 20 10:38:56 crc kubenswrapper[4962]: I0220 10:38:56.791465 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nnkmg" event={"ID":"a186f0cc-4ee5-4c45-9bf0-49f496ed709b","Type":"ContainerDied","Data":"ef5252e75a2fb2287ff95c4efdd2c350916894655588c1384a4f97f7b863fb82"} Feb 20 10:38:56 crc kubenswrapper[4962]: I0220 10:38:56.791532 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nnkmg" event={"ID":"a186f0cc-4ee5-4c45-9bf0-49f496ed709b","Type":"ContainerStarted","Data":"b1a939161b7b0cf6af29a1b96d2fbba2e4c10ba8217fb1c3bc0cdfa63f4d008d"} Feb 20 10:38:56 crc kubenswrapper[4962]: I0220 10:38:56.793366 4962 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 20 10:38:57 crc kubenswrapper[4962]: I0220 10:38:57.803435 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nnkmg" event={"ID":"a186f0cc-4ee5-4c45-9bf0-49f496ed709b","Type":"ContainerStarted","Data":"8de815bc2a7f7e43d35b3fb189422a94f098f63b110daf74939288058c35720d"} Feb 20 10:38:58 crc kubenswrapper[4962]: I0220 10:38:58.813655 4962 generic.go:334] "Generic (PLEG): container finished" podID="a186f0cc-4ee5-4c45-9bf0-49f496ed709b" containerID="8de815bc2a7f7e43d35b3fb189422a94f098f63b110daf74939288058c35720d" exitCode=0 Feb 20 10:38:58 crc kubenswrapper[4962]: I0220 10:38:58.813744 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nnkmg" event={"ID":"a186f0cc-4ee5-4c45-9bf0-49f496ed709b","Type":"ContainerDied","Data":"8de815bc2a7f7e43d35b3fb189422a94f098f63b110daf74939288058c35720d"} Feb 20 10:38:59 crc kubenswrapper[4962]: I0220 10:38:59.823907 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nnkmg" event={"ID":"a186f0cc-4ee5-4c45-9bf0-49f496ed709b","Type":"ContainerStarted","Data":"f1f58afb4858bef31dd12edf1e27343fed1bc81c5ef9f37ac2fe4f8f6ea78d5c"} Feb 20 10:38:59 crc kubenswrapper[4962]: I0220 10:38:59.855365 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-nnkmg" podStartSLOduration=2.417982249 podStartE2EDuration="4.855348187s" podCreationTimestamp="2026-02-20 10:38:55 +0000 UTC" firstStartedPulling="2026-02-20 10:38:56.793174521 +0000 UTC m=+2628.375646367" lastFinishedPulling="2026-02-20 10:38:59.230540449 +0000 UTC m=+2630.813012305" observedRunningTime="2026-02-20 10:38:59.850884109 +0000 UTC m=+2631.433355995" watchObservedRunningTime="2026-02-20 10:38:59.855348187 +0000 UTC m=+2631.437820043" Feb 20 10:39:05 crc 
kubenswrapper[4962]: I0220 10:39:05.736324 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-nnkmg" Feb 20 10:39:05 crc kubenswrapper[4962]: I0220 10:39:05.737459 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-nnkmg" Feb 20 10:39:06 crc kubenswrapper[4962]: I0220 10:39:06.808224 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-nnkmg" podUID="a186f0cc-4ee5-4c45-9bf0-49f496ed709b" containerName="registry-server" probeResult="failure" output=< Feb 20 10:39:06 crc kubenswrapper[4962]: timeout: failed to connect service ":50051" within 1s Feb 20 10:39:06 crc kubenswrapper[4962]: > Feb 20 10:39:11 crc kubenswrapper[4962]: I0220 10:39:11.508196 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 10:39:11 crc kubenswrapper[4962]: I0220 10:39:11.508627 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 10:39:15 crc kubenswrapper[4962]: I0220 10:39:15.806075 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-nnkmg" Feb 20 10:39:15 crc kubenswrapper[4962]: I0220 10:39:15.869162 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-nnkmg" Feb 20 10:39:16 crc kubenswrapper[4962]: I0220 10:39:16.058278 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nnkmg"] Feb 20 10:39:16 crc kubenswrapper[4962]: I0220 10:39:16.983265 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-nnkmg" podUID="a186f0cc-4ee5-4c45-9bf0-49f496ed709b" containerName="registry-server" containerID="cri-o://f1f58afb4858bef31dd12edf1e27343fed1bc81c5ef9f37ac2fe4f8f6ea78d5c" gracePeriod=2 Feb 20 10:39:17 crc kubenswrapper[4962]: I0220 10:39:17.473498 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nnkmg" Feb 20 10:39:17 crc kubenswrapper[4962]: I0220 10:39:17.486708 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a186f0cc-4ee5-4c45-9bf0-49f496ed709b-utilities\") pod \"a186f0cc-4ee5-4c45-9bf0-49f496ed709b\" (UID: \"a186f0cc-4ee5-4c45-9bf0-49f496ed709b\") " Feb 20 10:39:17 crc kubenswrapper[4962]: I0220 10:39:17.486924 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rf299\" (UniqueName: \"kubernetes.io/projected/a186f0cc-4ee5-4c45-9bf0-49f496ed709b-kube-api-access-rf299\") pod \"a186f0cc-4ee5-4c45-9bf0-49f496ed709b\" (UID: \"a186f0cc-4ee5-4c45-9bf0-49f496ed709b\") " Feb 20 10:39:17 crc kubenswrapper[4962]: I0220 10:39:17.486951 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a186f0cc-4ee5-4c45-9bf0-49f496ed709b-catalog-content\") pod \"a186f0cc-4ee5-4c45-9bf0-49f496ed709b\" (UID: \"a186f0cc-4ee5-4c45-9bf0-49f496ed709b\") " Feb 20 10:39:17 crc kubenswrapper[4962]: I0220 10:39:17.490268 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a186f0cc-4ee5-4c45-9bf0-49f496ed709b-utilities" (OuterVolumeSpecName: "utilities") pod "a186f0cc-4ee5-4c45-9bf0-49f496ed709b" (UID: "a186f0cc-4ee5-4c45-9bf0-49f496ed709b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:39:17 crc kubenswrapper[4962]: I0220 10:39:17.494179 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a186f0cc-4ee5-4c45-9bf0-49f496ed709b-kube-api-access-rf299" (OuterVolumeSpecName: "kube-api-access-rf299") pod "a186f0cc-4ee5-4c45-9bf0-49f496ed709b" (UID: "a186f0cc-4ee5-4c45-9bf0-49f496ed709b"). InnerVolumeSpecName "kube-api-access-rf299". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:39:17 crc kubenswrapper[4962]: I0220 10:39:17.588334 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rf299\" (UniqueName: \"kubernetes.io/projected/a186f0cc-4ee5-4c45-9bf0-49f496ed709b-kube-api-access-rf299\") on node \"crc\" DevicePath \"\"" Feb 20 10:39:17 crc kubenswrapper[4962]: I0220 10:39:17.588372 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a186f0cc-4ee5-4c45-9bf0-49f496ed709b-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 10:39:17 crc kubenswrapper[4962]: I0220 10:39:17.634416 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a186f0cc-4ee5-4c45-9bf0-49f496ed709b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a186f0cc-4ee5-4c45-9bf0-49f496ed709b" (UID: "a186f0cc-4ee5-4c45-9bf0-49f496ed709b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:39:17 crc kubenswrapper[4962]: I0220 10:39:17.689298 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a186f0cc-4ee5-4c45-9bf0-49f496ed709b-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 10:39:17 crc kubenswrapper[4962]: I0220 10:39:17.997085 4962 generic.go:334] "Generic (PLEG): container finished" podID="a186f0cc-4ee5-4c45-9bf0-49f496ed709b" containerID="f1f58afb4858bef31dd12edf1e27343fed1bc81c5ef9f37ac2fe4f8f6ea78d5c" exitCode=0 Feb 20 10:39:17 crc kubenswrapper[4962]: I0220 10:39:17.997132 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nnkmg" event={"ID":"a186f0cc-4ee5-4c45-9bf0-49f496ed709b","Type":"ContainerDied","Data":"f1f58afb4858bef31dd12edf1e27343fed1bc81c5ef9f37ac2fe4f8f6ea78d5c"} Feb 20 10:39:17 crc kubenswrapper[4962]: I0220 10:39:17.997159 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nnkmg" event={"ID":"a186f0cc-4ee5-4c45-9bf0-49f496ed709b","Type":"ContainerDied","Data":"b1a939161b7b0cf6af29a1b96d2fbba2e4c10ba8217fb1c3bc0cdfa63f4d008d"} Feb 20 10:39:17 crc kubenswrapper[4962]: I0220 10:39:17.997161 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nnkmg" Feb 20 10:39:17 crc kubenswrapper[4962]: I0220 10:39:17.997198 4962 scope.go:117] "RemoveContainer" containerID="f1f58afb4858bef31dd12edf1e27343fed1bc81c5ef9f37ac2fe4f8f6ea78d5c" Feb 20 10:39:18 crc kubenswrapper[4962]: I0220 10:39:18.043880 4962 scope.go:117] "RemoveContainer" containerID="8de815bc2a7f7e43d35b3fb189422a94f098f63b110daf74939288058c35720d" Feb 20 10:39:18 crc kubenswrapper[4962]: I0220 10:39:18.047011 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nnkmg"] Feb 20 10:39:18 crc kubenswrapper[4962]: I0220 10:39:18.057095 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-nnkmg"] Feb 20 10:39:18 crc kubenswrapper[4962]: I0220 10:39:18.076934 4962 scope.go:117] "RemoveContainer" containerID="ef5252e75a2fb2287ff95c4efdd2c350916894655588c1384a4f97f7b863fb82" Feb 20 10:39:18 crc kubenswrapper[4962]: I0220 10:39:18.109816 4962 scope.go:117] "RemoveContainer" containerID="f1f58afb4858bef31dd12edf1e27343fed1bc81c5ef9f37ac2fe4f8f6ea78d5c" Feb 20 10:39:18 crc kubenswrapper[4962]: E0220 10:39:18.110573 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1f58afb4858bef31dd12edf1e27343fed1bc81c5ef9f37ac2fe4f8f6ea78d5c\": container with ID starting with f1f58afb4858bef31dd12edf1e27343fed1bc81c5ef9f37ac2fe4f8f6ea78d5c not found: ID does not exist" containerID="f1f58afb4858bef31dd12edf1e27343fed1bc81c5ef9f37ac2fe4f8f6ea78d5c" Feb 20 10:39:18 crc kubenswrapper[4962]: I0220 10:39:18.110800 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1f58afb4858bef31dd12edf1e27343fed1bc81c5ef9f37ac2fe4f8f6ea78d5c"} err="failed to get container status \"f1f58afb4858bef31dd12edf1e27343fed1bc81c5ef9f37ac2fe4f8f6ea78d5c\": rpc error: code = NotFound desc = could not find container \"f1f58afb4858bef31dd12edf1e27343fed1bc81c5ef9f37ac2fe4f8f6ea78d5c\": container with ID starting with f1f58afb4858bef31dd12edf1e27343fed1bc81c5ef9f37ac2fe4f8f6ea78d5c not found: ID does not exist" Feb 20 10:39:18 crc 
kubenswrapper[4962]: I0220 10:39:18.111087 4962 scope.go:117] "RemoveContainer" containerID="8de815bc2a7f7e43d35b3fb189422a94f098f63b110daf74939288058c35720d" Feb 20 10:39:18 crc kubenswrapper[4962]: E0220 10:39:18.112018 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8de815bc2a7f7e43d35b3fb189422a94f098f63b110daf74939288058c35720d\": container with ID starting with 8de815bc2a7f7e43d35b3fb189422a94f098f63b110daf74939288058c35720d not found: ID does not exist" containerID="8de815bc2a7f7e43d35b3fb189422a94f098f63b110daf74939288058c35720d" Feb 20 10:39:18 crc kubenswrapper[4962]: I0220 10:39:18.112078 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8de815bc2a7f7e43d35b3fb189422a94f098f63b110daf74939288058c35720d"} err="failed to get container status \"8de815bc2a7f7e43d35b3fb189422a94f098f63b110daf74939288058c35720d\": rpc error: code = NotFound desc = could not find container \"8de815bc2a7f7e43d35b3fb189422a94f098f63b110daf74939288058c35720d\": container with ID starting with 8de815bc2a7f7e43d35b3fb189422a94f098f63b110daf74939288058c35720d not found: ID does not exist" Feb 20 10:39:18 crc kubenswrapper[4962]: I0220 10:39:18.112120 4962 scope.go:117] "RemoveContainer" containerID="ef5252e75a2fb2287ff95c4efdd2c350916894655588c1384a4f97f7b863fb82" Feb 20 10:39:18 crc kubenswrapper[4962]: E0220 10:39:18.112535 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef5252e75a2fb2287ff95c4efdd2c350916894655588c1384a4f97f7b863fb82\": container with ID starting with ef5252e75a2fb2287ff95c4efdd2c350916894655588c1384a4f97f7b863fb82 not found: ID does not exist" containerID="ef5252e75a2fb2287ff95c4efdd2c350916894655588c1384a4f97f7b863fb82" Feb 20 10:39:18 crc kubenswrapper[4962]: I0220 10:39:18.112717 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef5252e75a2fb2287ff95c4efdd2c350916894655588c1384a4f97f7b863fb82"} err="failed to get container status \"ef5252e75a2fb2287ff95c4efdd2c350916894655588c1384a4f97f7b863fb82\": rpc error: code = NotFound desc = could not find container \"ef5252e75a2fb2287ff95c4efdd2c350916894655588c1384a4f97f7b863fb82\": container with ID starting with ef5252e75a2fb2287ff95c4efdd2c350916894655588c1384a4f97f7b863fb82 not found: ID does not exist" Feb 20 10:39:19 crc kubenswrapper[4962]: I0220 10:39:19.152634 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a186f0cc-4ee5-4c45-9bf0-49f496ed709b" path="/var/lib/kubelet/pods/a186f0cc-4ee5-4c45-9bf0-49f496ed709b/volumes" Feb 20 10:39:37 crc kubenswrapper[4962]: I0220 10:39:37.705218 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xw4c8"] Feb 20 10:39:37 crc kubenswrapper[4962]: E0220 10:39:37.706411 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a186f0cc-4ee5-4c45-9bf0-49f496ed709b" containerName="extract-content" Feb 20 10:39:37 crc kubenswrapper[4962]: I0220 10:39:37.706434 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="a186f0cc-4ee5-4c45-9bf0-49f496ed709b" containerName="extract-content" Feb 20 10:39:37 crc kubenswrapper[4962]: E0220 10:39:37.706456 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a186f0cc-4ee5-4c45-9bf0-49f496ed709b" containerName="registry-server" Feb 20 10:39:37 crc kubenswrapper[4962]: I0220 10:39:37.706471 4962 
state_mem.go:107] "Deleted CPUSet assignment" podUID="a186f0cc-4ee5-4c45-9bf0-49f496ed709b" containerName="registry-server" Feb 20 10:39:37 crc kubenswrapper[4962]: E0220 10:39:37.706508 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a186f0cc-4ee5-4c45-9bf0-49f496ed709b" containerName="extract-utilities" Feb 20 10:39:37 crc kubenswrapper[4962]: I0220 10:39:37.706524 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="a186f0cc-4ee5-4c45-9bf0-49f496ed709b" containerName="extract-utilities" Feb 20 10:39:37 crc kubenswrapper[4962]: I0220 10:39:37.706848 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="a186f0cc-4ee5-4c45-9bf0-49f496ed709b" containerName="registry-server" Feb 20 10:39:37 crc kubenswrapper[4962]: I0220 10:39:37.708700 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xw4c8" Feb 20 10:39:37 crc kubenswrapper[4962]: I0220 10:39:37.722329 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xw4c8"] Feb 20 10:39:37 crc kubenswrapper[4962]: I0220 10:39:37.728207 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkcz4\" (UniqueName: \"kubernetes.io/projected/015b23ac-0880-43e0-b6f1-cfc724c572df-kube-api-access-xkcz4\") pod \"community-operators-xw4c8\" (UID: \"015b23ac-0880-43e0-b6f1-cfc724c572df\") " pod="openshift-marketplace/community-operators-xw4c8" Feb 20 10:39:37 crc kubenswrapper[4962]: I0220 10:39:37.728280 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/015b23ac-0880-43e0-b6f1-cfc724c572df-utilities\") pod \"community-operators-xw4c8\" (UID: \"015b23ac-0880-43e0-b6f1-cfc724c572df\") " pod="openshift-marketplace/community-operators-xw4c8" Feb 20 10:39:37 crc kubenswrapper[4962]: I0220 10:39:37.728501 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/015b23ac-0880-43e0-b6f1-cfc724c572df-catalog-content\") pod \"community-operators-xw4c8\" (UID: \"015b23ac-0880-43e0-b6f1-cfc724c572df\") " pod="openshift-marketplace/community-operators-xw4c8" Feb 20 10:39:37 crc kubenswrapper[4962]: I0220 10:39:37.830601 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/015b23ac-0880-43e0-b6f1-cfc724c572df-catalog-content\") pod \"community-operators-xw4c8\" (UID: \"015b23ac-0880-43e0-b6f1-cfc724c572df\") " pod="openshift-marketplace/community-operators-xw4c8" Feb 20 10:39:37 crc kubenswrapper[4962]: I0220 10:39:37.830737 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkcz4\" (UniqueName: \"kubernetes.io/projected/015b23ac-0880-43e0-b6f1-cfc724c572df-kube-api-access-xkcz4\") pod \"community-operators-xw4c8\" (UID: \"015b23ac-0880-43e0-b6f1-cfc724c572df\") " pod="openshift-marketplace/community-operators-xw4c8" Feb 20 10:39:37 crc kubenswrapper[4962]: I0220 10:39:37.830773 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/015b23ac-0880-43e0-b6f1-cfc724c572df-utilities\") pod \"community-operators-xw4c8\" (UID: \"015b23ac-0880-43e0-b6f1-cfc724c572df\") " pod="openshift-marketplace/community-operators-xw4c8" Feb 20 10:39:37 crc 
kubenswrapper[4962]: I0220 10:39:37.831434 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/015b23ac-0880-43e0-b6f1-cfc724c572df-utilities\") pod \"community-operators-xw4c8\" (UID: \"015b23ac-0880-43e0-b6f1-cfc724c572df\") " pod="openshift-marketplace/community-operators-xw4c8" Feb 20 10:39:37 crc kubenswrapper[4962]: I0220 10:39:37.831445 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/015b23ac-0880-43e0-b6f1-cfc724c572df-catalog-content\") pod \"community-operators-xw4c8\" (UID: \"015b23ac-0880-43e0-b6f1-cfc724c572df\") " pod="openshift-marketplace/community-operators-xw4c8" Feb 20 10:39:37 crc kubenswrapper[4962]: I0220 10:39:37.862500 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkcz4\" (UniqueName: \"kubernetes.io/projected/015b23ac-0880-43e0-b6f1-cfc724c572df-kube-api-access-xkcz4\") pod \"community-operators-xw4c8\" (UID: \"015b23ac-0880-43e0-b6f1-cfc724c572df\") " pod="openshift-marketplace/community-operators-xw4c8" Feb 20 10:39:38 crc kubenswrapper[4962]: I0220 10:39:38.032336 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xw4c8" Feb 20 10:39:38 crc kubenswrapper[4962]: I0220 10:39:38.286454 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xw4c8"] Feb 20 10:39:39 crc kubenswrapper[4962]: I0220 10:39:39.191546 4962 generic.go:334] "Generic (PLEG): container finished" podID="015b23ac-0880-43e0-b6f1-cfc724c572df" containerID="cd16b0eb2e4345333e5746fb5db5b5338c0c3411fdd0f34a6170095d59c59a60" exitCode=0 Feb 20 10:39:39 crc kubenswrapper[4962]: I0220 10:39:39.191637 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xw4c8" event={"ID":"015b23ac-0880-43e0-b6f1-cfc724c572df","Type":"ContainerDied","Data":"cd16b0eb2e4345333e5746fb5db5b5338c0c3411fdd0f34a6170095d59c59a60"} Feb 20 10:39:39 crc kubenswrapper[4962]: I0220 10:39:39.191664 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xw4c8" event={"ID":"015b23ac-0880-43e0-b6f1-cfc724c572df","Type":"ContainerStarted","Data":"ee15b0b06731212c4b415a3b4b111773ba27381417897da4370a68c602214e9d"} Feb 20 10:39:40 crc kubenswrapper[4962]: I0220 10:39:40.199425 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xw4c8" event={"ID":"015b23ac-0880-43e0-b6f1-cfc724c572df","Type":"ContainerStarted","Data":"8bf555f04528aded810ccce366ec339de2275404763e6b36674d3e4056f486f7"} Feb 20 10:39:41 crc kubenswrapper[4962]: I0220 10:39:41.216552 4962 generic.go:334] "Generic (PLEG): container finished" podID="015b23ac-0880-43e0-b6f1-cfc724c572df" containerID="8bf555f04528aded810ccce366ec339de2275404763e6b36674d3e4056f486f7" exitCode=0 Feb 20 10:39:41 crc kubenswrapper[4962]: I0220 10:39:41.216646 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xw4c8" event={"ID":"015b23ac-0880-43e0-b6f1-cfc724c572df","Type":"ContainerDied","Data":"8bf555f04528aded810ccce366ec339de2275404763e6b36674d3e4056f486f7"} Feb 20 10:39:41 crc kubenswrapper[4962]: I0220 10:39:41.508170 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 10:39:41 crc kubenswrapper[4962]: I0220 10:39:41.508245 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 10:39:42 crc kubenswrapper[4962]: I0220 10:39:42.230001 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xw4c8" event={"ID":"015b23ac-0880-43e0-b6f1-cfc724c572df","Type":"ContainerStarted","Data":"589ce8025773722911191bac7084bb109757fdfd6a3ce35d8e8e06091a6a102e"} Feb 20 10:39:42 crc kubenswrapper[4962]: I0220 10:39:42.272179 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xw4c8" podStartSLOduration=2.713510833 podStartE2EDuration="5.272144447s" podCreationTimestamp="2026-02-20 10:39:37 +0000 UTC" firstStartedPulling="2026-02-20 10:39:39.199944389 +0000 UTC m=+2670.782416265" lastFinishedPulling="2026-02-20 10:39:41.758578023 +0000 UTC m=+2673.341049879" observedRunningTime="2026-02-20 10:39:42.256833872 +0000 UTC m=+2673.839305758" watchObservedRunningTime="2026-02-20 10:39:42.272144447 +0000 UTC m=+2673.854616323" Feb 20 10:39:48 crc kubenswrapper[4962]: I0220 10:39:48.032695 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xw4c8" Feb 20 10:39:48 crc kubenswrapper[4962]: I0220 10:39:48.033342 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xw4c8" Feb 20 10:39:48 crc kubenswrapper[4962]: I0220 10:39:48.111116 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xw4c8" Feb 20 10:39:48 crc kubenswrapper[4962]: I0220 10:39:48.370966 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xw4c8" Feb 20 10:39:48 crc kubenswrapper[4962]: I0220 10:39:48.437763 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xw4c8"] Feb 20 10:39:50 crc kubenswrapper[4962]: I0220 10:39:50.317685 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xw4c8" podUID="015b23ac-0880-43e0-b6f1-cfc724c572df" containerName="registry-server" containerID="cri-o://589ce8025773722911191bac7084bb109757fdfd6a3ce35d8e8e06091a6a102e" gracePeriod=2 Feb 20 10:39:50 crc kubenswrapper[4962]: I0220 10:39:50.857918 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xw4c8" Feb 20 10:39:51 crc kubenswrapper[4962]: I0220 10:39:51.044097 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkcz4\" (UniqueName: \"kubernetes.io/projected/015b23ac-0880-43e0-b6f1-cfc724c572df-kube-api-access-xkcz4\") pod \"015b23ac-0880-43e0-b6f1-cfc724c572df\" (UID: \"015b23ac-0880-43e0-b6f1-cfc724c572df\") " Feb 20 10:39:51 crc kubenswrapper[4962]: I0220 10:39:51.044729 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/015b23ac-0880-43e0-b6f1-cfc724c572df-catalog-content\") pod \"015b23ac-0880-43e0-b6f1-cfc724c572df\" (UID: \"015b23ac-0880-43e0-b6f1-cfc724c572df\") " Feb 20 10:39:51 crc kubenswrapper[4962]: I0220 10:39:51.045073 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/015b23ac-0880-43e0-b6f1-cfc724c572df-utilities\") pod \"015b23ac-0880-43e0-b6f1-cfc724c572df\" (UID: \"015b23ac-0880-43e0-b6f1-cfc724c572df\") " Feb 20 10:39:51 crc kubenswrapper[4962]: I0220 10:39:51.046233 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/015b23ac-0880-43e0-b6f1-cfc724c572df-utilities" (OuterVolumeSpecName: "utilities") pod "015b23ac-0880-43e0-b6f1-cfc724c572df" (UID: "015b23ac-0880-43e0-b6f1-cfc724c572df"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:39:51 crc kubenswrapper[4962]: I0220 10:39:51.053831 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/015b23ac-0880-43e0-b6f1-cfc724c572df-kube-api-access-xkcz4" (OuterVolumeSpecName: "kube-api-access-xkcz4") pod "015b23ac-0880-43e0-b6f1-cfc724c572df" (UID: "015b23ac-0880-43e0-b6f1-cfc724c572df"). InnerVolumeSpecName "kube-api-access-xkcz4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:39:51 crc kubenswrapper[4962]: I0220 10:39:51.125891 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/015b23ac-0880-43e0-b6f1-cfc724c572df-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "015b23ac-0880-43e0-b6f1-cfc724c572df" (UID: "015b23ac-0880-43e0-b6f1-cfc724c572df"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:39:51 crc kubenswrapper[4962]: I0220 10:39:51.146898 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkcz4\" (UniqueName: \"kubernetes.io/projected/015b23ac-0880-43e0-b6f1-cfc724c572df-kube-api-access-xkcz4\") on node \"crc\" DevicePath \"\"" Feb 20 10:39:51 crc kubenswrapper[4962]: I0220 10:39:51.146943 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/015b23ac-0880-43e0-b6f1-cfc724c572df-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 10:39:51 crc kubenswrapper[4962]: I0220 10:39:51.146962 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/015b23ac-0880-43e0-b6f1-cfc724c572df-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 10:39:51 crc kubenswrapper[4962]: I0220 10:39:51.330887 4962 generic.go:334] "Generic (PLEG): container finished" podID="015b23ac-0880-43e0-b6f1-cfc724c572df" containerID="589ce8025773722911191bac7084bb109757fdfd6a3ce35d8e8e06091a6a102e" exitCode=0 Feb 20 10:39:51 crc kubenswrapper[4962]: I0220 10:39:51.330946 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xw4c8" event={"ID":"015b23ac-0880-43e0-b6f1-cfc724c572df","Type":"ContainerDied","Data":"589ce8025773722911191bac7084bb109757fdfd6a3ce35d8e8e06091a6a102e"} Feb 20 10:39:51 crc kubenswrapper[4962]: I0220 10:39:51.330982 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xw4c8" event={"ID":"015b23ac-0880-43e0-b6f1-cfc724c572df","Type":"ContainerDied","Data":"ee15b0b06731212c4b415a3b4b111773ba27381417897da4370a68c602214e9d"} Feb 20 10:39:51 crc kubenswrapper[4962]: I0220 10:39:51.331009 4962 scope.go:117] "RemoveContainer" containerID="589ce8025773722911191bac7084bb109757fdfd6a3ce35d8e8e06091a6a102e" Feb 20 10:39:51 crc kubenswrapper[4962]: I0220 10:39:51.331191 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xw4c8" Feb 20 10:39:51 crc kubenswrapper[4962]: I0220 10:39:51.369633 4962 scope.go:117] "RemoveContainer" containerID="8bf555f04528aded810ccce366ec339de2275404763e6b36674d3e4056f486f7" Feb 20 10:39:51 crc kubenswrapper[4962]: I0220 10:39:51.519647 4962 scope.go:117] "RemoveContainer" containerID="cd16b0eb2e4345333e5746fb5db5b5338c0c3411fdd0f34a6170095d59c59a60" Feb 20 10:39:51 crc kubenswrapper[4962]: I0220 10:39:51.522796 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xw4c8"] Feb 20 10:39:51 crc kubenswrapper[4962]: I0220 10:39:51.533791 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xw4c8"] Feb 20 10:39:51 crc kubenswrapper[4962]: I0220 10:39:51.539934 4962 scope.go:117] "RemoveContainer" containerID="589ce8025773722911191bac7084bb109757fdfd6a3ce35d8e8e06091a6a102e" Feb 20 10:39:51 crc kubenswrapper[4962]: E0220 10:39:51.540384 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"589ce8025773722911191bac7084bb109757fdfd6a3ce35d8e8e06091a6a102e\": container with ID starting with 589ce8025773722911191bac7084bb109757fdfd6a3ce35d8e8e06091a6a102e not found: ID does not exist" containerID="589ce8025773722911191bac7084bb109757fdfd6a3ce35d8e8e06091a6a102e" Feb 20 10:39:51 crc kubenswrapper[4962]: I0220 10:39:51.540423 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"589ce8025773722911191bac7084bb109757fdfd6a3ce35d8e8e06091a6a102e"} err="failed to get container status \"589ce8025773722911191bac7084bb109757fdfd6a3ce35d8e8e06091a6a102e\": rpc error: code = NotFound desc = could not find container \"589ce8025773722911191bac7084bb109757fdfd6a3ce35d8e8e06091a6a102e\": container with ID starting with 589ce8025773722911191bac7084bb109757fdfd6a3ce35d8e8e06091a6a102e not found: ID does not exist" Feb 20 10:39:51 crc kubenswrapper[4962]: I0220 10:39:51.540455 4962 scope.go:117] "RemoveContainer" containerID="8bf555f04528aded810ccce366ec339de2275404763e6b36674d3e4056f486f7" Feb 20 10:39:51 crc kubenswrapper[4962]: E0220 10:39:51.540872 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8bf555f04528aded810ccce366ec339de2275404763e6b36674d3e4056f486f7\": container with ID starting with 8bf555f04528aded810ccce366ec339de2275404763e6b36674d3e4056f486f7 not found: ID does not exist" containerID="8bf555f04528aded810ccce366ec339de2275404763e6b36674d3e4056f486f7" Feb 20 10:39:51 crc kubenswrapper[4962]: I0220 10:39:51.540899 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8bf555f04528aded810ccce366ec339de2275404763e6b36674d3e4056f486f7"} err="failed to get container status \"8bf555f04528aded810ccce366ec339de2275404763e6b36674d3e4056f486f7\": rpc error: code = NotFound desc = could not find container \"8bf555f04528aded810ccce366ec339de2275404763e6b36674d3e4056f486f7\": container with ID starting with 8bf555f04528aded810ccce366ec339de2275404763e6b36674d3e4056f486f7 not found: ID does not exist" Feb 20 10:39:51 crc kubenswrapper[4962]: I0220 10:39:51.540917 4962 scope.go:117] "RemoveContainer" containerID="cd16b0eb2e4345333e5746fb5db5b5338c0c3411fdd0f34a6170095d59c59a60" Feb 20 10:39:51 crc kubenswrapper[4962]: E0220 10:39:51.542202 4962 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"cd16b0eb2e4345333e5746fb5db5b5338c0c3411fdd0f34a6170095d59c59a60\": container with ID starting with cd16b0eb2e4345333e5746fb5db5b5338c0c3411fdd0f34a6170095d59c59a60 not found: ID does not exist" containerID="cd16b0eb2e4345333e5746fb5db5b5338c0c3411fdd0f34a6170095d59c59a60" Feb 20 10:39:51 crc kubenswrapper[4962]: I0220 10:39:51.542250 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd16b0eb2e4345333e5746fb5db5b5338c0c3411fdd0f34a6170095d59c59a60"} err="failed to get container status \"cd16b0eb2e4345333e5746fb5db5b5338c0c3411fdd0f34a6170095d59c59a60\": rpc error: code = NotFound desc = could not find container \"cd16b0eb2e4345333e5746fb5db5b5338c0c3411fdd0f34a6170095d59c59a60\": container with ID starting with cd16b0eb2e4345333e5746fb5db5b5338c0c3411fdd0f34a6170095d59c59a60 not found: ID does not exist" Feb 20 10:39:53 crc kubenswrapper[4962]: I0220 10:39:53.149614 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="015b23ac-0880-43e0-b6f1-cfc724c572df" path="/var/lib/kubelet/pods/015b23ac-0880-43e0-b6f1-cfc724c572df/volumes" Feb 20 10:40:11 crc kubenswrapper[4962]: I0220 10:40:11.508312 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 10:40:11 crc kubenswrapper[4962]: I0220 10:40:11.510734 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 10:40:11 crc kubenswrapper[4962]: I0220 10:40:11.510927 4962 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" Feb 20 10:40:11 crc kubenswrapper[4962]: I0220 10:40:11.512011 4962 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"42a27fff0d251c257e515176e7d37cbd4d1c37cf56ff04f11c04672d654f700f"} pod="openshift-machine-config-operator/machine-config-daemon-m9d46" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 10:40:11 crc kubenswrapper[4962]: I0220 10:40:11.512307 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" containerID="cri-o://42a27fff0d251c257e515176e7d37cbd4d1c37cf56ff04f11c04672d654f700f" gracePeriod=600 Feb 20 10:40:12 crc kubenswrapper[4962]: I0220 10:40:12.540142 4962 generic.go:334] "Generic (PLEG): container finished" podID="751d5e0b-919c-4777-8475-ed7214f7647f" containerID="42a27fff0d251c257e515176e7d37cbd4d1c37cf56ff04f11c04672d654f700f" exitCode=0 Feb 20 10:40:12 crc kubenswrapper[4962]: I0220 10:40:12.540229 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" 
event={"ID":"751d5e0b-919c-4777-8475-ed7214f7647f","Type":"ContainerDied","Data":"42a27fff0d251c257e515176e7d37cbd4d1c37cf56ff04f11c04672d654f700f"} Feb 20 10:40:12 crc kubenswrapper[4962]: I0220 10:40:12.540564 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" event={"ID":"751d5e0b-919c-4777-8475-ed7214f7647f","Type":"ContainerStarted","Data":"43438c422a1202ff4cdb2cf954680ba8059efb5a5642f4ef4d5d6cf5618f44c6"} Feb 20 10:40:12 crc kubenswrapper[4962]: I0220 10:40:12.540626 4962 scope.go:117] "RemoveContainer" containerID="bd64da6639184a23e03b94b79a7e7b45218fcf22df547dc77c2a997cc3799a1e" Feb 20 10:42:11 crc kubenswrapper[4962]: I0220 10:42:11.508573 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 10:42:11 crc kubenswrapper[4962]: I0220 10:42:11.509263 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 10:42:41 crc kubenswrapper[4962]: I0220 10:42:41.508161 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 10:42:41 crc kubenswrapper[4962]: I0220 10:42:41.508861 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 10:43:09 crc kubenswrapper[4962]: I0220 10:43:09.219577 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4hqgh"] Feb 20 10:43:09 crc kubenswrapper[4962]: E0220 10:43:09.220574 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="015b23ac-0880-43e0-b6f1-cfc724c572df" containerName="registry-server" Feb 20 10:43:09 crc kubenswrapper[4962]: I0220 10:43:09.220963 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="015b23ac-0880-43e0-b6f1-cfc724c572df" containerName="registry-server" Feb 20 10:43:09 crc kubenswrapper[4962]: E0220 10:43:09.221005 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="015b23ac-0880-43e0-b6f1-cfc724c572df" containerName="extract-content" Feb 20 10:43:09 crc kubenswrapper[4962]: I0220 10:43:09.221017 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="015b23ac-0880-43e0-b6f1-cfc724c572df" containerName="extract-content" Feb 20 10:43:09 crc kubenswrapper[4962]: E0220 10:43:09.221030 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="015b23ac-0880-43e0-b6f1-cfc724c572df" containerName="extract-utilities" Feb 20 10:43:09 crc kubenswrapper[4962]: I0220 10:43:09.221038 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="015b23ac-0880-43e0-b6f1-cfc724c572df" containerName="extract-utilities" Feb 20 10:43:09 crc 
kubenswrapper[4962]: I0220 10:43:09.221223 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="015b23ac-0880-43e0-b6f1-cfc724c572df" containerName="registry-server" Feb 20 10:43:09 crc kubenswrapper[4962]: I0220 10:43:09.222831 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4hqgh" Feb 20 10:43:09 crc kubenswrapper[4962]: I0220 10:43:09.235161 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4hqgh"] Feb 20 10:43:09 crc kubenswrapper[4962]: I0220 10:43:09.301687 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ckqx\" (UniqueName: \"kubernetes.io/projected/2df2af41-d54c-427f-91e9-b132958cb597-kube-api-access-5ckqx\") pod \"certified-operators-4hqgh\" (UID: \"2df2af41-d54c-427f-91e9-b132958cb597\") " pod="openshift-marketplace/certified-operators-4hqgh" Feb 20 10:43:09 crc kubenswrapper[4962]: I0220 10:43:09.301741 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2df2af41-d54c-427f-91e9-b132958cb597-catalog-content\") pod \"certified-operators-4hqgh\" (UID: \"2df2af41-d54c-427f-91e9-b132958cb597\") " pod="openshift-marketplace/certified-operators-4hqgh" Feb 20 10:43:09 crc kubenswrapper[4962]: I0220 10:43:09.301782 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2df2af41-d54c-427f-91e9-b132958cb597-utilities\") pod \"certified-operators-4hqgh\" (UID: \"2df2af41-d54c-427f-91e9-b132958cb597\") " pod="openshift-marketplace/certified-operators-4hqgh" Feb 20 10:43:09 crc kubenswrapper[4962]: I0220 10:43:09.403433 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ckqx\" (UniqueName: \"kubernetes.io/projected/2df2af41-d54c-427f-91e9-b132958cb597-kube-api-access-5ckqx\") pod \"certified-operators-4hqgh\" (UID: \"2df2af41-d54c-427f-91e9-b132958cb597\") " pod="openshift-marketplace/certified-operators-4hqgh" Feb 20 10:43:09 crc kubenswrapper[4962]: I0220 10:43:09.403543 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2df2af41-d54c-427f-91e9-b132958cb597-catalog-content\") pod \"certified-operators-4hqgh\" (UID: \"2df2af41-d54c-427f-91e9-b132958cb597\") " pod="openshift-marketplace/certified-operators-4hqgh" Feb 20 10:43:09 crc kubenswrapper[4962]: I0220 10:43:09.403622 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2df2af41-d54c-427f-91e9-b132958cb597-utilities\") pod \"certified-operators-4hqgh\" (UID: \"2df2af41-d54c-427f-91e9-b132958cb597\") " pod="openshift-marketplace/certified-operators-4hqgh" Feb 20 10:43:09 crc kubenswrapper[4962]: I0220 10:43:09.404169 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2df2af41-d54c-427f-91e9-b132958cb597-catalog-content\") pod \"certified-operators-4hqgh\" (UID: \"2df2af41-d54c-427f-91e9-b132958cb597\") " pod="openshift-marketplace/certified-operators-4hqgh" Feb 20 10:43:09 crc kubenswrapper[4962]: I0220 10:43:09.404347 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/2df2af41-d54c-427f-91e9-b132958cb597-utilities\") pod \"certified-operators-4hqgh\" (UID: \"2df2af41-d54c-427f-91e9-b132958cb597\") " pod="openshift-marketplace/certified-operators-4hqgh" Feb 20 10:43:09 crc kubenswrapper[4962]: I0220 10:43:09.427580 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ckqx\" (UniqueName: \"kubernetes.io/projected/2df2af41-d54c-427f-91e9-b132958cb597-kube-api-access-5ckqx\") pod \"certified-operators-4hqgh\" (UID: \"2df2af41-d54c-427f-91e9-b132958cb597\") " pod="openshift-marketplace/certified-operators-4hqgh" Feb 20 10:43:09 crc kubenswrapper[4962]: I0220 10:43:09.543978 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4hqgh" Feb 20 10:43:09 crc kubenswrapper[4962]: I0220 10:43:09.832672 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4hqgh"] Feb 20 10:43:10 crc kubenswrapper[4962]: I0220 10:43:10.137892 4962 generic.go:334] "Generic (PLEG): container finished" podID="2df2af41-d54c-427f-91e9-b132958cb597" containerID="2bee1415b07c7340feca04db82732263d5777b50406d923bc0748296ac496cf3" exitCode=0 Feb 20 10:43:10 crc kubenswrapper[4962]: I0220 10:43:10.137929 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4hqgh" event={"ID":"2df2af41-d54c-427f-91e9-b132958cb597","Type":"ContainerDied","Data":"2bee1415b07c7340feca04db82732263d5777b50406d923bc0748296ac496cf3"} Feb 20 10:43:10 crc kubenswrapper[4962]: I0220 10:43:10.137953 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4hqgh" event={"ID":"2df2af41-d54c-427f-91e9-b132958cb597","Type":"ContainerStarted","Data":"f279a063082fe55c064c2ef5edc718798c07fda7d9d3ba9b3569442ae2603b1d"} Feb 20 10:43:11 crc kubenswrapper[4962]: I0220 10:43:11.156373 4962 generic.go:334] "Generic (PLEG): container finished" podID="2df2af41-d54c-427f-91e9-b132958cb597" containerID="8a800b7264aa9cc1afe2eba1dc5df9bcafa1684d0e54ec03d724e8037770100b" exitCode=0 Feb 20 10:43:11 crc kubenswrapper[4962]: I0220 10:43:11.158803 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4hqgh" event={"ID":"2df2af41-d54c-427f-91e9-b132958cb597","Type":"ContainerDied","Data":"8a800b7264aa9cc1afe2eba1dc5df9bcafa1684d0e54ec03d724e8037770100b"} Feb 20 10:43:11 crc kubenswrapper[4962]: I0220 10:43:11.508667 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 10:43:11 crc kubenswrapper[4962]: I0220 10:43:11.509011 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 10:43:11 crc kubenswrapper[4962]: I0220 10:43:11.509086 4962 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" Feb 20 10:43:11 crc kubenswrapper[4962]: I0220 10:43:11.510096 4962 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"43438c422a1202ff4cdb2cf954680ba8059efb5a5642f4ef4d5d6cf5618f44c6"} pod="openshift-machine-config-operator/machine-config-daemon-m9d46" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 10:43:11 crc kubenswrapper[4962]: I0220 10:43:11.510218 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" containerID="cri-o://43438c422a1202ff4cdb2cf954680ba8059efb5a5642f4ef4d5d6cf5618f44c6" gracePeriod=600 Feb 20 10:43:11 crc kubenswrapper[4962]: E0220 10:43:11.654681 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:43:12 crc kubenswrapper[4962]: I0220 10:43:12.167379 4962 generic.go:334] "Generic (PLEG): container finished" podID="751d5e0b-919c-4777-8475-ed7214f7647f" containerID="43438c422a1202ff4cdb2cf954680ba8059efb5a5642f4ef4d5d6cf5618f44c6" exitCode=0 Feb 20 10:43:12 crc kubenswrapper[4962]: I0220 10:43:12.167477 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" event={"ID":"751d5e0b-919c-4777-8475-ed7214f7647f","Type":"ContainerDied","Data":"43438c422a1202ff4cdb2cf954680ba8059efb5a5642f4ef4d5d6cf5618f44c6"} Feb 20 10:43:12 crc kubenswrapper[4962]: I0220 10:43:12.167519 4962 scope.go:117] "RemoveContainer" containerID="42a27fff0d251c257e515176e7d37cbd4d1c37cf56ff04f11c04672d654f700f" Feb 20 10:43:12 crc kubenswrapper[4962]: I0220 10:43:12.168050 4962 scope.go:117] "RemoveContainer" containerID="43438c422a1202ff4cdb2cf954680ba8059efb5a5642f4ef4d5d6cf5618f44c6" Feb 20 10:43:12 crc kubenswrapper[4962]: E0220 10:43:12.168254 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:43:12 crc kubenswrapper[4962]: I0220 10:43:12.170710 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4hqgh" event={"ID":"2df2af41-d54c-427f-91e9-b132958cb597","Type":"ContainerStarted","Data":"6309cb9ac4695955a5d865f5d3edcee2a80f38074e5ddace45d90917bf08419f"} Feb 20 10:43:12 crc kubenswrapper[4962]: I0220 10:43:12.229528 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jb2lz"] Feb 20 10:43:12 crc kubenswrapper[4962]: I0220 10:43:12.231540 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jb2lz" Feb 20 10:43:12 crc kubenswrapper[4962]: I0220 10:43:12.240901 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jb2lz"] Feb 20 10:43:12 crc kubenswrapper[4962]: I0220 10:43:12.242469 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4hqgh" podStartSLOduration=1.834561292 podStartE2EDuration="3.24244878s" podCreationTimestamp="2026-02-20 10:43:09 +0000 UTC" firstStartedPulling="2026-02-20 10:43:10.139339773 +0000 UTC m=+2881.721811619" lastFinishedPulling="2026-02-20 10:43:11.547227231 +0000 UTC m=+2883.129699107" observedRunningTime="2026-02-20 10:43:12.23177347 +0000 UTC m=+2883.814245316" watchObservedRunningTime="2026-02-20 10:43:12.24244878 +0000 UTC m=+2883.824920636" Feb 20 10:43:12 crc kubenswrapper[4962]: I0220 10:43:12.246617 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff762b96-9d98-406b-81b4-b81b19473e0e-catalog-content\") pod \"redhat-marketplace-jb2lz\" (UID: \"ff762b96-9d98-406b-81b4-b81b19473e0e\") " pod="openshift-marketplace/redhat-marketplace-jb2lz" Feb 20 10:43:12 crc kubenswrapper[4962]: I0220 10:43:12.246665 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qghs\" (UniqueName: \"kubernetes.io/projected/ff762b96-9d98-406b-81b4-b81b19473e0e-kube-api-access-2qghs\") pod \"redhat-marketplace-jb2lz\" (UID: \"ff762b96-9d98-406b-81b4-b81b19473e0e\") " pod="openshift-marketplace/redhat-marketplace-jb2lz" Feb 20 10:43:12 crc kubenswrapper[4962]: I0220 10:43:12.246731 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff762b96-9d98-406b-81b4-b81b19473e0e-utilities\") pod \"redhat-marketplace-jb2lz\" (UID: \"ff762b96-9d98-406b-81b4-b81b19473e0e\") " pod="openshift-marketplace/redhat-marketplace-jb2lz" Feb 20 10:43:12 crc kubenswrapper[4962]: I0220 10:43:12.348272 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff762b96-9d98-406b-81b4-b81b19473e0e-utilities\") pod \"redhat-marketplace-jb2lz\" (UID: \"ff762b96-9d98-406b-81b4-b81b19473e0e\") " pod="openshift-marketplace/redhat-marketplace-jb2lz" Feb 20 10:43:12 crc kubenswrapper[4962]: I0220 10:43:12.348378 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff762b96-9d98-406b-81b4-b81b19473e0e-catalog-content\") pod \"redhat-marketplace-jb2lz\" (UID: \"ff762b96-9d98-406b-81b4-b81b19473e0e\") " pod="openshift-marketplace/redhat-marketplace-jb2lz" Feb 20 10:43:12 crc kubenswrapper[4962]: I0220 10:43:12.348403 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qghs\" (UniqueName: \"kubernetes.io/projected/ff762b96-9d98-406b-81b4-b81b19473e0e-kube-api-access-2qghs\") pod \"redhat-marketplace-jb2lz\" (UID: \"ff762b96-9d98-406b-81b4-b81b19473e0e\") " pod="openshift-marketplace/redhat-marketplace-jb2lz" Feb 20 10:43:12 crc kubenswrapper[4962]: I0220 10:43:12.348788 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff762b96-9d98-406b-81b4-b81b19473e0e-utilities\") pod 
\"redhat-marketplace-jb2lz\" (UID: \"ff762b96-9d98-406b-81b4-b81b19473e0e\") " pod="openshift-marketplace/redhat-marketplace-jb2lz" Feb 20 10:43:12 crc kubenswrapper[4962]: I0220 10:43:12.348919 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff762b96-9d98-406b-81b4-b81b19473e0e-catalog-content\") pod \"redhat-marketplace-jb2lz\" (UID: \"ff762b96-9d98-406b-81b4-b81b19473e0e\") " pod="openshift-marketplace/redhat-marketplace-jb2lz" Feb 20 10:43:12 crc kubenswrapper[4962]: I0220 10:43:12.365322 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qghs\" (UniqueName: \"kubernetes.io/projected/ff762b96-9d98-406b-81b4-b81b19473e0e-kube-api-access-2qghs\") pod \"redhat-marketplace-jb2lz\" (UID: \"ff762b96-9d98-406b-81b4-b81b19473e0e\") " pod="openshift-marketplace/redhat-marketplace-jb2lz" Feb 20 10:43:12 crc kubenswrapper[4962]: I0220 10:43:12.553732 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jb2lz" Feb 20 10:43:13 crc kubenswrapper[4962]: I0220 10:43:13.027279 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jb2lz"] Feb 20 10:43:13 crc kubenswrapper[4962]: I0220 10:43:13.181551 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jb2lz" event={"ID":"ff762b96-9d98-406b-81b4-b81b19473e0e","Type":"ContainerStarted","Data":"a6a7be7df0f34627d00563e97fbdf2a8011dd3ee63192c4a0e2864c2140079df"} Feb 20 10:43:14 crc kubenswrapper[4962]: I0220 10:43:14.191391 4962 generic.go:334] "Generic (PLEG): container finished" podID="ff762b96-9d98-406b-81b4-b81b19473e0e" containerID="46f1bad0ce80050e390a8e3901f88dccee8ae5368654726522435dc062d95bc8" exitCode=0 Feb 20 10:43:14 crc kubenswrapper[4962]: I0220 10:43:14.191453 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jb2lz" event={"ID":"ff762b96-9d98-406b-81b4-b81b19473e0e","Type":"ContainerDied","Data":"46f1bad0ce80050e390a8e3901f88dccee8ae5368654726522435dc062d95bc8"} Feb 20 10:43:15 crc kubenswrapper[4962]: I0220 10:43:15.205008 4962 generic.go:334] "Generic (PLEG): container finished" podID="ff762b96-9d98-406b-81b4-b81b19473e0e" containerID="43b9890e9e6e71cebe87f4226c2aab3ba74cadc1bd011d57f7010a2caaeae0ad" exitCode=0 Feb 20 10:43:15 crc kubenswrapper[4962]: I0220 10:43:15.205174 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jb2lz" event={"ID":"ff762b96-9d98-406b-81b4-b81b19473e0e","Type":"ContainerDied","Data":"43b9890e9e6e71cebe87f4226c2aab3ba74cadc1bd011d57f7010a2caaeae0ad"} Feb 20 10:43:16 crc kubenswrapper[4962]: I0220 10:43:16.230121 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jb2lz" event={"ID":"ff762b96-9d98-406b-81b4-b81b19473e0e","Type":"ContainerStarted","Data":"84b881cd2e78a6ef6a145077a3ae6490958cb1323c91b5c6e8ae66a612485e79"} Feb 20 10:43:16 crc kubenswrapper[4962]: I0220 10:43:16.258014 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jb2lz" podStartSLOduration=2.8271723619999998 podStartE2EDuration="4.25799914s" podCreationTimestamp="2026-02-20 10:43:12 +0000 UTC" firstStartedPulling="2026-02-20 10:43:14.193992834 +0000 UTC m=+2885.776464710" lastFinishedPulling="2026-02-20 10:43:15.624819602 +0000 UTC m=+2887.207291488" 
observedRunningTime="2026-02-20 10:43:16.25639464 +0000 UTC m=+2887.838866496" watchObservedRunningTime="2026-02-20 10:43:16.25799914 +0000 UTC m=+2887.840470986" Feb 20 10:43:19 crc kubenswrapper[4962]: I0220 10:43:19.544349 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4hqgh" Feb 20 10:43:19 crc kubenswrapper[4962]: I0220 10:43:19.544818 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4hqgh" Feb 20 10:43:19 crc kubenswrapper[4962]: I0220 10:43:19.629023 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4hqgh" Feb 20 10:43:20 crc kubenswrapper[4962]: I0220 10:43:20.324086 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4hqgh" Feb 20 10:43:22 crc kubenswrapper[4962]: I0220 10:43:22.554404 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jb2lz" Feb 20 10:43:22 crc kubenswrapper[4962]: I0220 10:43:22.554840 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jb2lz" Feb 20 10:43:22 crc kubenswrapper[4962]: I0220 10:43:22.627898 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jb2lz" Feb 20 10:43:23 crc kubenswrapper[4962]: I0220 10:43:23.140278 4962 scope.go:117] "RemoveContainer" containerID="43438c422a1202ff4cdb2cf954680ba8059efb5a5642f4ef4d5d6cf5618f44c6" Feb 20 10:43:23 crc kubenswrapper[4962]: E0220 10:43:23.140680 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:43:23 crc kubenswrapper[4962]: I0220 10:43:23.364868 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jb2lz" Feb 20 10:43:24 crc kubenswrapper[4962]: I0220 10:43:24.396374 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4hqgh"] Feb 20 10:43:24 crc kubenswrapper[4962]: I0220 10:43:24.396622 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-4hqgh" podUID="2df2af41-d54c-427f-91e9-b132958cb597" containerName="registry-server" containerID="cri-o://6309cb9ac4695955a5d865f5d3edcee2a80f38074e5ddace45d90917bf08419f" gracePeriod=2 Feb 20 10:43:24 crc kubenswrapper[4962]: I0220 10:43:24.876570 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4hqgh" Feb 20 10:43:24 crc kubenswrapper[4962]: I0220 10:43:24.988532 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2df2af41-d54c-427f-91e9-b132958cb597-utilities\") pod \"2df2af41-d54c-427f-91e9-b132958cb597\" (UID: \"2df2af41-d54c-427f-91e9-b132958cb597\") " Feb 20 10:43:24 crc kubenswrapper[4962]: I0220 10:43:24.988670 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2df2af41-d54c-427f-91e9-b132958cb597-catalog-content\") pod \"2df2af41-d54c-427f-91e9-b132958cb597\" (UID: \"2df2af41-d54c-427f-91e9-b132958cb597\") " Feb 20 10:43:24 crc kubenswrapper[4962]: I0220 10:43:24.988745 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5ckqx\" (UniqueName: \"kubernetes.io/projected/2df2af41-d54c-427f-91e9-b132958cb597-kube-api-access-5ckqx\") pod \"2df2af41-d54c-427f-91e9-b132958cb597\" (UID: \"2df2af41-d54c-427f-91e9-b132958cb597\") " Feb 20 10:43:24 crc kubenswrapper[4962]: I0220 10:43:24.990843 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2df2af41-d54c-427f-91e9-b132958cb597-utilities" (OuterVolumeSpecName: "utilities") pod "2df2af41-d54c-427f-91e9-b132958cb597" (UID: "2df2af41-d54c-427f-91e9-b132958cb597"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:43:24 crc kubenswrapper[4962]: I0220 10:43:24.999488 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2df2af41-d54c-427f-91e9-b132958cb597-kube-api-access-5ckqx" (OuterVolumeSpecName: "kube-api-access-5ckqx") pod "2df2af41-d54c-427f-91e9-b132958cb597" (UID: "2df2af41-d54c-427f-91e9-b132958cb597"). InnerVolumeSpecName "kube-api-access-5ckqx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:43:25 crc kubenswrapper[4962]: I0220 10:43:25.087338 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2df2af41-d54c-427f-91e9-b132958cb597-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2df2af41-d54c-427f-91e9-b132958cb597" (UID: "2df2af41-d54c-427f-91e9-b132958cb597"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:43:25 crc kubenswrapper[4962]: I0220 10:43:25.094310 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2df2af41-d54c-427f-91e9-b132958cb597-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 10:43:25 crc kubenswrapper[4962]: I0220 10:43:25.094375 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5ckqx\" (UniqueName: \"kubernetes.io/projected/2df2af41-d54c-427f-91e9-b132958cb597-kube-api-access-5ckqx\") on node \"crc\" DevicePath \"\"" Feb 20 10:43:25 crc kubenswrapper[4962]: I0220 10:43:25.094410 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2df2af41-d54c-427f-91e9-b132958cb597-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 10:43:25 crc kubenswrapper[4962]: I0220 10:43:25.318884 4962 generic.go:334] "Generic (PLEG): container finished" podID="2df2af41-d54c-427f-91e9-b132958cb597" containerID="6309cb9ac4695955a5d865f5d3edcee2a80f38074e5ddace45d90917bf08419f" exitCode=0 Feb 20 10:43:25 crc kubenswrapper[4962]: I0220 10:43:25.319046 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4hqgh" Feb 20 10:43:25 crc kubenswrapper[4962]: I0220 10:43:25.319655 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4hqgh" event={"ID":"2df2af41-d54c-427f-91e9-b132958cb597","Type":"ContainerDied","Data":"6309cb9ac4695955a5d865f5d3edcee2a80f38074e5ddace45d90917bf08419f"} Feb 20 10:43:25 crc kubenswrapper[4962]: I0220 10:43:25.319923 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4hqgh" event={"ID":"2df2af41-d54c-427f-91e9-b132958cb597","Type":"ContainerDied","Data":"f279a063082fe55c064c2ef5edc718798c07fda7d9d3ba9b3569442ae2603b1d"} Feb 20 10:43:25 crc kubenswrapper[4962]: I0220 10:43:25.320009 4962 scope.go:117] "RemoveContainer" containerID="6309cb9ac4695955a5d865f5d3edcee2a80f38074e5ddace45d90917bf08419f" Feb 20 10:43:25 crc kubenswrapper[4962]: I0220 10:43:25.349834 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4hqgh"] Feb 20 10:43:25 crc kubenswrapper[4962]: I0220 10:43:25.355817 4962 scope.go:117] "RemoveContainer" containerID="8a800b7264aa9cc1afe2eba1dc5df9bcafa1684d0e54ec03d724e8037770100b" Feb 20 10:43:25 crc kubenswrapper[4962]: I0220 10:43:25.357456 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-4hqgh"] Feb 20 10:43:25 crc kubenswrapper[4962]: I0220 10:43:25.384792 4962 scope.go:117] "RemoveContainer" containerID="2bee1415b07c7340feca04db82732263d5777b50406d923bc0748296ac496cf3" Feb 20 10:43:25 crc kubenswrapper[4962]: I0220 10:43:25.435186 4962 scope.go:117] "RemoveContainer" containerID="6309cb9ac4695955a5d865f5d3edcee2a80f38074e5ddace45d90917bf08419f" Feb 20 10:43:25 crc kubenswrapper[4962]: E0220 10:43:25.435823 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6309cb9ac4695955a5d865f5d3edcee2a80f38074e5ddace45d90917bf08419f\": container with ID starting with 6309cb9ac4695955a5d865f5d3edcee2a80f38074e5ddace45d90917bf08419f not found: ID does not exist" containerID="6309cb9ac4695955a5d865f5d3edcee2a80f38074e5ddace45d90917bf08419f" Feb 20 10:43:25 crc kubenswrapper[4962]: I0220 10:43:25.435887 
4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6309cb9ac4695955a5d865f5d3edcee2a80f38074e5ddace45d90917bf08419f"} err="failed to get container status \"6309cb9ac4695955a5d865f5d3edcee2a80f38074e5ddace45d90917bf08419f\": rpc error: code = NotFound desc = could not find container \"6309cb9ac4695955a5d865f5d3edcee2a80f38074e5ddace45d90917bf08419f\": container with ID starting with 6309cb9ac4695955a5d865f5d3edcee2a80f38074e5ddace45d90917bf08419f not found: ID does not exist" Feb 20 10:43:25 crc kubenswrapper[4962]: I0220 10:43:25.435931 4962 scope.go:117] "RemoveContainer" containerID="8a800b7264aa9cc1afe2eba1dc5df9bcafa1684d0e54ec03d724e8037770100b" Feb 20 10:43:25 crc kubenswrapper[4962]: E0220 10:43:25.436535 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a800b7264aa9cc1afe2eba1dc5df9bcafa1684d0e54ec03d724e8037770100b\": container with ID starting with 8a800b7264aa9cc1afe2eba1dc5df9bcafa1684d0e54ec03d724e8037770100b not found: ID does not exist" containerID="8a800b7264aa9cc1afe2eba1dc5df9bcafa1684d0e54ec03d724e8037770100b" Feb 20 10:43:25 crc kubenswrapper[4962]: I0220 10:43:25.436735 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a800b7264aa9cc1afe2eba1dc5df9bcafa1684d0e54ec03d724e8037770100b"} err="failed to get container status \"8a800b7264aa9cc1afe2eba1dc5df9bcafa1684d0e54ec03d724e8037770100b\": rpc error: code = NotFound desc = could not find container \"8a800b7264aa9cc1afe2eba1dc5df9bcafa1684d0e54ec03d724e8037770100b\": container with ID starting with 8a800b7264aa9cc1afe2eba1dc5df9bcafa1684d0e54ec03d724e8037770100b not found: ID does not exist" Feb 20 10:43:25 crc kubenswrapper[4962]: I0220 10:43:25.436781 4962 scope.go:117] "RemoveContainer" containerID="2bee1415b07c7340feca04db82732263d5777b50406d923bc0748296ac496cf3" Feb 20 10:43:25 crc kubenswrapper[4962]: E0220 10:43:25.437359 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2bee1415b07c7340feca04db82732263d5777b50406d923bc0748296ac496cf3\": container with ID starting with 2bee1415b07c7340feca04db82732263d5777b50406d923bc0748296ac496cf3 not found: ID does not exist" containerID="2bee1415b07c7340feca04db82732263d5777b50406d923bc0748296ac496cf3" Feb 20 10:43:25 crc kubenswrapper[4962]: I0220 10:43:25.437405 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bee1415b07c7340feca04db82732263d5777b50406d923bc0748296ac496cf3"} err="failed to get container status \"2bee1415b07c7340feca04db82732263d5777b50406d923bc0748296ac496cf3\": rpc error: code = NotFound desc = could not find container \"2bee1415b07c7340feca04db82732263d5777b50406d923bc0748296ac496cf3\": container with ID starting with 2bee1415b07c7340feca04db82732263d5777b50406d923bc0748296ac496cf3 not found: ID does not exist" Feb 20 10:43:27 crc kubenswrapper[4962]: I0220 10:43:27.154774 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2df2af41-d54c-427f-91e9-b132958cb597" path="/var/lib/kubelet/pods/2df2af41-d54c-427f-91e9-b132958cb597/volumes" Feb 20 10:43:27 crc kubenswrapper[4962]: I0220 10:43:27.412355 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jb2lz"] Feb 20 10:43:27 crc kubenswrapper[4962]: I0220 10:43:27.412715 4962 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openshift-marketplace/redhat-marketplace-jb2lz" podUID="ff762b96-9d98-406b-81b4-b81b19473e0e" containerName="registry-server" containerID="cri-o://84b881cd2e78a6ef6a145077a3ae6490958cb1323c91b5c6e8ae66a612485e79" gracePeriod=2 Feb 20 10:43:27 crc kubenswrapper[4962]: I0220 10:43:27.899238 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jb2lz" Feb 20 10:43:27 crc kubenswrapper[4962]: I0220 10:43:27.943993 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff762b96-9d98-406b-81b4-b81b19473e0e-utilities\") pod \"ff762b96-9d98-406b-81b4-b81b19473e0e\" (UID: \"ff762b96-9d98-406b-81b4-b81b19473e0e\") " Feb 20 10:43:27 crc kubenswrapper[4962]: I0220 10:43:27.944118 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qghs\" (UniqueName: \"kubernetes.io/projected/ff762b96-9d98-406b-81b4-b81b19473e0e-kube-api-access-2qghs\") pod \"ff762b96-9d98-406b-81b4-b81b19473e0e\" (UID: \"ff762b96-9d98-406b-81b4-b81b19473e0e\") " Feb 20 10:43:27 crc kubenswrapper[4962]: I0220 10:43:27.944224 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff762b96-9d98-406b-81b4-b81b19473e0e-catalog-content\") pod \"ff762b96-9d98-406b-81b4-b81b19473e0e\" (UID: \"ff762b96-9d98-406b-81b4-b81b19473e0e\") " Feb 20 10:43:27 crc kubenswrapper[4962]: I0220 10:43:27.945328 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff762b96-9d98-406b-81b4-b81b19473e0e-utilities" (OuterVolumeSpecName: "utilities") pod "ff762b96-9d98-406b-81b4-b81b19473e0e" (UID: "ff762b96-9d98-406b-81b4-b81b19473e0e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:43:27 crc kubenswrapper[4962]: I0220 10:43:27.952039 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff762b96-9d98-406b-81b4-b81b19473e0e-kube-api-access-2qghs" (OuterVolumeSpecName: "kube-api-access-2qghs") pod "ff762b96-9d98-406b-81b4-b81b19473e0e" (UID: "ff762b96-9d98-406b-81b4-b81b19473e0e"). InnerVolumeSpecName "kube-api-access-2qghs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:43:27 crc kubenswrapper[4962]: I0220 10:43:27.983148 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff762b96-9d98-406b-81b4-b81b19473e0e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ff762b96-9d98-406b-81b4-b81b19473e0e" (UID: "ff762b96-9d98-406b-81b4-b81b19473e0e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:43:28 crc kubenswrapper[4962]: I0220 10:43:28.046091 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff762b96-9d98-406b-81b4-b81b19473e0e-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 10:43:28 crc kubenswrapper[4962]: I0220 10:43:28.046134 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2qghs\" (UniqueName: \"kubernetes.io/projected/ff762b96-9d98-406b-81b4-b81b19473e0e-kube-api-access-2qghs\") on node \"crc\" DevicePath \"\"" Feb 20 10:43:28 crc kubenswrapper[4962]: I0220 10:43:28.046152 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff762b96-9d98-406b-81b4-b81b19473e0e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 10:43:28 crc kubenswrapper[4962]: I0220 10:43:28.352434 4962 generic.go:334] "Generic (PLEG): container finished" podID="ff762b96-9d98-406b-81b4-b81b19473e0e" containerID="84b881cd2e78a6ef6a145077a3ae6490958cb1323c91b5c6e8ae66a612485e79" exitCode=0 Feb 20 10:43:28 crc kubenswrapper[4962]: I0220 10:43:28.352512 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jb2lz" Feb 20 10:43:28 crc kubenswrapper[4962]: I0220 10:43:28.352525 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jb2lz" event={"ID":"ff762b96-9d98-406b-81b4-b81b19473e0e","Type":"ContainerDied","Data":"84b881cd2e78a6ef6a145077a3ae6490958cb1323c91b5c6e8ae66a612485e79"} Feb 20 10:43:28 crc kubenswrapper[4962]: I0220 10:43:28.353306 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jb2lz" event={"ID":"ff762b96-9d98-406b-81b4-b81b19473e0e","Type":"ContainerDied","Data":"a6a7be7df0f34627d00563e97fbdf2a8011dd3ee63192c4a0e2864c2140079df"} Feb 20 10:43:28 crc kubenswrapper[4962]: I0220 10:43:28.353357 4962 scope.go:117] "RemoveContainer" containerID="84b881cd2e78a6ef6a145077a3ae6490958cb1323c91b5c6e8ae66a612485e79" Feb 20 10:43:28 crc kubenswrapper[4962]: I0220 10:43:28.387228 4962 scope.go:117] "RemoveContainer" containerID="43b9890e9e6e71cebe87f4226c2aab3ba74cadc1bd011d57f7010a2caaeae0ad" Feb 20 10:43:28 crc kubenswrapper[4962]: I0220 10:43:28.416525 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jb2lz"] Feb 20 10:43:28 crc kubenswrapper[4962]: I0220 10:43:28.425110 4962 scope.go:117] "RemoveContainer" containerID="46f1bad0ce80050e390a8e3901f88dccee8ae5368654726522435dc062d95bc8" Feb 20 10:43:28 crc kubenswrapper[4962]: I0220 10:43:28.430282 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-jb2lz"] Feb 20 10:43:28 crc kubenswrapper[4962]: I0220 10:43:28.466727 4962 scope.go:117] "RemoveContainer" containerID="84b881cd2e78a6ef6a145077a3ae6490958cb1323c91b5c6e8ae66a612485e79" Feb 20 10:43:28 crc kubenswrapper[4962]: E0220 10:43:28.467755 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84b881cd2e78a6ef6a145077a3ae6490958cb1323c91b5c6e8ae66a612485e79\": container with ID starting with 84b881cd2e78a6ef6a145077a3ae6490958cb1323c91b5c6e8ae66a612485e79 not found: ID does not exist" containerID="84b881cd2e78a6ef6a145077a3ae6490958cb1323c91b5c6e8ae66a612485e79" Feb 20 10:43:28 crc kubenswrapper[4962]: I0220 10:43:28.467841 4962 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84b881cd2e78a6ef6a145077a3ae6490958cb1323c91b5c6e8ae66a612485e79"} err="failed to get container status \"84b881cd2e78a6ef6a145077a3ae6490958cb1323c91b5c6e8ae66a612485e79\": rpc error: code = NotFound desc = could not find container \"84b881cd2e78a6ef6a145077a3ae6490958cb1323c91b5c6e8ae66a612485e79\": container with ID starting with 84b881cd2e78a6ef6a145077a3ae6490958cb1323c91b5c6e8ae66a612485e79 not found: ID does not exist" Feb 20 10:43:28 crc kubenswrapper[4962]: I0220 10:43:28.467884 4962 scope.go:117] "RemoveContainer" containerID="43b9890e9e6e71cebe87f4226c2aab3ba74cadc1bd011d57f7010a2caaeae0ad" Feb 20 10:43:28 crc kubenswrapper[4962]: E0220 10:43:28.468367 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43b9890e9e6e71cebe87f4226c2aab3ba74cadc1bd011d57f7010a2caaeae0ad\": container with ID starting with 43b9890e9e6e71cebe87f4226c2aab3ba74cadc1bd011d57f7010a2caaeae0ad not found: ID does not exist" containerID="43b9890e9e6e71cebe87f4226c2aab3ba74cadc1bd011d57f7010a2caaeae0ad" Feb 20 10:43:28 crc kubenswrapper[4962]: I0220 10:43:28.468429 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43b9890e9e6e71cebe87f4226c2aab3ba74cadc1bd011d57f7010a2caaeae0ad"} err="failed to get container status \"43b9890e9e6e71cebe87f4226c2aab3ba74cadc1bd011d57f7010a2caaeae0ad\": rpc error: code = NotFound desc = could not find container \"43b9890e9e6e71cebe87f4226c2aab3ba74cadc1bd011d57f7010a2caaeae0ad\": container with ID starting with 43b9890e9e6e71cebe87f4226c2aab3ba74cadc1bd011d57f7010a2caaeae0ad not found: ID does not exist" Feb 20 10:43:28 crc kubenswrapper[4962]: I0220 10:43:28.468468 4962 scope.go:117] "RemoveContainer" containerID="46f1bad0ce80050e390a8e3901f88dccee8ae5368654726522435dc062d95bc8" Feb 20 10:43:28 crc kubenswrapper[4962]: E0220 10:43:28.469444 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46f1bad0ce80050e390a8e3901f88dccee8ae5368654726522435dc062d95bc8\": container with ID starting with 46f1bad0ce80050e390a8e3901f88dccee8ae5368654726522435dc062d95bc8 not found: ID does not exist" containerID="46f1bad0ce80050e390a8e3901f88dccee8ae5368654726522435dc062d95bc8" Feb 20 10:43:28 crc kubenswrapper[4962]: I0220 10:43:28.469500 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46f1bad0ce80050e390a8e3901f88dccee8ae5368654726522435dc062d95bc8"} err="failed to get container status \"46f1bad0ce80050e390a8e3901f88dccee8ae5368654726522435dc062d95bc8\": rpc error: code = NotFound desc = could not find container \"46f1bad0ce80050e390a8e3901f88dccee8ae5368654726522435dc062d95bc8\": container with ID starting with 46f1bad0ce80050e390a8e3901f88dccee8ae5368654726522435dc062d95bc8 not found: ID does not exist" Feb 20 10:43:29 crc kubenswrapper[4962]: I0220 10:43:29.155452 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff762b96-9d98-406b-81b4-b81b19473e0e" path="/var/lib/kubelet/pods/ff762b96-9d98-406b-81b4-b81b19473e0e/volumes" Feb 20 10:43:36 crc kubenswrapper[4962]: I0220 10:43:36.138806 4962 scope.go:117] "RemoveContainer" containerID="43438c422a1202ff4cdb2cf954680ba8059efb5a5642f4ef4d5d6cf5618f44c6" Feb 20 10:43:36 crc kubenswrapper[4962]: E0220 10:43:36.141452 4962 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:43:51 crc kubenswrapper[4962]: I0220 10:43:51.139925 4962 scope.go:117] "RemoveContainer" containerID="43438c422a1202ff4cdb2cf954680ba8059efb5a5642f4ef4d5d6cf5618f44c6" Feb 20 10:43:51 crc kubenswrapper[4962]: E0220 10:43:51.140935 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:44:03 crc kubenswrapper[4962]: I0220 10:44:03.139512 4962 scope.go:117] "RemoveContainer" containerID="43438c422a1202ff4cdb2cf954680ba8059efb5a5642f4ef4d5d6cf5618f44c6" Feb 20 10:44:03 crc kubenswrapper[4962]: E0220 10:44:03.140084 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:44:15 crc kubenswrapper[4962]: I0220 10:44:15.139323 4962 scope.go:117] "RemoveContainer" containerID="43438c422a1202ff4cdb2cf954680ba8059efb5a5642f4ef4d5d6cf5618f44c6" Feb 20 10:44:15 crc kubenswrapper[4962]: E0220 10:44:15.140949 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:44:29 crc kubenswrapper[4962]: I0220 10:44:29.148879 4962 scope.go:117] "RemoveContainer" containerID="43438c422a1202ff4cdb2cf954680ba8059efb5a5642f4ef4d5d6cf5618f44c6" Feb 20 10:44:29 crc kubenswrapper[4962]: E0220 10:44:29.149831 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:44:43 crc kubenswrapper[4962]: I0220 10:44:43.139860 4962 scope.go:117] "RemoveContainer" containerID="43438c422a1202ff4cdb2cf954680ba8059efb5a5642f4ef4d5d6cf5618f44c6" Feb 20 10:44:43 crc kubenswrapper[4962]: E0220 10:44:43.142267 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:44:57 crc kubenswrapper[4962]: I0220 10:44:57.138930 4962 scope.go:117] "RemoveContainer" containerID="43438c422a1202ff4cdb2cf954680ba8059efb5a5642f4ef4d5d6cf5618f44c6" Feb 20 10:44:57 crc kubenswrapper[4962]: E0220 10:44:57.140202 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:45:00 crc kubenswrapper[4962]: I0220 10:45:00.169073 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526405-2m7jg"] Feb 20 10:45:00 crc kubenswrapper[4962]: E0220 10:45:00.169907 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2df2af41-d54c-427f-91e9-b132958cb597" containerName="registry-server" Feb 20 10:45:00 crc kubenswrapper[4962]: I0220 10:45:00.169934 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2df2af41-d54c-427f-91e9-b132958cb597" containerName="registry-server" Feb 20 10:45:00 crc kubenswrapper[4962]: E0220 10:45:00.169955 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff762b96-9d98-406b-81b4-b81b19473e0e" containerName="extract-content" Feb 20 10:45:00 crc kubenswrapper[4962]: I0220 10:45:00.169968 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff762b96-9d98-406b-81b4-b81b19473e0e" containerName="extract-content" Feb 20 10:45:00 crc kubenswrapper[4962]: E0220 10:45:00.170000 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff762b96-9d98-406b-81b4-b81b19473e0e" containerName="extract-utilities" Feb 20 10:45:00 crc kubenswrapper[4962]: I0220 10:45:00.170083 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff762b96-9d98-406b-81b4-b81b19473e0e" containerName="extract-utilities" Feb 20 10:45:00 crc kubenswrapper[4962]: E0220 10:45:00.170142 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2df2af41-d54c-427f-91e9-b132958cb597" containerName="extract-utilities" Feb 20 10:45:00 crc kubenswrapper[4962]: I0220 10:45:00.170156 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2df2af41-d54c-427f-91e9-b132958cb597" containerName="extract-utilities" Feb 20 10:45:00 crc kubenswrapper[4962]: E0220 10:45:00.170176 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff762b96-9d98-406b-81b4-b81b19473e0e" containerName="registry-server" Feb 20 10:45:00 crc kubenswrapper[4962]: I0220 10:45:00.170192 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff762b96-9d98-406b-81b4-b81b19473e0e" containerName="registry-server" Feb 20 10:45:00 crc kubenswrapper[4962]: E0220 10:45:00.170227 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2df2af41-d54c-427f-91e9-b132958cb597" containerName="extract-content" Feb 20 10:45:00 crc kubenswrapper[4962]: I0220 10:45:00.170239 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="2df2af41-d54c-427f-91e9-b132958cb597" containerName="extract-content" Feb 20 10:45:00 crc kubenswrapper[4962]: I0220 10:45:00.170516 4962 
memory_manager.go:354] "RemoveStaleState removing state" podUID="ff762b96-9d98-406b-81b4-b81b19473e0e" containerName="registry-server" Feb 20 10:45:00 crc kubenswrapper[4962]: I0220 10:45:00.170554 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="2df2af41-d54c-427f-91e9-b132958cb597" containerName="registry-server" Feb 20 10:45:00 crc kubenswrapper[4962]: I0220 10:45:00.171301 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526405-2m7jg" Feb 20 10:45:00 crc kubenswrapper[4962]: I0220 10:45:00.174908 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 20 10:45:00 crc kubenswrapper[4962]: I0220 10:45:00.174967 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 20 10:45:00 crc kubenswrapper[4962]: I0220 10:45:00.181427 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526405-2m7jg"] Feb 20 10:45:00 crc kubenswrapper[4962]: I0220 10:45:00.269025 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmhl8\" (UniqueName: \"kubernetes.io/projected/96b7caa8-b0e3-456c-88a2-da2e0e66d681-kube-api-access-qmhl8\") pod \"collect-profiles-29526405-2m7jg\" (UID: \"96b7caa8-b0e3-456c-88a2-da2e0e66d681\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526405-2m7jg" Feb 20 10:45:00 crc kubenswrapper[4962]: I0220 10:45:00.269099 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/96b7caa8-b0e3-456c-88a2-da2e0e66d681-secret-volume\") pod \"collect-profiles-29526405-2m7jg\" (UID: \"96b7caa8-b0e3-456c-88a2-da2e0e66d681\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526405-2m7jg" Feb 20 10:45:00 crc kubenswrapper[4962]: I0220 10:45:00.269390 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/96b7caa8-b0e3-456c-88a2-da2e0e66d681-config-volume\") pod \"collect-profiles-29526405-2m7jg\" (UID: \"96b7caa8-b0e3-456c-88a2-da2e0e66d681\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526405-2m7jg" Feb 20 10:45:00 crc kubenswrapper[4962]: I0220 10:45:00.370632 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmhl8\" (UniqueName: \"kubernetes.io/projected/96b7caa8-b0e3-456c-88a2-da2e0e66d681-kube-api-access-qmhl8\") pod \"collect-profiles-29526405-2m7jg\" (UID: \"96b7caa8-b0e3-456c-88a2-da2e0e66d681\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526405-2m7jg" Feb 20 10:45:00 crc kubenswrapper[4962]: I0220 10:45:00.370734 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/96b7caa8-b0e3-456c-88a2-da2e0e66d681-secret-volume\") pod \"collect-profiles-29526405-2m7jg\" (UID: \"96b7caa8-b0e3-456c-88a2-da2e0e66d681\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526405-2m7jg" Feb 20 10:45:00 crc kubenswrapper[4962]: I0220 10:45:00.370960 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/96b7caa8-b0e3-456c-88a2-da2e0e66d681-config-volume\") pod \"collect-profiles-29526405-2m7jg\" (UID: \"96b7caa8-b0e3-456c-88a2-da2e0e66d681\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526405-2m7jg" Feb 20 10:45:00 crc kubenswrapper[4962]: I0220 10:45:00.372216 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/96b7caa8-b0e3-456c-88a2-da2e0e66d681-config-volume\") pod \"collect-profiles-29526405-2m7jg\" (UID: \"96b7caa8-b0e3-456c-88a2-da2e0e66d681\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526405-2m7jg" Feb 20 10:45:00 crc kubenswrapper[4962]: I0220 10:45:00.389782 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/96b7caa8-b0e3-456c-88a2-da2e0e66d681-secret-volume\") pod \"collect-profiles-29526405-2m7jg\" (UID: \"96b7caa8-b0e3-456c-88a2-da2e0e66d681\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526405-2m7jg" Feb 20 10:45:00 crc kubenswrapper[4962]: I0220 10:45:00.401789 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmhl8\" (UniqueName: \"kubernetes.io/projected/96b7caa8-b0e3-456c-88a2-da2e0e66d681-kube-api-access-qmhl8\") pod \"collect-profiles-29526405-2m7jg\" (UID: \"96b7caa8-b0e3-456c-88a2-da2e0e66d681\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526405-2m7jg" Feb 20 10:45:00 crc kubenswrapper[4962]: I0220 10:45:00.509287 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526405-2m7jg" Feb 20 10:45:00 crc kubenswrapper[4962]: I0220 10:45:00.805768 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526405-2m7jg"] Feb 20 10:45:01 crc kubenswrapper[4962]: I0220 10:45:01.222410 4962 generic.go:334] "Generic (PLEG): container finished" podID="96b7caa8-b0e3-456c-88a2-da2e0e66d681" containerID="8e3990ca316d03e419fd0ebb840f5195fe2ec512b8606d9e1a57ef65654b00e5" exitCode=0 Feb 20 10:45:01 crc kubenswrapper[4962]: I0220 10:45:01.222550 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526405-2m7jg" event={"ID":"96b7caa8-b0e3-456c-88a2-da2e0e66d681","Type":"ContainerDied","Data":"8e3990ca316d03e419fd0ebb840f5195fe2ec512b8606d9e1a57ef65654b00e5"} Feb 20 10:45:01 crc kubenswrapper[4962]: I0220 10:45:01.222784 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526405-2m7jg" event={"ID":"96b7caa8-b0e3-456c-88a2-da2e0e66d681","Type":"ContainerStarted","Data":"70795cd15c887297d3c34c43572bd8b4b2ba6d456853ef1131e299d78063bee8"} Feb 20 10:45:02 crc kubenswrapper[4962]: I0220 10:45:02.583905 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526405-2m7jg" Feb 20 10:45:02 crc kubenswrapper[4962]: I0220 10:45:02.705675 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qmhl8\" (UniqueName: \"kubernetes.io/projected/96b7caa8-b0e3-456c-88a2-da2e0e66d681-kube-api-access-qmhl8\") pod \"96b7caa8-b0e3-456c-88a2-da2e0e66d681\" (UID: \"96b7caa8-b0e3-456c-88a2-da2e0e66d681\") " Feb 20 10:45:02 crc kubenswrapper[4962]: I0220 10:45:02.705917 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/96b7caa8-b0e3-456c-88a2-da2e0e66d681-secret-volume\") pod \"96b7caa8-b0e3-456c-88a2-da2e0e66d681\" (UID: \"96b7caa8-b0e3-456c-88a2-da2e0e66d681\") " Feb 20 10:45:02 crc kubenswrapper[4962]: I0220 10:45:02.705988 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/96b7caa8-b0e3-456c-88a2-da2e0e66d681-config-volume\") pod \"96b7caa8-b0e3-456c-88a2-da2e0e66d681\" (UID: \"96b7caa8-b0e3-456c-88a2-da2e0e66d681\") " Feb 20 10:45:02 crc kubenswrapper[4962]: I0220 10:45:02.707195 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96b7caa8-b0e3-456c-88a2-da2e0e66d681-config-volume" (OuterVolumeSpecName: "config-volume") pod "96b7caa8-b0e3-456c-88a2-da2e0e66d681" (UID: "96b7caa8-b0e3-456c-88a2-da2e0e66d681"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:45:02 crc kubenswrapper[4962]: I0220 10:45:02.714682 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b7caa8-b0e3-456c-88a2-da2e0e66d681-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "96b7caa8-b0e3-456c-88a2-da2e0e66d681" (UID: "96b7caa8-b0e3-456c-88a2-da2e0e66d681"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:45:02 crc kubenswrapper[4962]: I0220 10:45:02.714742 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b7caa8-b0e3-456c-88a2-da2e0e66d681-kube-api-access-qmhl8" (OuterVolumeSpecName: "kube-api-access-qmhl8") pod "96b7caa8-b0e3-456c-88a2-da2e0e66d681" (UID: "96b7caa8-b0e3-456c-88a2-da2e0e66d681"). InnerVolumeSpecName "kube-api-access-qmhl8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:45:02 crc kubenswrapper[4962]: I0220 10:45:02.807834 4962 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/96b7caa8-b0e3-456c-88a2-da2e0e66d681-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 20 10:45:02 crc kubenswrapper[4962]: I0220 10:45:02.807888 4962 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/96b7caa8-b0e3-456c-88a2-da2e0e66d681-config-volume\") on node \"crc\" DevicePath \"\"" Feb 20 10:45:02 crc kubenswrapper[4962]: I0220 10:45:02.807907 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qmhl8\" (UniqueName: \"kubernetes.io/projected/96b7caa8-b0e3-456c-88a2-da2e0e66d681-kube-api-access-qmhl8\") on node \"crc\" DevicePath \"\"" Feb 20 10:45:03 crc kubenswrapper[4962]: I0220 10:45:03.245090 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526405-2m7jg" event={"ID":"96b7caa8-b0e3-456c-88a2-da2e0e66d681","Type":"ContainerDied","Data":"70795cd15c887297d3c34c43572bd8b4b2ba6d456853ef1131e299d78063bee8"} Feb 20 10:45:03 crc kubenswrapper[4962]: I0220 10:45:03.245148 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526405-2m7jg" Feb 20 10:45:03 crc kubenswrapper[4962]: I0220 10:45:03.245171 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70795cd15c887297d3c34c43572bd8b4b2ba6d456853ef1131e299d78063bee8" Feb 20 10:45:03 crc kubenswrapper[4962]: I0220 10:45:03.676115 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526360-m2h52"] Feb 20 10:45:03 crc kubenswrapper[4962]: I0220 10:45:03.684874 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526360-m2h52"] Feb 20 10:45:05 crc kubenswrapper[4962]: I0220 10:45:05.155752 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e" path="/var/lib/kubelet/pods/d4d2cbc3-9bc4-4270-9d26-66c3e9189f8e/volumes" Feb 20 10:45:10 crc kubenswrapper[4962]: I0220 10:45:10.138652 4962 scope.go:117] "RemoveContainer" containerID="43438c422a1202ff4cdb2cf954680ba8059efb5a5642f4ef4d5d6cf5618f44c6" Feb 20 10:45:10 crc kubenswrapper[4962]: E0220 10:45:10.139366 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:45:16 crc kubenswrapper[4962]: I0220 10:45:16.294235 4962 scope.go:117] "RemoveContainer" containerID="9e4ff8bca8b9c2f6e4f08722be3898de4e9890a93fcdc65b9b078dd1d1fbdae2" Feb 20 10:45:22 crc kubenswrapper[4962]: I0220 10:45:22.139563 4962 scope.go:117] "RemoveContainer" containerID="43438c422a1202ff4cdb2cf954680ba8059efb5a5642f4ef4d5d6cf5618f44c6" Feb 20 10:45:22 crc kubenswrapper[4962]: E0220 10:45:22.142026 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:45:35 crc kubenswrapper[4962]: I0220 10:45:35.140091 4962 scope.go:117] "RemoveContainer" containerID="43438c422a1202ff4cdb2cf954680ba8059efb5a5642f4ef4d5d6cf5618f44c6" Feb 20 10:45:35 crc kubenswrapper[4962]: E0220 10:45:35.141118 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:45:47 crc kubenswrapper[4962]: I0220 10:45:47.138791 4962 scope.go:117] "RemoveContainer" containerID="43438c422a1202ff4cdb2cf954680ba8059efb5a5642f4ef4d5d6cf5618f44c6" Feb 20 10:45:47 crc kubenswrapper[4962]: E0220 10:45:47.139491 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:46:01 crc kubenswrapper[4962]: I0220 10:46:01.140719 4962 scope.go:117] "RemoveContainer" containerID="43438c422a1202ff4cdb2cf954680ba8059efb5a5642f4ef4d5d6cf5618f44c6" Feb 20 10:46:01 crc kubenswrapper[4962]: E0220 10:46:01.141636 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:46:16 crc kubenswrapper[4962]: I0220 10:46:16.139190 4962 scope.go:117] "RemoveContainer" containerID="43438c422a1202ff4cdb2cf954680ba8059efb5a5642f4ef4d5d6cf5618f44c6" Feb 20 10:46:16 crc kubenswrapper[4962]: E0220 10:46:16.140302 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:46:31 crc kubenswrapper[4962]: I0220 10:46:31.139548 4962 scope.go:117] "RemoveContainer" containerID="43438c422a1202ff4cdb2cf954680ba8059efb5a5642f4ef4d5d6cf5618f44c6" Feb 20 10:46:31 crc kubenswrapper[4962]: E0220 10:46:31.140575 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:46:45 crc kubenswrapper[4962]: I0220 10:46:45.141647 4962 scope.go:117] "RemoveContainer" containerID="43438c422a1202ff4cdb2cf954680ba8059efb5a5642f4ef4d5d6cf5618f44c6" Feb 20 10:46:45 crc kubenswrapper[4962]: E0220 10:46:45.142570 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:46:56 crc kubenswrapper[4962]: I0220 10:46:56.139544 4962 scope.go:117] "RemoveContainer" containerID="43438c422a1202ff4cdb2cf954680ba8059efb5a5642f4ef4d5d6cf5618f44c6" Feb 20 10:46:56 crc kubenswrapper[4962]: E0220 10:46:56.140669 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:47:07 crc kubenswrapper[4962]: I0220 10:47:07.139845 4962 scope.go:117] "RemoveContainer" containerID="43438c422a1202ff4cdb2cf954680ba8059efb5a5642f4ef4d5d6cf5618f44c6" Feb 20 10:47:07 crc kubenswrapper[4962]: E0220 10:47:07.140866 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:47:21 crc kubenswrapper[4962]: I0220 10:47:21.139360 4962 scope.go:117] "RemoveContainer" containerID="43438c422a1202ff4cdb2cf954680ba8059efb5a5642f4ef4d5d6cf5618f44c6" Feb 20 10:47:21 crc kubenswrapper[4962]: E0220 10:47:21.140631 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:47:33 crc kubenswrapper[4962]: I0220 10:47:33.139891 4962 scope.go:117] "RemoveContainer" containerID="43438c422a1202ff4cdb2cf954680ba8059efb5a5642f4ef4d5d6cf5618f44c6" Feb 20 10:47:33 crc kubenswrapper[4962]: E0220 10:47:33.141133 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:47:47 crc kubenswrapper[4962]: I0220 10:47:47.139552 4962 
scope.go:117] "RemoveContainer" containerID="43438c422a1202ff4cdb2cf954680ba8059efb5a5642f4ef4d5d6cf5618f44c6" Feb 20 10:47:47 crc kubenswrapper[4962]: E0220 10:47:47.140659 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:48:02 crc kubenswrapper[4962]: I0220 10:48:02.139045 4962 scope.go:117] "RemoveContainer" containerID="43438c422a1202ff4cdb2cf954680ba8059efb5a5642f4ef4d5d6cf5618f44c6" Feb 20 10:48:02 crc kubenswrapper[4962]: E0220 10:48:02.140037 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:48:14 crc kubenswrapper[4962]: I0220 10:48:14.138705 4962 scope.go:117] "RemoveContainer" containerID="43438c422a1202ff4cdb2cf954680ba8059efb5a5642f4ef4d5d6cf5618f44c6" Feb 20 10:48:14 crc kubenswrapper[4962]: I0220 10:48:14.977010 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" event={"ID":"751d5e0b-919c-4777-8475-ed7214f7647f","Type":"ContainerStarted","Data":"96520786fcd3eebb4c00d3ca8d282a9034e292ff58aa8bd50b4ba54603f3d059"} Feb 20 10:49:19 crc kubenswrapper[4962]: I0220 10:49:19.386679 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ksv8l"] Feb 20 10:49:19 crc kubenswrapper[4962]: E0220 10:49:19.387453 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96b7caa8-b0e3-456c-88a2-da2e0e66d681" containerName="collect-profiles" Feb 20 10:49:19 crc kubenswrapper[4962]: I0220 10:49:19.387467 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="96b7caa8-b0e3-456c-88a2-da2e0e66d681" containerName="collect-profiles" Feb 20 10:49:19 crc kubenswrapper[4962]: I0220 10:49:19.387652 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="96b7caa8-b0e3-456c-88a2-da2e0e66d681" containerName="collect-profiles" Feb 20 10:49:19 crc kubenswrapper[4962]: I0220 10:49:19.388738 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ksv8l" Feb 20 10:49:19 crc kubenswrapper[4962]: I0220 10:49:19.408369 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ksv8l"] Feb 20 10:49:19 crc kubenswrapper[4962]: I0220 10:49:19.535652 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f7ade93-bdc4-4c9e-92d8-dc2bc360308c-utilities\") pod \"redhat-operators-ksv8l\" (UID: \"9f7ade93-bdc4-4c9e-92d8-dc2bc360308c\") " pod="openshift-marketplace/redhat-operators-ksv8l" Feb 20 10:49:19 crc kubenswrapper[4962]: I0220 10:49:19.535711 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f7ade93-bdc4-4c9e-92d8-dc2bc360308c-catalog-content\") pod \"redhat-operators-ksv8l\" (UID: \"9f7ade93-bdc4-4c9e-92d8-dc2bc360308c\") " pod="openshift-marketplace/redhat-operators-ksv8l" Feb 20 10:49:19 crc kubenswrapper[4962]: I0220 10:49:19.535759 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4tfx\" (UniqueName: \"kubernetes.io/projected/9f7ade93-bdc4-4c9e-92d8-dc2bc360308c-kube-api-access-g4tfx\") pod \"redhat-operators-ksv8l\" (UID: \"9f7ade93-bdc4-4c9e-92d8-dc2bc360308c\") " pod="openshift-marketplace/redhat-operators-ksv8l" Feb 20 10:49:19 crc kubenswrapper[4962]: I0220 10:49:19.637320 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f7ade93-bdc4-4c9e-92d8-dc2bc360308c-utilities\") pod \"redhat-operators-ksv8l\" (UID: \"9f7ade93-bdc4-4c9e-92d8-dc2bc360308c\") " pod="openshift-marketplace/redhat-operators-ksv8l" Feb 20 10:49:19 crc kubenswrapper[4962]: I0220 10:49:19.637360 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f7ade93-bdc4-4c9e-92d8-dc2bc360308c-catalog-content\") pod \"redhat-operators-ksv8l\" (UID: \"9f7ade93-bdc4-4c9e-92d8-dc2bc360308c\") " pod="openshift-marketplace/redhat-operators-ksv8l" Feb 20 10:49:19 crc kubenswrapper[4962]: I0220 10:49:19.637391 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4tfx\" (UniqueName: \"kubernetes.io/projected/9f7ade93-bdc4-4c9e-92d8-dc2bc360308c-kube-api-access-g4tfx\") pod \"redhat-operators-ksv8l\" (UID: \"9f7ade93-bdc4-4c9e-92d8-dc2bc360308c\") " pod="openshift-marketplace/redhat-operators-ksv8l" Feb 20 10:49:19 crc kubenswrapper[4962]: I0220 10:49:19.638030 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f7ade93-bdc4-4c9e-92d8-dc2bc360308c-utilities\") pod \"redhat-operators-ksv8l\" (UID: \"9f7ade93-bdc4-4c9e-92d8-dc2bc360308c\") " pod="openshift-marketplace/redhat-operators-ksv8l" Feb 20 10:49:19 crc kubenswrapper[4962]: I0220 10:49:19.638237 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f7ade93-bdc4-4c9e-92d8-dc2bc360308c-catalog-content\") pod \"redhat-operators-ksv8l\" (UID: \"9f7ade93-bdc4-4c9e-92d8-dc2bc360308c\") " pod="openshift-marketplace/redhat-operators-ksv8l" Feb 20 10:49:19 crc kubenswrapper[4962]: I0220 10:49:19.687401 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-g4tfx\" (UniqueName: \"kubernetes.io/projected/9f7ade93-bdc4-4c9e-92d8-dc2bc360308c-kube-api-access-g4tfx\") pod \"redhat-operators-ksv8l\" (UID: \"9f7ade93-bdc4-4c9e-92d8-dc2bc360308c\") " pod="openshift-marketplace/redhat-operators-ksv8l" Feb 20 10:49:19 crc kubenswrapper[4962]: I0220 10:49:19.719171 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ksv8l" Feb 20 10:49:20 crc kubenswrapper[4962]: I0220 10:49:20.131902 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ksv8l"] Feb 20 10:49:20 crc kubenswrapper[4962]: I0220 10:49:20.550406 4962 generic.go:334] "Generic (PLEG): container finished" podID="9f7ade93-bdc4-4c9e-92d8-dc2bc360308c" containerID="65f47ed8d658ebb6e3d14cbe28e715c32d99c7606c31cf79047c67bb0f17c08f" exitCode=0 Feb 20 10:49:20 crc kubenswrapper[4962]: I0220 10:49:20.550451 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ksv8l" event={"ID":"9f7ade93-bdc4-4c9e-92d8-dc2bc360308c","Type":"ContainerDied","Data":"65f47ed8d658ebb6e3d14cbe28e715c32d99c7606c31cf79047c67bb0f17c08f"} Feb 20 10:49:20 crc kubenswrapper[4962]: I0220 10:49:20.550505 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ksv8l" event={"ID":"9f7ade93-bdc4-4c9e-92d8-dc2bc360308c","Type":"ContainerStarted","Data":"9cdd274a165a9e9b2e89ca49f289b6d7fe464745bb688d5b654d0aaabd0db097"} Feb 20 10:49:20 crc kubenswrapper[4962]: I0220 10:49:20.552686 4962 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 20 10:49:21 crc kubenswrapper[4962]: I0220 10:49:21.559512 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ksv8l" event={"ID":"9f7ade93-bdc4-4c9e-92d8-dc2bc360308c","Type":"ContainerStarted","Data":"dd2aa04ab75013f9c8a969c7bdc5b7b3dd04b7f2bd6bcbaddb10bef15cfceb89"} Feb 20 10:49:22 crc kubenswrapper[4962]: I0220 10:49:22.573050 4962 generic.go:334] "Generic (PLEG): container finished" podID="9f7ade93-bdc4-4c9e-92d8-dc2bc360308c" containerID="dd2aa04ab75013f9c8a969c7bdc5b7b3dd04b7f2bd6bcbaddb10bef15cfceb89" exitCode=0 Feb 20 10:49:22 crc kubenswrapper[4962]: I0220 10:49:22.573186 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ksv8l" event={"ID":"9f7ade93-bdc4-4c9e-92d8-dc2bc360308c","Type":"ContainerDied","Data":"dd2aa04ab75013f9c8a969c7bdc5b7b3dd04b7f2bd6bcbaddb10bef15cfceb89"} Feb 20 10:49:23 crc kubenswrapper[4962]: I0220 10:49:23.582991 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ksv8l" event={"ID":"9f7ade93-bdc4-4c9e-92d8-dc2bc360308c","Type":"ContainerStarted","Data":"e33cfc2d65171212d31a874e2b9fb6c01ba8974d3fc4058300e04403d1b1288e"} Feb 20 10:49:23 crc kubenswrapper[4962]: I0220 10:49:23.612160 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ksv8l" podStartSLOduration=2.177121477 podStartE2EDuration="4.612132471s" podCreationTimestamp="2026-02-20 10:49:19 +0000 UTC" firstStartedPulling="2026-02-20 10:49:20.552303606 +0000 UTC m=+3252.134775462" lastFinishedPulling="2026-02-20 10:49:22.98731457 +0000 UTC m=+3254.569786456" observedRunningTime="2026-02-20 10:49:23.610439988 +0000 UTC m=+3255.192911864" watchObservedRunningTime="2026-02-20 10:49:23.612132471 +0000 UTC m=+3255.194604347" Feb 20 10:49:29 crc 
kubenswrapper[4962]: I0220 10:49:29.720018 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ksv8l" Feb 20 10:49:29 crc kubenswrapper[4962]: I0220 10:49:29.721169 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ksv8l" Feb 20 10:49:30 crc kubenswrapper[4962]: I0220 10:49:30.794341 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-ksv8l" podUID="9f7ade93-bdc4-4c9e-92d8-dc2bc360308c" containerName="registry-server" probeResult="failure" output=< Feb 20 10:49:30 crc kubenswrapper[4962]: timeout: failed to connect service ":50051" within 1s Feb 20 10:49:30 crc kubenswrapper[4962]: > Feb 20 10:49:39 crc kubenswrapper[4962]: I0220 10:49:39.795783 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ksv8l" Feb 20 10:49:39 crc kubenswrapper[4962]: I0220 10:49:39.881505 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ksv8l" Feb 20 10:49:40 crc kubenswrapper[4962]: I0220 10:49:40.047897 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ksv8l"] Feb 20 10:49:41 crc kubenswrapper[4962]: I0220 10:49:41.748143 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ksv8l" podUID="9f7ade93-bdc4-4c9e-92d8-dc2bc360308c" containerName="registry-server" containerID="cri-o://e33cfc2d65171212d31a874e2b9fb6c01ba8974d3fc4058300e04403d1b1288e" gracePeriod=2 Feb 20 10:49:42 crc kubenswrapper[4962]: I0220 10:49:42.239222 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ksv8l" Feb 20 10:49:42 crc kubenswrapper[4962]: I0220 10:49:42.332183 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f7ade93-bdc4-4c9e-92d8-dc2bc360308c-catalog-content\") pod \"9f7ade93-bdc4-4c9e-92d8-dc2bc360308c\" (UID: \"9f7ade93-bdc4-4c9e-92d8-dc2bc360308c\") " Feb 20 10:49:42 crc kubenswrapper[4962]: I0220 10:49:42.332227 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4tfx\" (UniqueName: \"kubernetes.io/projected/9f7ade93-bdc4-4c9e-92d8-dc2bc360308c-kube-api-access-g4tfx\") pod \"9f7ade93-bdc4-4c9e-92d8-dc2bc360308c\" (UID: \"9f7ade93-bdc4-4c9e-92d8-dc2bc360308c\") " Feb 20 10:49:42 crc kubenswrapper[4962]: I0220 10:49:42.332381 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f7ade93-bdc4-4c9e-92d8-dc2bc360308c-utilities\") pod \"9f7ade93-bdc4-4c9e-92d8-dc2bc360308c\" (UID: \"9f7ade93-bdc4-4c9e-92d8-dc2bc360308c\") " Feb 20 10:49:42 crc kubenswrapper[4962]: I0220 10:49:42.333421 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f7ade93-bdc4-4c9e-92d8-dc2bc360308c-utilities" (OuterVolumeSpecName: "utilities") pod "9f7ade93-bdc4-4c9e-92d8-dc2bc360308c" (UID: "9f7ade93-bdc4-4c9e-92d8-dc2bc360308c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:49:42 crc kubenswrapper[4962]: I0220 10:49:42.342978 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f7ade93-bdc4-4c9e-92d8-dc2bc360308c-kube-api-access-g4tfx" (OuterVolumeSpecName: "kube-api-access-g4tfx") pod "9f7ade93-bdc4-4c9e-92d8-dc2bc360308c" (UID: "9f7ade93-bdc4-4c9e-92d8-dc2bc360308c"). InnerVolumeSpecName "kube-api-access-g4tfx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:49:42 crc kubenswrapper[4962]: I0220 10:49:42.434347 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f7ade93-bdc4-4c9e-92d8-dc2bc360308c-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 10:49:42 crc kubenswrapper[4962]: I0220 10:49:42.434403 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g4tfx\" (UniqueName: \"kubernetes.io/projected/9f7ade93-bdc4-4c9e-92d8-dc2bc360308c-kube-api-access-g4tfx\") on node \"crc\" DevicePath \"\"" Feb 20 10:49:42 crc kubenswrapper[4962]: I0220 10:49:42.520430 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f7ade93-bdc4-4c9e-92d8-dc2bc360308c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9f7ade93-bdc4-4c9e-92d8-dc2bc360308c" (UID: "9f7ade93-bdc4-4c9e-92d8-dc2bc360308c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:49:42 crc kubenswrapper[4962]: I0220 10:49:42.535734 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f7ade93-bdc4-4c9e-92d8-dc2bc360308c-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 10:49:42 crc kubenswrapper[4962]: I0220 10:49:42.759521 4962 generic.go:334] "Generic (PLEG): container finished" podID="9f7ade93-bdc4-4c9e-92d8-dc2bc360308c" containerID="e33cfc2d65171212d31a874e2b9fb6c01ba8974d3fc4058300e04403d1b1288e" exitCode=0 Feb 20 10:49:42 crc kubenswrapper[4962]: I0220 10:49:42.759561 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ksv8l" event={"ID":"9f7ade93-bdc4-4c9e-92d8-dc2bc360308c","Type":"ContainerDied","Data":"e33cfc2d65171212d31a874e2b9fb6c01ba8974d3fc4058300e04403d1b1288e"} Feb 20 10:49:42 crc kubenswrapper[4962]: I0220 10:49:42.759607 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ksv8l" event={"ID":"9f7ade93-bdc4-4c9e-92d8-dc2bc360308c","Type":"ContainerDied","Data":"9cdd274a165a9e9b2e89ca49f289b6d7fe464745bb688d5b654d0aaabd0db097"} Feb 20 10:49:42 crc kubenswrapper[4962]: I0220 10:49:42.759628 4962 scope.go:117] "RemoveContainer" containerID="e33cfc2d65171212d31a874e2b9fb6c01ba8974d3fc4058300e04403d1b1288e" Feb 20 10:49:42 crc kubenswrapper[4962]: I0220 10:49:42.759635 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ksv8l" Feb 20 10:49:42 crc kubenswrapper[4962]: I0220 10:49:42.788450 4962 scope.go:117] "RemoveContainer" containerID="dd2aa04ab75013f9c8a969c7bdc5b7b3dd04b7f2bd6bcbaddb10bef15cfceb89" Feb 20 10:49:42 crc kubenswrapper[4962]: I0220 10:49:42.824308 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ksv8l"] Feb 20 10:49:42 crc kubenswrapper[4962]: I0220 10:49:42.833964 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ksv8l"] Feb 20 10:49:42 crc kubenswrapper[4962]: I0220 10:49:42.844514 4962 scope.go:117] "RemoveContainer" containerID="65f47ed8d658ebb6e3d14cbe28e715c32d99c7606c31cf79047c67bb0f17c08f" Feb 20 10:49:42 crc kubenswrapper[4962]: I0220 10:49:42.867453 4962 scope.go:117] "RemoveContainer" containerID="e33cfc2d65171212d31a874e2b9fb6c01ba8974d3fc4058300e04403d1b1288e" Feb 20 10:49:42 crc kubenswrapper[4962]: E0220 10:49:42.868116 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e33cfc2d65171212d31a874e2b9fb6c01ba8974d3fc4058300e04403d1b1288e\": container with ID starting with e33cfc2d65171212d31a874e2b9fb6c01ba8974d3fc4058300e04403d1b1288e not found: ID does not exist" containerID="e33cfc2d65171212d31a874e2b9fb6c01ba8974d3fc4058300e04403d1b1288e" Feb 20 10:49:42 crc kubenswrapper[4962]: I0220 10:49:42.868154 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e33cfc2d65171212d31a874e2b9fb6c01ba8974d3fc4058300e04403d1b1288e"} err="failed to get container status \"e33cfc2d65171212d31a874e2b9fb6c01ba8974d3fc4058300e04403d1b1288e\": rpc error: code = NotFound desc = could not find container \"e33cfc2d65171212d31a874e2b9fb6c01ba8974d3fc4058300e04403d1b1288e\": container with ID starting with e33cfc2d65171212d31a874e2b9fb6c01ba8974d3fc4058300e04403d1b1288e not found: ID does not exist" Feb 20 10:49:42 crc kubenswrapper[4962]: I0220 10:49:42.868182 4962 scope.go:117] "RemoveContainer" containerID="dd2aa04ab75013f9c8a969c7bdc5b7b3dd04b7f2bd6bcbaddb10bef15cfceb89" Feb 20 10:49:42 crc kubenswrapper[4962]: E0220 10:49:42.868487 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd2aa04ab75013f9c8a969c7bdc5b7b3dd04b7f2bd6bcbaddb10bef15cfceb89\": container with ID starting with dd2aa04ab75013f9c8a969c7bdc5b7b3dd04b7f2bd6bcbaddb10bef15cfceb89 not found: ID does not exist" containerID="dd2aa04ab75013f9c8a969c7bdc5b7b3dd04b7f2bd6bcbaddb10bef15cfceb89" Feb 20 10:49:42 crc kubenswrapper[4962]: I0220 10:49:42.868516 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd2aa04ab75013f9c8a969c7bdc5b7b3dd04b7f2bd6bcbaddb10bef15cfceb89"} err="failed to get container status \"dd2aa04ab75013f9c8a969c7bdc5b7b3dd04b7f2bd6bcbaddb10bef15cfceb89\": rpc error: code = NotFound desc = could not find container \"dd2aa04ab75013f9c8a969c7bdc5b7b3dd04b7f2bd6bcbaddb10bef15cfceb89\": container with ID starting with dd2aa04ab75013f9c8a969c7bdc5b7b3dd04b7f2bd6bcbaddb10bef15cfceb89 not found: ID does not exist" Feb 20 10:49:42 crc kubenswrapper[4962]: I0220 10:49:42.868538 4962 scope.go:117] "RemoveContainer" containerID="65f47ed8d658ebb6e3d14cbe28e715c32d99c7606c31cf79047c67bb0f17c08f" Feb 20 10:49:42 crc kubenswrapper[4962]: E0220 10:49:42.868814 4962 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"65f47ed8d658ebb6e3d14cbe28e715c32d99c7606c31cf79047c67bb0f17c08f\": container with ID starting with 65f47ed8d658ebb6e3d14cbe28e715c32d99c7606c31cf79047c67bb0f17c08f not found: ID does not exist" containerID="65f47ed8d658ebb6e3d14cbe28e715c32d99c7606c31cf79047c67bb0f17c08f" Feb 20 10:49:42 crc kubenswrapper[4962]: I0220 10:49:42.868836 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65f47ed8d658ebb6e3d14cbe28e715c32d99c7606c31cf79047c67bb0f17c08f"} err="failed to get container status \"65f47ed8d658ebb6e3d14cbe28e715c32d99c7606c31cf79047c67bb0f17c08f\": rpc error: code = NotFound desc = could not find container \"65f47ed8d658ebb6e3d14cbe28e715c32d99c7606c31cf79047c67bb0f17c08f\": container with ID starting with 65f47ed8d658ebb6e3d14cbe28e715c32d99c7606c31cf79047c67bb0f17c08f not found: ID does not exist" Feb 20 10:49:43 crc kubenswrapper[4962]: I0220 10:49:43.154725 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f7ade93-bdc4-4c9e-92d8-dc2bc360308c" path="/var/lib/kubelet/pods/9f7ade93-bdc4-4c9e-92d8-dc2bc360308c/volumes" Feb 20 10:49:53 crc kubenswrapper[4962]: I0220 10:49:53.113493 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hm46k"] Feb 20 10:49:53 crc kubenswrapper[4962]: E0220 10:49:53.114431 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f7ade93-bdc4-4c9e-92d8-dc2bc360308c" containerName="extract-content" Feb 20 10:49:53 crc kubenswrapper[4962]: I0220 10:49:53.114448 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f7ade93-bdc4-4c9e-92d8-dc2bc360308c" containerName="extract-content" Feb 20 10:49:53 crc kubenswrapper[4962]: E0220 10:49:53.114470 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f7ade93-bdc4-4c9e-92d8-dc2bc360308c" containerName="registry-server" Feb 20 10:49:53 crc kubenswrapper[4962]: I0220 10:49:53.114477 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f7ade93-bdc4-4c9e-92d8-dc2bc360308c" containerName="registry-server" Feb 20 10:49:53 crc kubenswrapper[4962]: E0220 10:49:53.114493 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f7ade93-bdc4-4c9e-92d8-dc2bc360308c" containerName="extract-utilities" Feb 20 10:49:53 crc kubenswrapper[4962]: I0220 10:49:53.114501 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f7ade93-bdc4-4c9e-92d8-dc2bc360308c" containerName="extract-utilities" Feb 20 10:49:53 crc kubenswrapper[4962]: I0220 10:49:53.114697 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f7ade93-bdc4-4c9e-92d8-dc2bc360308c" containerName="registry-server" Feb 20 10:49:53 crc kubenswrapper[4962]: I0220 10:49:53.115631 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hm46k" Feb 20 10:49:53 crc kubenswrapper[4962]: I0220 10:49:53.125933 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hm46k"] Feb 20 10:49:53 crc kubenswrapper[4962]: I0220 10:49:53.204865 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79f06592-8c43-4e8e-95aa-0d2ec94e7cfb-utilities\") pod \"community-operators-hm46k\" (UID: \"79f06592-8c43-4e8e-95aa-0d2ec94e7cfb\") " pod="openshift-marketplace/community-operators-hm46k" Feb 20 10:49:53 crc kubenswrapper[4962]: I0220 10:49:53.204939 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tg7zg\" (UniqueName: \"kubernetes.io/projected/79f06592-8c43-4e8e-95aa-0d2ec94e7cfb-kube-api-access-tg7zg\") pod \"community-operators-hm46k\" (UID: \"79f06592-8c43-4e8e-95aa-0d2ec94e7cfb\") " pod="openshift-marketplace/community-operators-hm46k" Feb 20 10:49:53 crc kubenswrapper[4962]: I0220 10:49:53.204961 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79f06592-8c43-4e8e-95aa-0d2ec94e7cfb-catalog-content\") pod \"community-operators-hm46k\" (UID: \"79f06592-8c43-4e8e-95aa-0d2ec94e7cfb\") " pod="openshift-marketplace/community-operators-hm46k" Feb 20 10:49:53 crc kubenswrapper[4962]: I0220 10:49:53.306447 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79f06592-8c43-4e8e-95aa-0d2ec94e7cfb-utilities\") pod \"community-operators-hm46k\" (UID: \"79f06592-8c43-4e8e-95aa-0d2ec94e7cfb\") " pod="openshift-marketplace/community-operators-hm46k" Feb 20 10:49:53 crc kubenswrapper[4962]: I0220 10:49:53.306541 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tg7zg\" (UniqueName: \"kubernetes.io/projected/79f06592-8c43-4e8e-95aa-0d2ec94e7cfb-kube-api-access-tg7zg\") pod \"community-operators-hm46k\" (UID: \"79f06592-8c43-4e8e-95aa-0d2ec94e7cfb\") " pod="openshift-marketplace/community-operators-hm46k" Feb 20 10:49:53 crc kubenswrapper[4962]: I0220 10:49:53.306579 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79f06592-8c43-4e8e-95aa-0d2ec94e7cfb-catalog-content\") pod \"community-operators-hm46k\" (UID: \"79f06592-8c43-4e8e-95aa-0d2ec94e7cfb\") " pod="openshift-marketplace/community-operators-hm46k" Feb 20 10:49:53 crc kubenswrapper[4962]: I0220 10:49:53.307085 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79f06592-8c43-4e8e-95aa-0d2ec94e7cfb-utilities\") pod \"community-operators-hm46k\" (UID: \"79f06592-8c43-4e8e-95aa-0d2ec94e7cfb\") " pod="openshift-marketplace/community-operators-hm46k" Feb 20 10:49:53 crc kubenswrapper[4962]: I0220 10:49:53.307102 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79f06592-8c43-4e8e-95aa-0d2ec94e7cfb-catalog-content\") pod \"community-operators-hm46k\" (UID: \"79f06592-8c43-4e8e-95aa-0d2ec94e7cfb\") " pod="openshift-marketplace/community-operators-hm46k" Feb 20 10:49:53 crc kubenswrapper[4962]: I0220 10:49:53.325890 4962 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-tg7zg\" (UniqueName: \"kubernetes.io/projected/79f06592-8c43-4e8e-95aa-0d2ec94e7cfb-kube-api-access-tg7zg\") pod \"community-operators-hm46k\" (UID: \"79f06592-8c43-4e8e-95aa-0d2ec94e7cfb\") " pod="openshift-marketplace/community-operators-hm46k" Feb 20 10:49:53 crc kubenswrapper[4962]: I0220 10:49:53.430062 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hm46k" Feb 20 10:49:53 crc kubenswrapper[4962]: W0220 10:49:53.956589 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79f06592_8c43_4e8e_95aa_0d2ec94e7cfb.slice/crio-5e67b557916f3a0911817b3479e3ba7772a2ae016088d45d9a2c8521458a5025 WatchSource:0}: Error finding container 5e67b557916f3a0911817b3479e3ba7772a2ae016088d45d9a2c8521458a5025: Status 404 returned error can't find the container with id 5e67b557916f3a0911817b3479e3ba7772a2ae016088d45d9a2c8521458a5025 Feb 20 10:49:53 crc kubenswrapper[4962]: I0220 10:49:53.958969 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hm46k"] Feb 20 10:49:54 crc kubenswrapper[4962]: I0220 10:49:54.886899 4962 generic.go:334] "Generic (PLEG): container finished" podID="79f06592-8c43-4e8e-95aa-0d2ec94e7cfb" containerID="d85489c0617a9bd5656c454bcdd7efb60267c76d6767595e90e52c56969158b9" exitCode=0 Feb 20 10:49:54 crc kubenswrapper[4962]: I0220 10:49:54.887179 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hm46k" event={"ID":"79f06592-8c43-4e8e-95aa-0d2ec94e7cfb","Type":"ContainerDied","Data":"d85489c0617a9bd5656c454bcdd7efb60267c76d6767595e90e52c56969158b9"} Feb 20 10:49:54 crc kubenswrapper[4962]: I0220 10:49:54.887339 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hm46k" event={"ID":"79f06592-8c43-4e8e-95aa-0d2ec94e7cfb","Type":"ContainerStarted","Data":"5e67b557916f3a0911817b3479e3ba7772a2ae016088d45d9a2c8521458a5025"} Feb 20 10:49:56 crc kubenswrapper[4962]: I0220 10:49:56.905748 4962 generic.go:334] "Generic (PLEG): container finished" podID="79f06592-8c43-4e8e-95aa-0d2ec94e7cfb" containerID="83b854b2ea7e5513230fc5015407fb8c16cb433fac095130d3238eecaaa37d37" exitCode=0 Feb 20 10:49:56 crc kubenswrapper[4962]: I0220 10:49:56.905838 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hm46k" event={"ID":"79f06592-8c43-4e8e-95aa-0d2ec94e7cfb","Type":"ContainerDied","Data":"83b854b2ea7e5513230fc5015407fb8c16cb433fac095130d3238eecaaa37d37"} Feb 20 10:49:57 crc kubenswrapper[4962]: I0220 10:49:57.919703 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hm46k" event={"ID":"79f06592-8c43-4e8e-95aa-0d2ec94e7cfb","Type":"ContainerStarted","Data":"1e4d156917d0104b23ff70ce00f9513df1daa03d92d332e83bb6e3a93f94b6f2"} Feb 20 10:49:57 crc kubenswrapper[4962]: I0220 10:49:57.955227 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hm46k" podStartSLOduration=2.558229145 podStartE2EDuration="4.9552001s" podCreationTimestamp="2026-02-20 10:49:53 +0000 UTC" firstStartedPulling="2026-02-20 10:49:54.891119485 +0000 UTC m=+3286.473591361" lastFinishedPulling="2026-02-20 10:49:57.28809044 +0000 UTC m=+3288.870562316" observedRunningTime="2026-02-20 10:49:57.945009857 +0000 UTC 
m=+3289.527481733" watchObservedRunningTime="2026-02-20 10:49:57.9552001 +0000 UTC m=+3289.537671986" Feb 20 10:50:03 crc kubenswrapper[4962]: I0220 10:50:03.430353 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hm46k" Feb 20 10:50:03 crc kubenswrapper[4962]: I0220 10:50:03.432043 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hm46k" Feb 20 10:50:03 crc kubenswrapper[4962]: I0220 10:50:03.509643 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hm46k" Feb 20 10:50:04 crc kubenswrapper[4962]: I0220 10:50:04.044121 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hm46k" Feb 20 10:50:04 crc kubenswrapper[4962]: I0220 10:50:04.108261 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hm46k"] Feb 20 10:50:05 crc kubenswrapper[4962]: I0220 10:50:05.990430 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hm46k" podUID="79f06592-8c43-4e8e-95aa-0d2ec94e7cfb" containerName="registry-server" containerID="cri-o://1e4d156917d0104b23ff70ce00f9513df1daa03d92d332e83bb6e3a93f94b6f2" gracePeriod=2 Feb 20 10:50:06 crc kubenswrapper[4962]: E0220 10:50:06.142684 4962 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79f06592_8c43_4e8e_95aa_0d2ec94e7cfb.slice/crio-1e4d156917d0104b23ff70ce00f9513df1daa03d92d332e83bb6e3a93f94b6f2.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79f06592_8c43_4e8e_95aa_0d2ec94e7cfb.slice/crio-conmon-1e4d156917d0104b23ff70ce00f9513df1daa03d92d332e83bb6e3a93f94b6f2.scope\": RecentStats: unable to find data in memory cache]" Feb 20 10:50:06 crc kubenswrapper[4962]: I0220 10:50:06.506567 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hm46k" Feb 20 10:50:06 crc kubenswrapper[4962]: I0220 10:50:06.629260 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79f06592-8c43-4e8e-95aa-0d2ec94e7cfb-catalog-content\") pod \"79f06592-8c43-4e8e-95aa-0d2ec94e7cfb\" (UID: \"79f06592-8c43-4e8e-95aa-0d2ec94e7cfb\") " Feb 20 10:50:06 crc kubenswrapper[4962]: I0220 10:50:06.629379 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79f06592-8c43-4e8e-95aa-0d2ec94e7cfb-utilities\") pod \"79f06592-8c43-4e8e-95aa-0d2ec94e7cfb\" (UID: \"79f06592-8c43-4e8e-95aa-0d2ec94e7cfb\") " Feb 20 10:50:06 crc kubenswrapper[4962]: I0220 10:50:06.629418 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tg7zg\" (UniqueName: \"kubernetes.io/projected/79f06592-8c43-4e8e-95aa-0d2ec94e7cfb-kube-api-access-tg7zg\") pod \"79f06592-8c43-4e8e-95aa-0d2ec94e7cfb\" (UID: \"79f06592-8c43-4e8e-95aa-0d2ec94e7cfb\") " Feb 20 10:50:06 crc kubenswrapper[4962]: I0220 10:50:06.630367 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79f06592-8c43-4e8e-95aa-0d2ec94e7cfb-utilities" (OuterVolumeSpecName: "utilities") pod "79f06592-8c43-4e8e-95aa-0d2ec94e7cfb" (UID: "79f06592-8c43-4e8e-95aa-0d2ec94e7cfb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:50:06 crc kubenswrapper[4962]: I0220 10:50:06.641730 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79f06592-8c43-4e8e-95aa-0d2ec94e7cfb-kube-api-access-tg7zg" (OuterVolumeSpecName: "kube-api-access-tg7zg") pod "79f06592-8c43-4e8e-95aa-0d2ec94e7cfb" (UID: "79f06592-8c43-4e8e-95aa-0d2ec94e7cfb"). InnerVolumeSpecName "kube-api-access-tg7zg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:50:06 crc kubenswrapper[4962]: I0220 10:50:06.730768 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79f06592-8c43-4e8e-95aa-0d2ec94e7cfb-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 10:50:06 crc kubenswrapper[4962]: I0220 10:50:06.731106 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tg7zg\" (UniqueName: \"kubernetes.io/projected/79f06592-8c43-4e8e-95aa-0d2ec94e7cfb-kube-api-access-tg7zg\") on node \"crc\" DevicePath \"\"" Feb 20 10:50:06 crc kubenswrapper[4962]: I0220 10:50:06.802067 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79f06592-8c43-4e8e-95aa-0d2ec94e7cfb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "79f06592-8c43-4e8e-95aa-0d2ec94e7cfb" (UID: "79f06592-8c43-4e8e-95aa-0d2ec94e7cfb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:50:06 crc kubenswrapper[4962]: I0220 10:50:06.833250 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79f06592-8c43-4e8e-95aa-0d2ec94e7cfb-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 10:50:07 crc kubenswrapper[4962]: I0220 10:50:07.003625 4962 generic.go:334] "Generic (PLEG): container finished" podID="79f06592-8c43-4e8e-95aa-0d2ec94e7cfb" containerID="1e4d156917d0104b23ff70ce00f9513df1daa03d92d332e83bb6e3a93f94b6f2" exitCode=0 Feb 20 10:50:07 crc kubenswrapper[4962]: I0220 10:50:07.003691 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hm46k" event={"ID":"79f06592-8c43-4e8e-95aa-0d2ec94e7cfb","Type":"ContainerDied","Data":"1e4d156917d0104b23ff70ce00f9513df1daa03d92d332e83bb6e3a93f94b6f2"} Feb 20 10:50:07 crc kubenswrapper[4962]: I0220 10:50:07.003776 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hm46k" event={"ID":"79f06592-8c43-4e8e-95aa-0d2ec94e7cfb","Type":"ContainerDied","Data":"5e67b557916f3a0911817b3479e3ba7772a2ae016088d45d9a2c8521458a5025"} Feb 20 10:50:07 crc kubenswrapper[4962]: I0220 10:50:07.003817 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hm46k" Feb 20 10:50:07 crc kubenswrapper[4962]: I0220 10:50:07.003827 4962 scope.go:117] "RemoveContainer" containerID="1e4d156917d0104b23ff70ce00f9513df1daa03d92d332e83bb6e3a93f94b6f2" Feb 20 10:50:07 crc kubenswrapper[4962]: I0220 10:50:07.036920 4962 scope.go:117] "RemoveContainer" containerID="83b854b2ea7e5513230fc5015407fb8c16cb433fac095130d3238eecaaa37d37" Feb 20 10:50:07 crc kubenswrapper[4962]: I0220 10:50:07.054085 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hm46k"] Feb 20 10:50:07 crc kubenswrapper[4962]: I0220 10:50:07.067455 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hm46k"] Feb 20 10:50:07 crc kubenswrapper[4962]: I0220 10:50:07.072961 4962 scope.go:117] "RemoveContainer" containerID="d85489c0617a9bd5656c454bcdd7efb60267c76d6767595e90e52c56969158b9" Feb 20 10:50:07 crc kubenswrapper[4962]: I0220 10:50:07.095236 4962 scope.go:117] "RemoveContainer" containerID="1e4d156917d0104b23ff70ce00f9513df1daa03d92d332e83bb6e3a93f94b6f2" Feb 20 10:50:07 crc kubenswrapper[4962]: E0220 10:50:07.095869 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e4d156917d0104b23ff70ce00f9513df1daa03d92d332e83bb6e3a93f94b6f2\": container with ID starting with 1e4d156917d0104b23ff70ce00f9513df1daa03d92d332e83bb6e3a93f94b6f2 not found: ID does not exist" containerID="1e4d156917d0104b23ff70ce00f9513df1daa03d92d332e83bb6e3a93f94b6f2" Feb 20 10:50:07 crc kubenswrapper[4962]: I0220 10:50:07.095909 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e4d156917d0104b23ff70ce00f9513df1daa03d92d332e83bb6e3a93f94b6f2"} err="failed to get container status \"1e4d156917d0104b23ff70ce00f9513df1daa03d92d332e83bb6e3a93f94b6f2\": rpc error: code = NotFound desc = could not find container \"1e4d156917d0104b23ff70ce00f9513df1daa03d92d332e83bb6e3a93f94b6f2\": container with ID starting with 1e4d156917d0104b23ff70ce00f9513df1daa03d92d332e83bb6e3a93f94b6f2 not found: ID does not exist" Feb 20 
10:50:07 crc kubenswrapper[4962]: I0220 10:50:07.095935 4962 scope.go:117] "RemoveContainer" containerID="83b854b2ea7e5513230fc5015407fb8c16cb433fac095130d3238eecaaa37d37" Feb 20 10:50:07 crc kubenswrapper[4962]: E0220 10:50:07.096303 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83b854b2ea7e5513230fc5015407fb8c16cb433fac095130d3238eecaaa37d37\": container with ID starting with 83b854b2ea7e5513230fc5015407fb8c16cb433fac095130d3238eecaaa37d37 not found: ID does not exist" containerID="83b854b2ea7e5513230fc5015407fb8c16cb433fac095130d3238eecaaa37d37" Feb 20 10:50:07 crc kubenswrapper[4962]: I0220 10:50:07.096338 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83b854b2ea7e5513230fc5015407fb8c16cb433fac095130d3238eecaaa37d37"} err="failed to get container status \"83b854b2ea7e5513230fc5015407fb8c16cb433fac095130d3238eecaaa37d37\": rpc error: code = NotFound desc = could not find container \"83b854b2ea7e5513230fc5015407fb8c16cb433fac095130d3238eecaaa37d37\": container with ID starting with 83b854b2ea7e5513230fc5015407fb8c16cb433fac095130d3238eecaaa37d37 not found: ID does not exist" Feb 20 10:50:07 crc kubenswrapper[4962]: I0220 10:50:07.096367 4962 scope.go:117] "RemoveContainer" containerID="d85489c0617a9bd5656c454bcdd7efb60267c76d6767595e90e52c56969158b9" Feb 20 10:50:07 crc kubenswrapper[4962]: E0220 10:50:07.096695 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d85489c0617a9bd5656c454bcdd7efb60267c76d6767595e90e52c56969158b9\": container with ID starting with d85489c0617a9bd5656c454bcdd7efb60267c76d6767595e90e52c56969158b9 not found: ID does not exist" containerID="d85489c0617a9bd5656c454bcdd7efb60267c76d6767595e90e52c56969158b9" Feb 20 10:50:07 crc kubenswrapper[4962]: I0220 10:50:07.096730 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d85489c0617a9bd5656c454bcdd7efb60267c76d6767595e90e52c56969158b9"} err="failed to get container status \"d85489c0617a9bd5656c454bcdd7efb60267c76d6767595e90e52c56969158b9\": rpc error: code = NotFound desc = could not find container \"d85489c0617a9bd5656c454bcdd7efb60267c76d6767595e90e52c56969158b9\": container with ID starting with d85489c0617a9bd5656c454bcdd7efb60267c76d6767595e90e52c56969158b9 not found: ID does not exist" Feb 20 10:50:07 crc kubenswrapper[4962]: I0220 10:50:07.156438 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79f06592-8c43-4e8e-95aa-0d2ec94e7cfb" path="/var/lib/kubelet/pods/79f06592-8c43-4e8e-95aa-0d2ec94e7cfb/volumes" Feb 20 10:50:41 crc kubenswrapper[4962]: I0220 10:50:41.508309 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 10:50:41 crc kubenswrapper[4962]: I0220 10:50:41.509072 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 10:51:11 crc kubenswrapper[4962]: I0220 10:51:11.508062 4962 patch_prober.go:28] 
interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 10:51:11 crc kubenswrapper[4962]: I0220 10:51:11.510322 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 10:51:41 crc kubenswrapper[4962]: I0220 10:51:41.508687 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 10:51:41 crc kubenswrapper[4962]: I0220 10:51:41.509297 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 10:51:41 crc kubenswrapper[4962]: I0220 10:51:41.509361 4962 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" Feb 20 10:51:41 crc kubenswrapper[4962]: I0220 10:51:41.510242 4962 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"96520786fcd3eebb4c00d3ca8d282a9034e292ff58aa8bd50b4ba54603f3d059"} pod="openshift-machine-config-operator/machine-config-daemon-m9d46" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 10:51:41 crc kubenswrapper[4962]: I0220 10:51:41.510338 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" containerID="cri-o://96520786fcd3eebb4c00d3ca8d282a9034e292ff58aa8bd50b4ba54603f3d059" gracePeriod=600 Feb 20 10:51:41 crc kubenswrapper[4962]: I0220 10:51:41.889792 4962 generic.go:334] "Generic (PLEG): container finished" podID="751d5e0b-919c-4777-8475-ed7214f7647f" containerID="96520786fcd3eebb4c00d3ca8d282a9034e292ff58aa8bd50b4ba54603f3d059" exitCode=0 Feb 20 10:51:41 crc kubenswrapper[4962]: I0220 10:51:41.889868 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" event={"ID":"751d5e0b-919c-4777-8475-ed7214f7647f","Type":"ContainerDied","Data":"96520786fcd3eebb4c00d3ca8d282a9034e292ff58aa8bd50b4ba54603f3d059"} Feb 20 10:51:41 crc kubenswrapper[4962]: I0220 10:51:41.890193 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" event={"ID":"751d5e0b-919c-4777-8475-ed7214f7647f","Type":"ContainerStarted","Data":"352c11f5a21be22d0a2a8db4a2977485f1b5ed88f19194a79937d4da6776e7b5"} Feb 20 10:51:41 crc kubenswrapper[4962]: I0220 10:51:41.890237 4962 scope.go:117] "RemoveContainer" containerID="43438c422a1202ff4cdb2cf954680ba8059efb5a5642f4ef4d5d6cf5618f44c6" Feb 20 
10:53:31 crc kubenswrapper[4962]: I0220 10:53:31.669777 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4twc6"] Feb 20 10:53:31 crc kubenswrapper[4962]: E0220 10:53:31.670852 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79f06592-8c43-4e8e-95aa-0d2ec94e7cfb" containerName="extract-utilities" Feb 20 10:53:31 crc kubenswrapper[4962]: I0220 10:53:31.670876 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="79f06592-8c43-4e8e-95aa-0d2ec94e7cfb" containerName="extract-utilities" Feb 20 10:53:31 crc kubenswrapper[4962]: E0220 10:53:31.670903 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79f06592-8c43-4e8e-95aa-0d2ec94e7cfb" containerName="registry-server" Feb 20 10:53:31 crc kubenswrapper[4962]: I0220 10:53:31.670916 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="79f06592-8c43-4e8e-95aa-0d2ec94e7cfb" containerName="registry-server" Feb 20 10:53:31 crc kubenswrapper[4962]: E0220 10:53:31.670968 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79f06592-8c43-4e8e-95aa-0d2ec94e7cfb" containerName="extract-content" Feb 20 10:53:31 crc kubenswrapper[4962]: I0220 10:53:31.670981 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="79f06592-8c43-4e8e-95aa-0d2ec94e7cfb" containerName="extract-content" Feb 20 10:53:31 crc kubenswrapper[4962]: I0220 10:53:31.671235 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="79f06592-8c43-4e8e-95aa-0d2ec94e7cfb" containerName="registry-server" Feb 20 10:53:31 crc kubenswrapper[4962]: I0220 10:53:31.673038 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4twc6" Feb 20 10:53:31 crc kubenswrapper[4962]: I0220 10:53:31.683582 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4twc6"] Feb 20 10:53:31 crc kubenswrapper[4962]: I0220 10:53:31.816745 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f-utilities\") pod \"certified-operators-4twc6\" (UID: \"c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f\") " pod="openshift-marketplace/certified-operators-4twc6" Feb 20 10:53:31 crc kubenswrapper[4962]: I0220 10:53:31.816804 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f-catalog-content\") pod \"certified-operators-4twc6\" (UID: \"c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f\") " pod="openshift-marketplace/certified-operators-4twc6" Feb 20 10:53:31 crc kubenswrapper[4962]: I0220 10:53:31.817156 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csc5j\" (UniqueName: \"kubernetes.io/projected/c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f-kube-api-access-csc5j\") pod \"certified-operators-4twc6\" (UID: \"c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f\") " pod="openshift-marketplace/certified-operators-4twc6" Feb 20 10:53:31 crc kubenswrapper[4962]: I0220 10:53:31.918920 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-csc5j\" (UniqueName: \"kubernetes.io/projected/c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f-kube-api-access-csc5j\") pod \"certified-operators-4twc6\" (UID: \"c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f\") " 
pod="openshift-marketplace/certified-operators-4twc6" Feb 20 10:53:31 crc kubenswrapper[4962]: I0220 10:53:31.919009 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f-utilities\") pod \"certified-operators-4twc6\" (UID: \"c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f\") " pod="openshift-marketplace/certified-operators-4twc6" Feb 20 10:53:31 crc kubenswrapper[4962]: I0220 10:53:31.919056 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f-catalog-content\") pod \"certified-operators-4twc6\" (UID: \"c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f\") " pod="openshift-marketplace/certified-operators-4twc6" Feb 20 10:53:31 crc kubenswrapper[4962]: I0220 10:53:31.920870 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f-catalog-content\") pod \"certified-operators-4twc6\" (UID: \"c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f\") " pod="openshift-marketplace/certified-operators-4twc6" Feb 20 10:53:31 crc kubenswrapper[4962]: I0220 10:53:31.920902 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f-utilities\") pod \"certified-operators-4twc6\" (UID: \"c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f\") " pod="openshift-marketplace/certified-operators-4twc6" Feb 20 10:53:31 crc kubenswrapper[4962]: I0220 10:53:31.951572 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-csc5j\" (UniqueName: \"kubernetes.io/projected/c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f-kube-api-access-csc5j\") pod \"certified-operators-4twc6\" (UID: \"c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f\") " pod="openshift-marketplace/certified-operators-4twc6" Feb 20 10:53:32 crc kubenswrapper[4962]: I0220 10:53:32.000002 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4twc6" Feb 20 10:53:32 crc kubenswrapper[4962]: I0220 10:53:32.507380 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4twc6"] Feb 20 10:53:32 crc kubenswrapper[4962]: I0220 10:53:32.951474 4962 generic.go:334] "Generic (PLEG): container finished" podID="c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f" containerID="b26136c1d324d87de0389eb7d3d8ff8d30cb8eb5c3269d58cb746e13dd7e2a40" exitCode=0 Feb 20 10:53:32 crc kubenswrapper[4962]: I0220 10:53:32.951524 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4twc6" event={"ID":"c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f","Type":"ContainerDied","Data":"b26136c1d324d87de0389eb7d3d8ff8d30cb8eb5c3269d58cb746e13dd7e2a40"} Feb 20 10:53:32 crc kubenswrapper[4962]: I0220 10:53:32.951552 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4twc6" event={"ID":"c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f","Type":"ContainerStarted","Data":"4abdc1336b017d4b427cff40cec7fac8ae13ddc593821c53d5d721bfac421ae3"} Feb 20 10:53:33 crc kubenswrapper[4962]: I0220 10:53:33.968733 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4twc6" event={"ID":"c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f","Type":"ContainerStarted","Data":"d83d28477f9f3589f78caa1fffc6f5ac5836bec8906866b883c4244c8d2ded87"} Feb 20 10:53:34 crc kubenswrapper[4962]: I0220 10:53:34.981238 4962 generic.go:334] "Generic (PLEG): container finished" podID="c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f" containerID="d83d28477f9f3589f78caa1fffc6f5ac5836bec8906866b883c4244c8d2ded87" exitCode=0 Feb 20 10:53:34 crc kubenswrapper[4962]: I0220 10:53:34.981305 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4twc6" event={"ID":"c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f","Type":"ContainerDied","Data":"d83d28477f9f3589f78caa1fffc6f5ac5836bec8906866b883c4244c8d2ded87"} Feb 20 10:53:35 crc kubenswrapper[4962]: I0220 10:53:35.989021 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4twc6" event={"ID":"c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f","Type":"ContainerStarted","Data":"0c979399bb383c0671518da1fadd571228eb863c79061bac7bad49c767a8eab4"} Feb 20 10:53:36 crc kubenswrapper[4962]: I0220 10:53:36.011496 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4twc6" podStartSLOduration=2.603251475 podStartE2EDuration="5.011471871s" podCreationTimestamp="2026-02-20 10:53:31 +0000 UTC" firstStartedPulling="2026-02-20 10:53:32.953103628 +0000 UTC m=+3504.535575474" lastFinishedPulling="2026-02-20 10:53:35.361323984 +0000 UTC m=+3506.943795870" observedRunningTime="2026-02-20 10:53:36.007464831 +0000 UTC m=+3507.589936677" watchObservedRunningTime="2026-02-20 10:53:36.011471871 +0000 UTC m=+3507.593943757" Feb 20 10:53:41 crc kubenswrapper[4962]: I0220 10:53:41.508807 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 10:53:41 crc kubenswrapper[4962]: I0220 10:53:41.509662 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" 
podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 10:53:42 crc kubenswrapper[4962]: I0220 10:53:42.000857 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4twc6" Feb 20 10:53:42 crc kubenswrapper[4962]: I0220 10:53:42.000918 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4twc6" Feb 20 10:53:42 crc kubenswrapper[4962]: I0220 10:53:42.079453 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4twc6" Feb 20 10:53:42 crc kubenswrapper[4962]: I0220 10:53:42.154695 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4twc6" Feb 20 10:53:42 crc kubenswrapper[4962]: I0220 10:53:42.331437 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4twc6"] Feb 20 10:53:44 crc kubenswrapper[4962]: I0220 10:53:44.053212 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-4twc6" podUID="c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f" containerName="registry-server" containerID="cri-o://0c979399bb383c0671518da1fadd571228eb863c79061bac7bad49c767a8eab4" gracePeriod=2 Feb 20 10:53:44 crc kubenswrapper[4962]: I0220 10:53:44.640637 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4twc6" Feb 20 10:53:44 crc kubenswrapper[4962]: I0220 10:53:44.749955 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-csc5j\" (UniqueName: \"kubernetes.io/projected/c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f-kube-api-access-csc5j\") pod \"c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f\" (UID: \"c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f\") " Feb 20 10:53:44 crc kubenswrapper[4962]: I0220 10:53:44.750094 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f-catalog-content\") pod \"c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f\" (UID: \"c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f\") " Feb 20 10:53:44 crc kubenswrapper[4962]: I0220 10:53:44.750136 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f-utilities\") pod \"c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f\" (UID: \"c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f\") " Feb 20 10:53:44 crc kubenswrapper[4962]: I0220 10:53:44.751178 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f-utilities" (OuterVolumeSpecName: "utilities") pod "c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f" (UID: "c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:53:44 crc kubenswrapper[4962]: I0220 10:53:44.758447 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f-kube-api-access-csc5j" (OuterVolumeSpecName: "kube-api-access-csc5j") pod "c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f" (UID: "c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f"). InnerVolumeSpecName "kube-api-access-csc5j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:53:44 crc kubenswrapper[4962]: I0220 10:53:44.805299 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f" (UID: "c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:53:44 crc kubenswrapper[4962]: I0220 10:53:44.850926 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-csc5j\" (UniqueName: \"kubernetes.io/projected/c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f-kube-api-access-csc5j\") on node \"crc\" DevicePath \"\"" Feb 20 10:53:44 crc kubenswrapper[4962]: I0220 10:53:44.850957 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 10:53:44 crc kubenswrapper[4962]: I0220 10:53:44.850967 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 10:53:45 crc kubenswrapper[4962]: I0220 10:53:45.065511 4962 generic.go:334] "Generic (PLEG): container finished" podID="c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f" containerID="0c979399bb383c0671518da1fadd571228eb863c79061bac7bad49c767a8eab4" exitCode=0 Feb 20 10:53:45 crc kubenswrapper[4962]: I0220 10:53:45.066026 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4twc6" event={"ID":"c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f","Type":"ContainerDied","Data":"0c979399bb383c0671518da1fadd571228eb863c79061bac7bad49c767a8eab4"} Feb 20 10:53:45 crc kubenswrapper[4962]: I0220 10:53:45.066067 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4twc6" event={"ID":"c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f","Type":"ContainerDied","Data":"4abdc1336b017d4b427cff40cec7fac8ae13ddc593821c53d5d721bfac421ae3"} Feb 20 10:53:45 crc kubenswrapper[4962]: I0220 10:53:45.066097 4962 scope.go:117] "RemoveContainer" containerID="0c979399bb383c0671518da1fadd571228eb863c79061bac7bad49c767a8eab4" Feb 20 10:53:45 crc kubenswrapper[4962]: I0220 10:53:45.066261 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4twc6" Feb 20 10:53:45 crc kubenswrapper[4962]: I0220 10:53:45.105066 4962 scope.go:117] "RemoveContainer" containerID="d83d28477f9f3589f78caa1fffc6f5ac5836bec8906866b883c4244c8d2ded87" Feb 20 10:53:45 crc kubenswrapper[4962]: I0220 10:53:45.127170 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4twc6"] Feb 20 10:53:45 crc kubenswrapper[4962]: I0220 10:53:45.154175 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-4twc6"] Feb 20 10:53:45 crc kubenswrapper[4962]: I0220 10:53:45.156335 4962 scope.go:117] "RemoveContainer" containerID="b26136c1d324d87de0389eb7d3d8ff8d30cb8eb5c3269d58cb746e13dd7e2a40" Feb 20 10:53:45 crc kubenswrapper[4962]: I0220 10:53:45.187094 4962 scope.go:117] "RemoveContainer" containerID="0c979399bb383c0671518da1fadd571228eb863c79061bac7bad49c767a8eab4" Feb 20 10:53:45 crc kubenswrapper[4962]: E0220 10:53:45.188082 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c979399bb383c0671518da1fadd571228eb863c79061bac7bad49c767a8eab4\": container with ID starting with 0c979399bb383c0671518da1fadd571228eb863c79061bac7bad49c767a8eab4 not found: ID does not exist" containerID="0c979399bb383c0671518da1fadd571228eb863c79061bac7bad49c767a8eab4" Feb 20 10:53:45 crc kubenswrapper[4962]: I0220 10:53:45.188132 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c979399bb383c0671518da1fadd571228eb863c79061bac7bad49c767a8eab4"} err="failed to get container status \"0c979399bb383c0671518da1fadd571228eb863c79061bac7bad49c767a8eab4\": rpc error: code = NotFound desc = could not find container \"0c979399bb383c0671518da1fadd571228eb863c79061bac7bad49c767a8eab4\": container with ID starting with 0c979399bb383c0671518da1fadd571228eb863c79061bac7bad49c767a8eab4 not found: ID does not exist" Feb 20 10:53:45 crc kubenswrapper[4962]: I0220 10:53:45.188161 4962 scope.go:117] "RemoveContainer" containerID="d83d28477f9f3589f78caa1fffc6f5ac5836bec8906866b883c4244c8d2ded87" Feb 20 10:53:45 crc kubenswrapper[4962]: E0220 10:53:45.188732 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d83d28477f9f3589f78caa1fffc6f5ac5836bec8906866b883c4244c8d2ded87\": container with ID starting with d83d28477f9f3589f78caa1fffc6f5ac5836bec8906866b883c4244c8d2ded87 not found: ID does not exist" containerID="d83d28477f9f3589f78caa1fffc6f5ac5836bec8906866b883c4244c8d2ded87" Feb 20 10:53:45 crc kubenswrapper[4962]: I0220 10:53:45.188761 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d83d28477f9f3589f78caa1fffc6f5ac5836bec8906866b883c4244c8d2ded87"} err="failed to get container status \"d83d28477f9f3589f78caa1fffc6f5ac5836bec8906866b883c4244c8d2ded87\": rpc error: code = NotFound desc = could not find container \"d83d28477f9f3589f78caa1fffc6f5ac5836bec8906866b883c4244c8d2ded87\": container with ID starting with d83d28477f9f3589f78caa1fffc6f5ac5836bec8906866b883c4244c8d2ded87 not found: ID does not exist" Feb 20 10:53:45 crc kubenswrapper[4962]: I0220 10:53:45.188773 4962 scope.go:117] "RemoveContainer" containerID="b26136c1d324d87de0389eb7d3d8ff8d30cb8eb5c3269d58cb746e13dd7e2a40" Feb 20 10:53:45 crc kubenswrapper[4962]: E0220 10:53:45.189087 4962 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"b26136c1d324d87de0389eb7d3d8ff8d30cb8eb5c3269d58cb746e13dd7e2a40\": container with ID starting with b26136c1d324d87de0389eb7d3d8ff8d30cb8eb5c3269d58cb746e13dd7e2a40 not found: ID does not exist" containerID="b26136c1d324d87de0389eb7d3d8ff8d30cb8eb5c3269d58cb746e13dd7e2a40" Feb 20 10:53:45 crc kubenswrapper[4962]: I0220 10:53:45.189139 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b26136c1d324d87de0389eb7d3d8ff8d30cb8eb5c3269d58cb746e13dd7e2a40"} err="failed to get container status \"b26136c1d324d87de0389eb7d3d8ff8d30cb8eb5c3269d58cb746e13dd7e2a40\": rpc error: code = NotFound desc = could not find container \"b26136c1d324d87de0389eb7d3d8ff8d30cb8eb5c3269d58cb746e13dd7e2a40\": container with ID starting with b26136c1d324d87de0389eb7d3d8ff8d30cb8eb5c3269d58cb746e13dd7e2a40 not found: ID does not exist" Feb 20 10:53:47 crc kubenswrapper[4962]: I0220 10:53:47.155170 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f" path="/var/lib/kubelet/pods/c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f/volumes" Feb 20 10:54:11 crc kubenswrapper[4962]: I0220 10:54:11.508517 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 10:54:11 crc kubenswrapper[4962]: I0220 10:54:11.509207 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 10:54:41 crc kubenswrapper[4962]: I0220 10:54:41.508721 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 10:54:41 crc kubenswrapper[4962]: I0220 10:54:41.511563 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 10:54:41 crc kubenswrapper[4962]: I0220 10:54:41.512075 4962 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" Feb 20 10:54:41 crc kubenswrapper[4962]: I0220 10:54:41.513055 4962 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"352c11f5a21be22d0a2a8db4a2977485f1b5ed88f19194a79937d4da6776e7b5"} pod="openshift-machine-config-operator/machine-config-daemon-m9d46" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 10:54:41 crc kubenswrapper[4962]: I0220 10:54:41.513314 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" 
podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" containerID="cri-o://352c11f5a21be22d0a2a8db4a2977485f1b5ed88f19194a79937d4da6776e7b5" gracePeriod=600 Feb 20 10:54:41 crc kubenswrapper[4962]: E0220 10:54:41.649379 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:54:42 crc kubenswrapper[4962]: I0220 10:54:42.594959 4962 generic.go:334] "Generic (PLEG): container finished" podID="751d5e0b-919c-4777-8475-ed7214f7647f" containerID="352c11f5a21be22d0a2a8db4a2977485f1b5ed88f19194a79937d4da6776e7b5" exitCode=0 Feb 20 10:54:42 crc kubenswrapper[4962]: I0220 10:54:42.595042 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" event={"ID":"751d5e0b-919c-4777-8475-ed7214f7647f","Type":"ContainerDied","Data":"352c11f5a21be22d0a2a8db4a2977485f1b5ed88f19194a79937d4da6776e7b5"} Feb 20 10:54:42 crc kubenswrapper[4962]: I0220 10:54:42.595100 4962 scope.go:117] "RemoveContainer" containerID="96520786fcd3eebb4c00d3ca8d282a9034e292ff58aa8bd50b4ba54603f3d059" Feb 20 10:54:42 crc kubenswrapper[4962]: I0220 10:54:42.595797 4962 scope.go:117] "RemoveContainer" containerID="352c11f5a21be22d0a2a8db4a2977485f1b5ed88f19194a79937d4da6776e7b5" Feb 20 10:54:42 crc kubenswrapper[4962]: E0220 10:54:42.596389 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:54:54 crc kubenswrapper[4962]: I0220 10:54:54.139031 4962 scope.go:117] "RemoveContainer" containerID="352c11f5a21be22d0a2a8db4a2977485f1b5ed88f19194a79937d4da6776e7b5" Feb 20 10:54:54 crc kubenswrapper[4962]: E0220 10:54:54.140238 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:55:07 crc kubenswrapper[4962]: I0220 10:55:07.139527 4962 scope.go:117] "RemoveContainer" containerID="352c11f5a21be22d0a2a8db4a2977485f1b5ed88f19194a79937d4da6776e7b5" Feb 20 10:55:07 crc kubenswrapper[4962]: E0220 10:55:07.140575 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:55:19 crc kubenswrapper[4962]: I0220 10:55:19.167220 4962 scope.go:117] 
"RemoveContainer" containerID="352c11f5a21be22d0a2a8db4a2977485f1b5ed88f19194a79937d4da6776e7b5" Feb 20 10:55:19 crc kubenswrapper[4962]: E0220 10:55:19.168446 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:55:30 crc kubenswrapper[4962]: I0220 10:55:30.139268 4962 scope.go:117] "RemoveContainer" containerID="352c11f5a21be22d0a2a8db4a2977485f1b5ed88f19194a79937d4da6776e7b5" Feb 20 10:55:30 crc kubenswrapper[4962]: E0220 10:55:30.140307 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:55:41 crc kubenswrapper[4962]: I0220 10:55:41.139315 4962 scope.go:117] "RemoveContainer" containerID="352c11f5a21be22d0a2a8db4a2977485f1b5ed88f19194a79937d4da6776e7b5" Feb 20 10:55:41 crc kubenswrapper[4962]: E0220 10:55:41.140322 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:55:56 crc kubenswrapper[4962]: I0220 10:55:56.139862 4962 scope.go:117] "RemoveContainer" containerID="352c11f5a21be22d0a2a8db4a2977485f1b5ed88f19194a79937d4da6776e7b5" Feb 20 10:55:56 crc kubenswrapper[4962]: E0220 10:55:56.141042 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:56:09 crc kubenswrapper[4962]: I0220 10:56:09.146383 4962 scope.go:117] "RemoveContainer" containerID="352c11f5a21be22d0a2a8db4a2977485f1b5ed88f19194a79937d4da6776e7b5" Feb 20 10:56:09 crc kubenswrapper[4962]: E0220 10:56:09.147354 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:56:22 crc kubenswrapper[4962]: I0220 10:56:22.139423 4962 scope.go:117] "RemoveContainer" containerID="352c11f5a21be22d0a2a8db4a2977485f1b5ed88f19194a79937d4da6776e7b5" Feb 20 10:56:22 crc kubenswrapper[4962]: E0220 10:56:22.140708 4962 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:56:34 crc kubenswrapper[4962]: I0220 10:56:34.138864 4962 scope.go:117] "RemoveContainer" containerID="352c11f5a21be22d0a2a8db4a2977485f1b5ed88f19194a79937d4da6776e7b5" Feb 20 10:56:34 crc kubenswrapper[4962]: E0220 10:56:34.139687 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:56:47 crc kubenswrapper[4962]: I0220 10:56:47.139007 4962 scope.go:117] "RemoveContainer" containerID="352c11f5a21be22d0a2a8db4a2977485f1b5ed88f19194a79937d4da6776e7b5" Feb 20 10:56:47 crc kubenswrapper[4962]: E0220 10:56:47.140098 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:56:58 crc kubenswrapper[4962]: I0220 10:56:58.138740 4962 scope.go:117] "RemoveContainer" containerID="352c11f5a21be22d0a2a8db4a2977485f1b5ed88f19194a79937d4da6776e7b5" Feb 20 10:56:58 crc kubenswrapper[4962]: E0220 10:56:58.139731 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:57:12 crc kubenswrapper[4962]: I0220 10:57:12.139187 4962 scope.go:117] "RemoveContainer" containerID="352c11f5a21be22d0a2a8db4a2977485f1b5ed88f19194a79937d4da6776e7b5" Feb 20 10:57:12 crc kubenswrapper[4962]: E0220 10:57:12.140326 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:57:23 crc kubenswrapper[4962]: I0220 10:57:23.655018 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-f5f7g"] Feb 20 10:57:23 crc kubenswrapper[4962]: E0220 10:57:23.655984 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f" containerName="extract-content" Feb 20 10:57:23 crc kubenswrapper[4962]: I0220 10:57:23.656006 4962 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f" containerName="extract-content" Feb 20 10:57:23 crc kubenswrapper[4962]: E0220 10:57:23.656039 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f" containerName="extract-utilities" Feb 20 10:57:23 crc kubenswrapper[4962]: I0220 10:57:23.656051 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f" containerName="extract-utilities" Feb 20 10:57:23 crc kubenswrapper[4962]: E0220 10:57:23.656086 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f" containerName="registry-server" Feb 20 10:57:23 crc kubenswrapper[4962]: I0220 10:57:23.656099 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f" containerName="registry-server" Feb 20 10:57:23 crc kubenswrapper[4962]: I0220 10:57:23.656334 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="c05203a7-d6ef-46bc-8180-e7a4bf9cdd9f" containerName="registry-server" Feb 20 10:57:23 crc kubenswrapper[4962]: I0220 10:57:23.657983 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f5f7g" Feb 20 10:57:23 crc kubenswrapper[4962]: I0220 10:57:23.673136 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-f5f7g"] Feb 20 10:57:23 crc kubenswrapper[4962]: I0220 10:57:23.804336 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfdfb336-dce1-418b-9c2a-81c68335f4bf-catalog-content\") pod \"redhat-marketplace-f5f7g\" (UID: \"bfdfb336-dce1-418b-9c2a-81c68335f4bf\") " pod="openshift-marketplace/redhat-marketplace-f5f7g" Feb 20 10:57:23 crc kubenswrapper[4962]: I0220 10:57:23.804449 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfdfb336-dce1-418b-9c2a-81c68335f4bf-utilities\") pod \"redhat-marketplace-f5f7g\" (UID: \"bfdfb336-dce1-418b-9c2a-81c68335f4bf\") " pod="openshift-marketplace/redhat-marketplace-f5f7g" Feb 20 10:57:23 crc kubenswrapper[4962]: I0220 10:57:23.804484 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vf2f8\" (UniqueName: \"kubernetes.io/projected/bfdfb336-dce1-418b-9c2a-81c68335f4bf-kube-api-access-vf2f8\") pod \"redhat-marketplace-f5f7g\" (UID: \"bfdfb336-dce1-418b-9c2a-81c68335f4bf\") " pod="openshift-marketplace/redhat-marketplace-f5f7g" Feb 20 10:57:23 crc kubenswrapper[4962]: I0220 10:57:23.906261 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfdfb336-dce1-418b-9c2a-81c68335f4bf-catalog-content\") pod \"redhat-marketplace-f5f7g\" (UID: \"bfdfb336-dce1-418b-9c2a-81c68335f4bf\") " pod="openshift-marketplace/redhat-marketplace-f5f7g" Feb 20 10:57:23 crc kubenswrapper[4962]: I0220 10:57:23.906372 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfdfb336-dce1-418b-9c2a-81c68335f4bf-utilities\") pod \"redhat-marketplace-f5f7g\" (UID: \"bfdfb336-dce1-418b-9c2a-81c68335f4bf\") " pod="openshift-marketplace/redhat-marketplace-f5f7g" Feb 20 10:57:23 crc kubenswrapper[4962]: I0220 10:57:23.906415 4962 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vf2f8\" (UniqueName: \"kubernetes.io/projected/bfdfb336-dce1-418b-9c2a-81c68335f4bf-kube-api-access-vf2f8\") pod \"redhat-marketplace-f5f7g\" (UID: \"bfdfb336-dce1-418b-9c2a-81c68335f4bf\") " pod="openshift-marketplace/redhat-marketplace-f5f7g" Feb 20 10:57:23 crc kubenswrapper[4962]: I0220 10:57:23.906960 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfdfb336-dce1-418b-9c2a-81c68335f4bf-catalog-content\") pod \"redhat-marketplace-f5f7g\" (UID: \"bfdfb336-dce1-418b-9c2a-81c68335f4bf\") " pod="openshift-marketplace/redhat-marketplace-f5f7g" Feb 20 10:57:23 crc kubenswrapper[4962]: I0220 10:57:23.906980 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfdfb336-dce1-418b-9c2a-81c68335f4bf-utilities\") pod \"redhat-marketplace-f5f7g\" (UID: \"bfdfb336-dce1-418b-9c2a-81c68335f4bf\") " pod="openshift-marketplace/redhat-marketplace-f5f7g" Feb 20 10:57:23 crc kubenswrapper[4962]: I0220 10:57:23.941575 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vf2f8\" (UniqueName: \"kubernetes.io/projected/bfdfb336-dce1-418b-9c2a-81c68335f4bf-kube-api-access-vf2f8\") pod \"redhat-marketplace-f5f7g\" (UID: \"bfdfb336-dce1-418b-9c2a-81c68335f4bf\") " pod="openshift-marketplace/redhat-marketplace-f5f7g" Feb 20 10:57:24 crc kubenswrapper[4962]: I0220 10:57:24.002210 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f5f7g" Feb 20 10:57:24 crc kubenswrapper[4962]: I0220 10:57:24.501775 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-f5f7g"] Feb 20 10:57:25 crc kubenswrapper[4962]: I0220 10:57:25.074183 4962 generic.go:334] "Generic (PLEG): container finished" podID="bfdfb336-dce1-418b-9c2a-81c68335f4bf" containerID="69fc6397faaa57d376e1d51974c60dbff502ff05c3fb18985dbd7df3f1aa84d8" exitCode=0 Feb 20 10:57:25 crc kubenswrapper[4962]: I0220 10:57:25.074297 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f5f7g" event={"ID":"bfdfb336-dce1-418b-9c2a-81c68335f4bf","Type":"ContainerDied","Data":"69fc6397faaa57d376e1d51974c60dbff502ff05c3fb18985dbd7df3f1aa84d8"} Feb 20 10:57:25 crc kubenswrapper[4962]: I0220 10:57:25.074786 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f5f7g" event={"ID":"bfdfb336-dce1-418b-9c2a-81c68335f4bf","Type":"ContainerStarted","Data":"03fd792e4282a240ffeedaca88c255b25b9b927c2ad0f536473a6f08061f5e6e"} Feb 20 10:57:25 crc kubenswrapper[4962]: I0220 10:57:25.077055 4962 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 20 10:57:25 crc kubenswrapper[4962]: I0220 10:57:25.144167 4962 scope.go:117] "RemoveContainer" containerID="352c11f5a21be22d0a2a8db4a2977485f1b5ed88f19194a79937d4da6776e7b5" Feb 20 10:57:25 crc kubenswrapper[4962]: E0220 10:57:25.144560 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" 
podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:57:26 crc kubenswrapper[4962]: I0220 10:57:26.088926 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f5f7g" event={"ID":"bfdfb336-dce1-418b-9c2a-81c68335f4bf","Type":"ContainerStarted","Data":"46d61f46d87052353754b99520c9bd775c7f7ef31112390efef902af421b174b"} Feb 20 10:57:27 crc kubenswrapper[4962]: I0220 10:57:27.102476 4962 generic.go:334] "Generic (PLEG): container finished" podID="bfdfb336-dce1-418b-9c2a-81c68335f4bf" containerID="46d61f46d87052353754b99520c9bd775c7f7ef31112390efef902af421b174b" exitCode=0 Feb 20 10:57:27 crc kubenswrapper[4962]: I0220 10:57:27.102554 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f5f7g" event={"ID":"bfdfb336-dce1-418b-9c2a-81c68335f4bf","Type":"ContainerDied","Data":"46d61f46d87052353754b99520c9bd775c7f7ef31112390efef902af421b174b"} Feb 20 10:57:28 crc kubenswrapper[4962]: I0220 10:57:28.115979 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f5f7g" event={"ID":"bfdfb336-dce1-418b-9c2a-81c68335f4bf","Type":"ContainerStarted","Data":"a842cca69dc66b5bdfb36f6718aa187fd5a76477e744c4ce7bbeee359092794a"} Feb 20 10:57:28 crc kubenswrapper[4962]: I0220 10:57:28.147167 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-f5f7g" podStartSLOduration=2.714216917 podStartE2EDuration="5.147141359s" podCreationTimestamp="2026-02-20 10:57:23 +0000 UTC" firstStartedPulling="2026-02-20 10:57:25.076391545 +0000 UTC m=+3736.658863431" lastFinishedPulling="2026-02-20 10:57:27.509315987 +0000 UTC m=+3739.091787873" observedRunningTime="2026-02-20 10:57:28.143408864 +0000 UTC m=+3739.725880740" watchObservedRunningTime="2026-02-20 10:57:28.147141359 +0000 UTC m=+3739.729613235" Feb 20 10:57:34 crc kubenswrapper[4962]: I0220 10:57:34.002729 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-f5f7g" Feb 20 10:57:34 crc kubenswrapper[4962]: I0220 10:57:34.003451 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-f5f7g" Feb 20 10:57:34 crc kubenswrapper[4962]: I0220 10:57:34.083677 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-f5f7g" Feb 20 10:57:34 crc kubenswrapper[4962]: I0220 10:57:34.217626 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-f5f7g" Feb 20 10:57:37 crc kubenswrapper[4962]: I0220 10:57:37.629798 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-f5f7g"] Feb 20 10:57:37 crc kubenswrapper[4962]: I0220 10:57:37.630433 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-f5f7g" podUID="bfdfb336-dce1-418b-9c2a-81c68335f4bf" containerName="registry-server" containerID="cri-o://a842cca69dc66b5bdfb36f6718aa187fd5a76477e744c4ce7bbeee359092794a" gracePeriod=2 Feb 20 10:57:38 crc kubenswrapper[4962]: I0220 10:57:38.029876 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f5f7g" Feb 20 10:57:38 crc kubenswrapper[4962]: I0220 10:57:38.135337 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vf2f8\" (UniqueName: \"kubernetes.io/projected/bfdfb336-dce1-418b-9c2a-81c68335f4bf-kube-api-access-vf2f8\") pod \"bfdfb336-dce1-418b-9c2a-81c68335f4bf\" (UID: \"bfdfb336-dce1-418b-9c2a-81c68335f4bf\") " Feb 20 10:57:38 crc kubenswrapper[4962]: I0220 10:57:38.135480 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfdfb336-dce1-418b-9c2a-81c68335f4bf-catalog-content\") pod \"bfdfb336-dce1-418b-9c2a-81c68335f4bf\" (UID: \"bfdfb336-dce1-418b-9c2a-81c68335f4bf\") " Feb 20 10:57:38 crc kubenswrapper[4962]: I0220 10:57:38.135517 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfdfb336-dce1-418b-9c2a-81c68335f4bf-utilities\") pod \"bfdfb336-dce1-418b-9c2a-81c68335f4bf\" (UID: \"bfdfb336-dce1-418b-9c2a-81c68335f4bf\") " Feb 20 10:57:38 crc kubenswrapper[4962]: I0220 10:57:38.136414 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bfdfb336-dce1-418b-9c2a-81c68335f4bf-utilities" (OuterVolumeSpecName: "utilities") pod "bfdfb336-dce1-418b-9c2a-81c68335f4bf" (UID: "bfdfb336-dce1-418b-9c2a-81c68335f4bf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:57:38 crc kubenswrapper[4962]: I0220 10:57:38.138915 4962 scope.go:117] "RemoveContainer" containerID="352c11f5a21be22d0a2a8db4a2977485f1b5ed88f19194a79937d4da6776e7b5" Feb 20 10:57:38 crc kubenswrapper[4962]: E0220 10:57:38.139173 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:57:38 crc kubenswrapper[4962]: I0220 10:57:38.141921 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfdfb336-dce1-418b-9c2a-81c68335f4bf-kube-api-access-vf2f8" (OuterVolumeSpecName: "kube-api-access-vf2f8") pod "bfdfb336-dce1-418b-9c2a-81c68335f4bf" (UID: "bfdfb336-dce1-418b-9c2a-81c68335f4bf"). InnerVolumeSpecName "kube-api-access-vf2f8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:57:38 crc kubenswrapper[4962]: I0220 10:57:38.165095 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bfdfb336-dce1-418b-9c2a-81c68335f4bf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bfdfb336-dce1-418b-9c2a-81c68335f4bf" (UID: "bfdfb336-dce1-418b-9c2a-81c68335f4bf"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:57:38 crc kubenswrapper[4962]: I0220 10:57:38.189058 4962 generic.go:334] "Generic (PLEG): container finished" podID="bfdfb336-dce1-418b-9c2a-81c68335f4bf" containerID="a842cca69dc66b5bdfb36f6718aa187fd5a76477e744c4ce7bbeee359092794a" exitCode=0 Feb 20 10:57:38 crc kubenswrapper[4962]: I0220 10:57:38.189118 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f5f7g" event={"ID":"bfdfb336-dce1-418b-9c2a-81c68335f4bf","Type":"ContainerDied","Data":"a842cca69dc66b5bdfb36f6718aa187fd5a76477e744c4ce7bbeee359092794a"} Feb 20 10:57:38 crc kubenswrapper[4962]: I0220 10:57:38.189158 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f5f7g" event={"ID":"bfdfb336-dce1-418b-9c2a-81c68335f4bf","Type":"ContainerDied","Data":"03fd792e4282a240ffeedaca88c255b25b9b927c2ad0f536473a6f08061f5e6e"} Feb 20 10:57:38 crc kubenswrapper[4962]: I0220 10:57:38.189184 4962 scope.go:117] "RemoveContainer" containerID="a842cca69dc66b5bdfb36f6718aa187fd5a76477e744c4ce7bbeee359092794a" Feb 20 10:57:38 crc kubenswrapper[4962]: I0220 10:57:38.189363 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f5f7g" Feb 20 10:57:38 crc kubenswrapper[4962]: I0220 10:57:38.218383 4962 scope.go:117] "RemoveContainer" containerID="46d61f46d87052353754b99520c9bd775c7f7ef31112390efef902af421b174b" Feb 20 10:57:38 crc kubenswrapper[4962]: I0220 10:57:38.235859 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-f5f7g"] Feb 20 10:57:38 crc kubenswrapper[4962]: I0220 10:57:38.237571 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vf2f8\" (UniqueName: \"kubernetes.io/projected/bfdfb336-dce1-418b-9c2a-81c68335f4bf-kube-api-access-vf2f8\") on node \"crc\" DevicePath \"\"" Feb 20 10:57:38 crc kubenswrapper[4962]: I0220 10:57:38.237640 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfdfb336-dce1-418b-9c2a-81c68335f4bf-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 10:57:38 crc kubenswrapper[4962]: I0220 10:57:38.237658 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfdfb336-dce1-418b-9c2a-81c68335f4bf-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 10:57:38 crc kubenswrapper[4962]: I0220 10:57:38.244731 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-f5f7g"] Feb 20 10:57:38 crc kubenswrapper[4962]: I0220 10:57:38.272977 4962 scope.go:117] "RemoveContainer" containerID="69fc6397faaa57d376e1d51974c60dbff502ff05c3fb18985dbd7df3f1aa84d8" Feb 20 10:57:38 crc kubenswrapper[4962]: I0220 10:57:38.308607 4962 scope.go:117] "RemoveContainer" containerID="a842cca69dc66b5bdfb36f6718aa187fd5a76477e744c4ce7bbeee359092794a" Feb 20 10:57:38 crc kubenswrapper[4962]: E0220 10:57:38.309339 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a842cca69dc66b5bdfb36f6718aa187fd5a76477e744c4ce7bbeee359092794a\": container with ID starting with a842cca69dc66b5bdfb36f6718aa187fd5a76477e744c4ce7bbeee359092794a not found: ID does not exist" containerID="a842cca69dc66b5bdfb36f6718aa187fd5a76477e744c4ce7bbeee359092794a" Feb 20 10:57:38 crc kubenswrapper[4962]: I0220 10:57:38.309387 4962 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a842cca69dc66b5bdfb36f6718aa187fd5a76477e744c4ce7bbeee359092794a"} err="failed to get container status \"a842cca69dc66b5bdfb36f6718aa187fd5a76477e744c4ce7bbeee359092794a\": rpc error: code = NotFound desc = could not find container \"a842cca69dc66b5bdfb36f6718aa187fd5a76477e744c4ce7bbeee359092794a\": container with ID starting with a842cca69dc66b5bdfb36f6718aa187fd5a76477e744c4ce7bbeee359092794a not found: ID does not exist" Feb 20 10:57:38 crc kubenswrapper[4962]: I0220 10:57:38.309420 4962 scope.go:117] "RemoveContainer" containerID="46d61f46d87052353754b99520c9bd775c7f7ef31112390efef902af421b174b" Feb 20 10:57:38 crc kubenswrapper[4962]: E0220 10:57:38.310448 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46d61f46d87052353754b99520c9bd775c7f7ef31112390efef902af421b174b\": container with ID starting with 46d61f46d87052353754b99520c9bd775c7f7ef31112390efef902af421b174b not found: ID does not exist" containerID="46d61f46d87052353754b99520c9bd775c7f7ef31112390efef902af421b174b" Feb 20 10:57:38 crc kubenswrapper[4962]: I0220 10:57:38.310511 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46d61f46d87052353754b99520c9bd775c7f7ef31112390efef902af421b174b"} err="failed to get container status \"46d61f46d87052353754b99520c9bd775c7f7ef31112390efef902af421b174b\": rpc error: code = NotFound desc = could not find container \"46d61f46d87052353754b99520c9bd775c7f7ef31112390efef902af421b174b\": container with ID starting with 46d61f46d87052353754b99520c9bd775c7f7ef31112390efef902af421b174b not found: ID does not exist" Feb 20 10:57:38 crc kubenswrapper[4962]: I0220 10:57:38.310545 4962 scope.go:117] "RemoveContainer" containerID="69fc6397faaa57d376e1d51974c60dbff502ff05c3fb18985dbd7df3f1aa84d8" Feb 20 10:57:38 crc kubenswrapper[4962]: E0220 10:57:38.311959 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69fc6397faaa57d376e1d51974c60dbff502ff05c3fb18985dbd7df3f1aa84d8\": container with ID starting with 69fc6397faaa57d376e1d51974c60dbff502ff05c3fb18985dbd7df3f1aa84d8 not found: ID does not exist" containerID="69fc6397faaa57d376e1d51974c60dbff502ff05c3fb18985dbd7df3f1aa84d8" Feb 20 10:57:38 crc kubenswrapper[4962]: I0220 10:57:38.312100 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69fc6397faaa57d376e1d51974c60dbff502ff05c3fb18985dbd7df3f1aa84d8"} err="failed to get container status \"69fc6397faaa57d376e1d51974c60dbff502ff05c3fb18985dbd7df3f1aa84d8\": rpc error: code = NotFound desc = could not find container \"69fc6397faaa57d376e1d51974c60dbff502ff05c3fb18985dbd7df3f1aa84d8\": container with ID starting with 69fc6397faaa57d376e1d51974c60dbff502ff05c3fb18985dbd7df3f1aa84d8 not found: ID does not exist" Feb 20 10:57:39 crc kubenswrapper[4962]: I0220 10:57:39.155282 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfdfb336-dce1-418b-9c2a-81c68335f4bf" path="/var/lib/kubelet/pods/bfdfb336-dce1-418b-9c2a-81c68335f4bf/volumes" Feb 20 10:57:53 crc kubenswrapper[4962]: I0220 10:57:53.138917 4962 scope.go:117] "RemoveContainer" containerID="352c11f5a21be22d0a2a8db4a2977485f1b5ed88f19194a79937d4da6776e7b5" Feb 20 10:57:53 crc kubenswrapper[4962]: E0220 10:57:53.139807 4962 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:58:07 crc kubenswrapper[4962]: I0220 10:58:07.139466 4962 scope.go:117] "RemoveContainer" containerID="352c11f5a21be22d0a2a8db4a2977485f1b5ed88f19194a79937d4da6776e7b5" Feb 20 10:58:07 crc kubenswrapper[4962]: E0220 10:58:07.140566 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:58:20 crc kubenswrapper[4962]: I0220 10:58:20.167564 4962 scope.go:117] "RemoveContainer" containerID="352c11f5a21be22d0a2a8db4a2977485f1b5ed88f19194a79937d4da6776e7b5" Feb 20 10:58:20 crc kubenswrapper[4962]: E0220 10:58:20.168912 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:58:32 crc kubenswrapper[4962]: I0220 10:58:32.138844 4962 scope.go:117] "RemoveContainer" containerID="352c11f5a21be22d0a2a8db4a2977485f1b5ed88f19194a79937d4da6776e7b5" Feb 20 10:58:32 crc kubenswrapper[4962]: E0220 10:58:32.139818 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:58:46 crc kubenswrapper[4962]: I0220 10:58:46.139445 4962 scope.go:117] "RemoveContainer" containerID="352c11f5a21be22d0a2a8db4a2977485f1b5ed88f19194a79937d4da6776e7b5" Feb 20 10:58:46 crc kubenswrapper[4962]: E0220 10:58:46.140403 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:59:01 crc kubenswrapper[4962]: I0220 10:59:01.139568 4962 scope.go:117] "RemoveContainer" containerID="352c11f5a21be22d0a2a8db4a2977485f1b5ed88f19194a79937d4da6776e7b5" Feb 20 10:59:01 crc kubenswrapper[4962]: E0220 10:59:01.140441 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:59:14 crc kubenswrapper[4962]: I0220 10:59:14.139347 4962 scope.go:117] "RemoveContainer" containerID="352c11f5a21be22d0a2a8db4a2977485f1b5ed88f19194a79937d4da6776e7b5" Feb 20 10:59:14 crc kubenswrapper[4962]: E0220 10:59:14.140998 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:59:28 crc kubenswrapper[4962]: I0220 10:59:28.138927 4962 scope.go:117] "RemoveContainer" containerID="352c11f5a21be22d0a2a8db4a2977485f1b5ed88f19194a79937d4da6776e7b5" Feb 20 10:59:28 crc kubenswrapper[4962]: E0220 10:59:28.139986 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 10:59:41 crc kubenswrapper[4962]: I0220 10:59:41.952568 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-cq2nm"] Feb 20 10:59:41 crc kubenswrapper[4962]: E0220 10:59:41.953900 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfdfb336-dce1-418b-9c2a-81c68335f4bf" containerName="extract-content" Feb 20 10:59:41 crc kubenswrapper[4962]: I0220 10:59:41.953927 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfdfb336-dce1-418b-9c2a-81c68335f4bf" containerName="extract-content" Feb 20 10:59:41 crc kubenswrapper[4962]: E0220 10:59:41.953952 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfdfb336-dce1-418b-9c2a-81c68335f4bf" containerName="registry-server" Feb 20 10:59:41 crc kubenswrapper[4962]: I0220 10:59:41.953968 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfdfb336-dce1-418b-9c2a-81c68335f4bf" containerName="registry-server" Feb 20 10:59:41 crc kubenswrapper[4962]: E0220 10:59:41.954020 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfdfb336-dce1-418b-9c2a-81c68335f4bf" containerName="extract-utilities" Feb 20 10:59:41 crc kubenswrapper[4962]: I0220 10:59:41.954039 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfdfb336-dce1-418b-9c2a-81c68335f4bf" containerName="extract-utilities" Feb 20 10:59:41 crc kubenswrapper[4962]: I0220 10:59:41.954341 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfdfb336-dce1-418b-9c2a-81c68335f4bf" containerName="registry-server" Feb 20 10:59:41 crc kubenswrapper[4962]: I0220 10:59:41.955980 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cq2nm" Feb 20 10:59:41 crc kubenswrapper[4962]: I0220 10:59:41.968987 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cq2nm"] Feb 20 10:59:42 crc kubenswrapper[4962]: I0220 10:59:42.085849 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84d46def-f006-4801-a633-a88796c6dc6b-utilities\") pod \"redhat-operators-cq2nm\" (UID: \"84d46def-f006-4801-a633-a88796c6dc6b\") " pod="openshift-marketplace/redhat-operators-cq2nm" Feb 20 10:59:42 crc kubenswrapper[4962]: I0220 10:59:42.085930 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkzch\" (UniqueName: \"kubernetes.io/projected/84d46def-f006-4801-a633-a88796c6dc6b-kube-api-access-pkzch\") pod \"redhat-operators-cq2nm\" (UID: \"84d46def-f006-4801-a633-a88796c6dc6b\") " pod="openshift-marketplace/redhat-operators-cq2nm" Feb 20 10:59:42 crc kubenswrapper[4962]: I0220 10:59:42.086124 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84d46def-f006-4801-a633-a88796c6dc6b-catalog-content\") pod \"redhat-operators-cq2nm\" (UID: \"84d46def-f006-4801-a633-a88796c6dc6b\") " pod="openshift-marketplace/redhat-operators-cq2nm" Feb 20 10:59:42 crc kubenswrapper[4962]: I0220 10:59:42.187178 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84d46def-f006-4801-a633-a88796c6dc6b-utilities\") pod \"redhat-operators-cq2nm\" (UID: \"84d46def-f006-4801-a633-a88796c6dc6b\") " pod="openshift-marketplace/redhat-operators-cq2nm" Feb 20 10:59:42 crc kubenswrapper[4962]: I0220 10:59:42.187235 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkzch\" (UniqueName: \"kubernetes.io/projected/84d46def-f006-4801-a633-a88796c6dc6b-kube-api-access-pkzch\") pod \"redhat-operators-cq2nm\" (UID: \"84d46def-f006-4801-a633-a88796c6dc6b\") " pod="openshift-marketplace/redhat-operators-cq2nm" Feb 20 10:59:42 crc kubenswrapper[4962]: I0220 10:59:42.187272 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84d46def-f006-4801-a633-a88796c6dc6b-catalog-content\") pod \"redhat-operators-cq2nm\" (UID: \"84d46def-f006-4801-a633-a88796c6dc6b\") " pod="openshift-marketplace/redhat-operators-cq2nm" Feb 20 10:59:42 crc kubenswrapper[4962]: I0220 10:59:42.187859 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84d46def-f006-4801-a633-a88796c6dc6b-catalog-content\") pod \"redhat-operators-cq2nm\" (UID: \"84d46def-f006-4801-a633-a88796c6dc6b\") " pod="openshift-marketplace/redhat-operators-cq2nm" Feb 20 10:59:42 crc kubenswrapper[4962]: I0220 10:59:42.187855 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84d46def-f006-4801-a633-a88796c6dc6b-utilities\") pod \"redhat-operators-cq2nm\" (UID: \"84d46def-f006-4801-a633-a88796c6dc6b\") " pod="openshift-marketplace/redhat-operators-cq2nm" Feb 20 10:59:42 crc kubenswrapper[4962]: I0220 10:59:42.208130 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-pkzch\" (UniqueName: \"kubernetes.io/projected/84d46def-f006-4801-a633-a88796c6dc6b-kube-api-access-pkzch\") pod \"redhat-operators-cq2nm\" (UID: \"84d46def-f006-4801-a633-a88796c6dc6b\") " pod="openshift-marketplace/redhat-operators-cq2nm" Feb 20 10:59:42 crc kubenswrapper[4962]: I0220 10:59:42.285374 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cq2nm" Feb 20 10:59:42 crc kubenswrapper[4962]: I0220 10:59:42.719672 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cq2nm"] Feb 20 10:59:43 crc kubenswrapper[4962]: I0220 10:59:43.138955 4962 scope.go:117] "RemoveContainer" containerID="352c11f5a21be22d0a2a8db4a2977485f1b5ed88f19194a79937d4da6776e7b5" Feb 20 10:59:43 crc kubenswrapper[4962]: I0220 10:59:43.318392 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" event={"ID":"751d5e0b-919c-4777-8475-ed7214f7647f","Type":"ContainerStarted","Data":"b13da16a083c998b7c489b8e1d193fb0bd68b351ea17cf6f4129bf436fc9bf7b"} Feb 20 10:59:43 crc kubenswrapper[4962]: I0220 10:59:43.319799 4962 generic.go:334] "Generic (PLEG): container finished" podID="84d46def-f006-4801-a633-a88796c6dc6b" containerID="bb24dad9128abc8a732cdcebb8543b69537d7687731766c3edb5952a3716d966" exitCode=0 Feb 20 10:59:43 crc kubenswrapper[4962]: I0220 10:59:43.319845 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cq2nm" event={"ID":"84d46def-f006-4801-a633-a88796c6dc6b","Type":"ContainerDied","Data":"bb24dad9128abc8a732cdcebb8543b69537d7687731766c3edb5952a3716d966"} Feb 20 10:59:43 crc kubenswrapper[4962]: I0220 10:59:43.319875 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cq2nm" event={"ID":"84d46def-f006-4801-a633-a88796c6dc6b","Type":"ContainerStarted","Data":"d786add5122e41426cb69b4c43033a6b7bc34c4cb08c9a68988cdfc0850b6c3b"} Feb 20 10:59:45 crc kubenswrapper[4962]: I0220 10:59:45.341518 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cq2nm" event={"ID":"84d46def-f006-4801-a633-a88796c6dc6b","Type":"ContainerStarted","Data":"189ba151fb0668f00b7b94bc392c686ef62f6c29dd9e464a53119429ac037caa"} Feb 20 10:59:46 crc kubenswrapper[4962]: I0220 10:59:46.353549 4962 generic.go:334] "Generic (PLEG): container finished" podID="84d46def-f006-4801-a633-a88796c6dc6b" containerID="189ba151fb0668f00b7b94bc392c686ef62f6c29dd9e464a53119429ac037caa" exitCode=0 Feb 20 10:59:46 crc kubenswrapper[4962]: I0220 10:59:46.353681 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cq2nm" event={"ID":"84d46def-f006-4801-a633-a88796c6dc6b","Type":"ContainerDied","Data":"189ba151fb0668f00b7b94bc392c686ef62f6c29dd9e464a53119429ac037caa"} Feb 20 10:59:47 crc kubenswrapper[4962]: I0220 10:59:47.370177 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cq2nm" event={"ID":"84d46def-f006-4801-a633-a88796c6dc6b","Type":"ContainerStarted","Data":"186cd446df2a0012baab13ab6615e8ba7905da41da7f996619013fd2760f1e12"} Feb 20 10:59:47 crc kubenswrapper[4962]: I0220 10:59:47.408177 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-cq2nm" podStartSLOduration=2.976510471 podStartE2EDuration="6.408150966s" podCreationTimestamp="2026-02-20 10:59:41 +0000 UTC" 
firstStartedPulling="2026-02-20 10:59:43.321072129 +0000 UTC m=+3874.903543985" lastFinishedPulling="2026-02-20 10:59:46.752712594 +0000 UTC m=+3878.335184480" observedRunningTime="2026-02-20 10:59:47.391859966 +0000 UTC m=+3878.974331842" watchObservedRunningTime="2026-02-20 10:59:47.408150966 +0000 UTC m=+3878.990622852" Feb 20 10:59:52 crc kubenswrapper[4962]: I0220 10:59:52.285655 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-cq2nm" Feb 20 10:59:52 crc kubenswrapper[4962]: I0220 10:59:52.286364 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-cq2nm" Feb 20 10:59:53 crc kubenswrapper[4962]: I0220 10:59:53.344187 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-cq2nm" podUID="84d46def-f006-4801-a633-a88796c6dc6b" containerName="registry-server" probeResult="failure" output=< Feb 20 10:59:53 crc kubenswrapper[4962]: timeout: failed to connect service ":50051" within 1s Feb 20 10:59:53 crc kubenswrapper[4962]: > Feb 20 10:59:53 crc kubenswrapper[4962]: I0220 10:59:53.395307 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mr5wb"] Feb 20 10:59:53 crc kubenswrapper[4962]: I0220 10:59:53.397799 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mr5wb" Feb 20 10:59:53 crc kubenswrapper[4962]: I0220 10:59:53.419171 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mr5wb"] Feb 20 10:59:53 crc kubenswrapper[4962]: I0220 10:59:53.563809 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvq4q\" (UniqueName: \"kubernetes.io/projected/4e53575b-37ab-4c46-be3f-6ac873a2a9d0-kube-api-access-nvq4q\") pod \"community-operators-mr5wb\" (UID: \"4e53575b-37ab-4c46-be3f-6ac873a2a9d0\") " pod="openshift-marketplace/community-operators-mr5wb" Feb 20 10:59:53 crc kubenswrapper[4962]: I0220 10:59:53.563887 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e53575b-37ab-4c46-be3f-6ac873a2a9d0-utilities\") pod \"community-operators-mr5wb\" (UID: \"4e53575b-37ab-4c46-be3f-6ac873a2a9d0\") " pod="openshift-marketplace/community-operators-mr5wb" Feb 20 10:59:53 crc kubenswrapper[4962]: I0220 10:59:53.563910 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e53575b-37ab-4c46-be3f-6ac873a2a9d0-catalog-content\") pod \"community-operators-mr5wb\" (UID: \"4e53575b-37ab-4c46-be3f-6ac873a2a9d0\") " pod="openshift-marketplace/community-operators-mr5wb" Feb 20 10:59:53 crc kubenswrapper[4962]: I0220 10:59:53.665178 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvq4q\" (UniqueName: \"kubernetes.io/projected/4e53575b-37ab-4c46-be3f-6ac873a2a9d0-kube-api-access-nvq4q\") pod \"community-operators-mr5wb\" (UID: \"4e53575b-37ab-4c46-be3f-6ac873a2a9d0\") " pod="openshift-marketplace/community-operators-mr5wb" Feb 20 10:59:53 crc kubenswrapper[4962]: I0220 10:59:53.665281 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e53575b-37ab-4c46-be3f-6ac873a2a9d0-utilities\") 
pod \"community-operators-mr5wb\" (UID: \"4e53575b-37ab-4c46-be3f-6ac873a2a9d0\") " pod="openshift-marketplace/community-operators-mr5wb" Feb 20 10:59:53 crc kubenswrapper[4962]: I0220 10:59:53.665315 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e53575b-37ab-4c46-be3f-6ac873a2a9d0-catalog-content\") pod \"community-operators-mr5wb\" (UID: \"4e53575b-37ab-4c46-be3f-6ac873a2a9d0\") " pod="openshift-marketplace/community-operators-mr5wb" Feb 20 10:59:53 crc kubenswrapper[4962]: I0220 10:59:53.665732 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e53575b-37ab-4c46-be3f-6ac873a2a9d0-utilities\") pod \"community-operators-mr5wb\" (UID: \"4e53575b-37ab-4c46-be3f-6ac873a2a9d0\") " pod="openshift-marketplace/community-operators-mr5wb" Feb 20 10:59:53 crc kubenswrapper[4962]: I0220 10:59:53.665860 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e53575b-37ab-4c46-be3f-6ac873a2a9d0-catalog-content\") pod \"community-operators-mr5wb\" (UID: \"4e53575b-37ab-4c46-be3f-6ac873a2a9d0\") " pod="openshift-marketplace/community-operators-mr5wb" Feb 20 10:59:53 crc kubenswrapper[4962]: I0220 10:59:53.692201 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvq4q\" (UniqueName: \"kubernetes.io/projected/4e53575b-37ab-4c46-be3f-6ac873a2a9d0-kube-api-access-nvq4q\") pod \"community-operators-mr5wb\" (UID: \"4e53575b-37ab-4c46-be3f-6ac873a2a9d0\") " pod="openshift-marketplace/community-operators-mr5wb" Feb 20 10:59:53 crc kubenswrapper[4962]: I0220 10:59:53.720524 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mr5wb" Feb 20 10:59:54 crc kubenswrapper[4962]: I0220 10:59:54.413720 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mr5wb"] Feb 20 10:59:54 crc kubenswrapper[4962]: W0220 10:59:54.419566 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e53575b_37ab_4c46_be3f_6ac873a2a9d0.slice/crio-bba85929d1fbcecf2cd28f3b0ee87d31d833e938d84158e0076aab5955fa74ff WatchSource:0}: Error finding container bba85929d1fbcecf2cd28f3b0ee87d31d833e938d84158e0076aab5955fa74ff: Status 404 returned error can't find the container with id bba85929d1fbcecf2cd28f3b0ee87d31d833e938d84158e0076aab5955fa74ff Feb 20 10:59:54 crc kubenswrapper[4962]: I0220 10:59:54.442295 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mr5wb" event={"ID":"4e53575b-37ab-4c46-be3f-6ac873a2a9d0","Type":"ContainerStarted","Data":"bba85929d1fbcecf2cd28f3b0ee87d31d833e938d84158e0076aab5955fa74ff"} Feb 20 10:59:55 crc kubenswrapper[4962]: I0220 10:59:55.454587 4962 generic.go:334] "Generic (PLEG): container finished" podID="4e53575b-37ab-4c46-be3f-6ac873a2a9d0" containerID="9c0af593b4b43f8781c7a5aba922acad47038b25f5521623f3ea1762a03d3532" exitCode=0 Feb 20 10:59:55 crc kubenswrapper[4962]: I0220 10:59:55.454762 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mr5wb" event={"ID":"4e53575b-37ab-4c46-be3f-6ac873a2a9d0","Type":"ContainerDied","Data":"9c0af593b4b43f8781c7a5aba922acad47038b25f5521623f3ea1762a03d3532"} Feb 20 10:59:56 crc kubenswrapper[4962]: I0220 10:59:56.467905 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mr5wb" event={"ID":"4e53575b-37ab-4c46-be3f-6ac873a2a9d0","Type":"ContainerStarted","Data":"02a91fcfd5ea9a6b32532b3d0e1f6e6d142e678ce53b20f778f4d99dcb76e30e"} Feb 20 10:59:57 crc kubenswrapper[4962]: I0220 10:59:57.480335 4962 generic.go:334] "Generic (PLEG): container finished" podID="4e53575b-37ab-4c46-be3f-6ac873a2a9d0" containerID="02a91fcfd5ea9a6b32532b3d0e1f6e6d142e678ce53b20f778f4d99dcb76e30e" exitCode=0 Feb 20 10:59:57 crc kubenswrapper[4962]: I0220 10:59:57.480486 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mr5wb" event={"ID":"4e53575b-37ab-4c46-be3f-6ac873a2a9d0","Type":"ContainerDied","Data":"02a91fcfd5ea9a6b32532b3d0e1f6e6d142e678ce53b20f778f4d99dcb76e30e"} Feb 20 10:59:58 crc kubenswrapper[4962]: I0220 10:59:58.492137 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mr5wb" event={"ID":"4e53575b-37ab-4c46-be3f-6ac873a2a9d0","Type":"ContainerStarted","Data":"1f35394b405a7932de6ea65a40b90b4cfc16077b45269a93e08d860ad8ab92d6"} Feb 20 10:59:58 crc kubenswrapper[4962]: I0220 10:59:58.523723 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mr5wb" podStartSLOduration=2.972382537 podStartE2EDuration="5.52370061s" podCreationTimestamp="2026-02-20 10:59:53 +0000 UTC" firstStartedPulling="2026-02-20 10:59:55.459517748 +0000 UTC m=+3887.041989634" lastFinishedPulling="2026-02-20 10:59:58.010835831 +0000 UTC m=+3889.593307707" observedRunningTime="2026-02-20 10:59:58.518058398 +0000 UTC m=+3890.100530294" watchObservedRunningTime="2026-02-20 10:59:58.52370061 +0000 UTC m=+3890.106172486" Feb 
20 11:00:00 crc kubenswrapper[4962]: I0220 11:00:00.212427 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526420-vmnfd"] Feb 20 11:00:00 crc kubenswrapper[4962]: I0220 11:00:00.214779 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526420-vmnfd" Feb 20 11:00:00 crc kubenswrapper[4962]: I0220 11:00:00.216932 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 20 11:00:00 crc kubenswrapper[4962]: I0220 11:00:00.217588 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 20 11:00:00 crc kubenswrapper[4962]: I0220 11:00:00.224134 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526420-vmnfd"] Feb 20 11:00:00 crc kubenswrapper[4962]: I0220 11:00:00.370829 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/444fe6ec-91a1-4572-9cc7-59f9848bd957-secret-volume\") pod \"collect-profiles-29526420-vmnfd\" (UID: \"444fe6ec-91a1-4572-9cc7-59f9848bd957\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526420-vmnfd" Feb 20 11:00:00 crc kubenswrapper[4962]: I0220 11:00:00.371217 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xb2cc\" (UniqueName: \"kubernetes.io/projected/444fe6ec-91a1-4572-9cc7-59f9848bd957-kube-api-access-xb2cc\") pod \"collect-profiles-29526420-vmnfd\" (UID: \"444fe6ec-91a1-4572-9cc7-59f9848bd957\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526420-vmnfd" Feb 20 11:00:00 crc kubenswrapper[4962]: I0220 11:00:00.371454 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/444fe6ec-91a1-4572-9cc7-59f9848bd957-config-volume\") pod \"collect-profiles-29526420-vmnfd\" (UID: \"444fe6ec-91a1-4572-9cc7-59f9848bd957\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526420-vmnfd" Feb 20 11:00:00 crc kubenswrapper[4962]: I0220 11:00:00.473044 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/444fe6ec-91a1-4572-9cc7-59f9848bd957-config-volume\") pod \"collect-profiles-29526420-vmnfd\" (UID: \"444fe6ec-91a1-4572-9cc7-59f9848bd957\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526420-vmnfd" Feb 20 11:00:00 crc kubenswrapper[4962]: I0220 11:00:00.473144 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/444fe6ec-91a1-4572-9cc7-59f9848bd957-secret-volume\") pod \"collect-profiles-29526420-vmnfd\" (UID: \"444fe6ec-91a1-4572-9cc7-59f9848bd957\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526420-vmnfd" Feb 20 11:00:00 crc kubenswrapper[4962]: I0220 11:00:00.473182 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xb2cc\" (UniqueName: \"kubernetes.io/projected/444fe6ec-91a1-4572-9cc7-59f9848bd957-kube-api-access-xb2cc\") pod \"collect-profiles-29526420-vmnfd\" (UID: \"444fe6ec-91a1-4572-9cc7-59f9848bd957\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29526420-vmnfd" Feb 20 11:00:00 crc kubenswrapper[4962]: I0220 11:00:00.474931 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/444fe6ec-91a1-4572-9cc7-59f9848bd957-config-volume\") pod \"collect-profiles-29526420-vmnfd\" (UID: \"444fe6ec-91a1-4572-9cc7-59f9848bd957\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526420-vmnfd" Feb 20 11:00:00 crc kubenswrapper[4962]: I0220 11:00:00.482782 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/444fe6ec-91a1-4572-9cc7-59f9848bd957-secret-volume\") pod \"collect-profiles-29526420-vmnfd\" (UID: \"444fe6ec-91a1-4572-9cc7-59f9848bd957\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526420-vmnfd" Feb 20 11:00:00 crc kubenswrapper[4962]: I0220 11:00:00.505509 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xb2cc\" (UniqueName: \"kubernetes.io/projected/444fe6ec-91a1-4572-9cc7-59f9848bd957-kube-api-access-xb2cc\") pod \"collect-profiles-29526420-vmnfd\" (UID: \"444fe6ec-91a1-4572-9cc7-59f9848bd957\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526420-vmnfd" Feb 20 11:00:00 crc kubenswrapper[4962]: I0220 11:00:00.540703 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526420-vmnfd" Feb 20 11:00:00 crc kubenswrapper[4962]: I0220 11:00:00.799726 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526420-vmnfd"] Feb 20 11:00:01 crc kubenswrapper[4962]: I0220 11:00:01.522524 4962 generic.go:334] "Generic (PLEG): container finished" podID="444fe6ec-91a1-4572-9cc7-59f9848bd957" containerID="0417c10167fec51451355a4b52c16cd2e8025894a6838915d3bce249a3562e11" exitCode=0 Feb 20 11:00:01 crc kubenswrapper[4962]: I0220 11:00:01.522649 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526420-vmnfd" event={"ID":"444fe6ec-91a1-4572-9cc7-59f9848bd957","Type":"ContainerDied","Data":"0417c10167fec51451355a4b52c16cd2e8025894a6838915d3bce249a3562e11"} Feb 20 11:00:01 crc kubenswrapper[4962]: I0220 11:00:01.523021 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526420-vmnfd" event={"ID":"444fe6ec-91a1-4572-9cc7-59f9848bd957","Type":"ContainerStarted","Data":"9d2297082c21d3fe2a03b7b950f9cb27dd771fe868117807120a4e92bf8369ec"} Feb 20 11:00:02 crc kubenswrapper[4962]: I0220 11:00:02.363631 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-cq2nm" Feb 20 11:00:02 crc kubenswrapper[4962]: I0220 11:00:02.441909 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-cq2nm" Feb 20 11:00:02 crc kubenswrapper[4962]: I0220 11:00:02.621748 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cq2nm"] Feb 20 11:00:02 crc kubenswrapper[4962]: I0220 11:00:02.873133 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526420-vmnfd" Feb 20 11:00:03 crc kubenswrapper[4962]: I0220 11:00:03.015073 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/444fe6ec-91a1-4572-9cc7-59f9848bd957-secret-volume\") pod \"444fe6ec-91a1-4572-9cc7-59f9848bd957\" (UID: \"444fe6ec-91a1-4572-9cc7-59f9848bd957\") " Feb 20 11:00:03 crc kubenswrapper[4962]: I0220 11:00:03.015184 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/444fe6ec-91a1-4572-9cc7-59f9848bd957-config-volume\") pod \"444fe6ec-91a1-4572-9cc7-59f9848bd957\" (UID: \"444fe6ec-91a1-4572-9cc7-59f9848bd957\") " Feb 20 11:00:03 crc kubenswrapper[4962]: I0220 11:00:03.015522 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xb2cc\" (UniqueName: \"kubernetes.io/projected/444fe6ec-91a1-4572-9cc7-59f9848bd957-kube-api-access-xb2cc\") pod \"444fe6ec-91a1-4572-9cc7-59f9848bd957\" (UID: \"444fe6ec-91a1-4572-9cc7-59f9848bd957\") " Feb 20 11:00:03 crc kubenswrapper[4962]: I0220 11:00:03.016252 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/444fe6ec-91a1-4572-9cc7-59f9848bd957-config-volume" (OuterVolumeSpecName: "config-volume") pod "444fe6ec-91a1-4572-9cc7-59f9848bd957" (UID: "444fe6ec-91a1-4572-9cc7-59f9848bd957"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 11:00:03 crc kubenswrapper[4962]: I0220 11:00:03.022622 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/444fe6ec-91a1-4572-9cc7-59f9848bd957-kube-api-access-xb2cc" (OuterVolumeSpecName: "kube-api-access-xb2cc") pod "444fe6ec-91a1-4572-9cc7-59f9848bd957" (UID: "444fe6ec-91a1-4572-9cc7-59f9848bd957"). InnerVolumeSpecName "kube-api-access-xb2cc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 11:00:03 crc kubenswrapper[4962]: I0220 11:00:03.022852 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/444fe6ec-91a1-4572-9cc7-59f9848bd957-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "444fe6ec-91a1-4572-9cc7-59f9848bd957" (UID: "444fe6ec-91a1-4572-9cc7-59f9848bd957"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 11:00:03 crc kubenswrapper[4962]: I0220 11:00:03.117739 4962 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/444fe6ec-91a1-4572-9cc7-59f9848bd957-config-volume\") on node \"crc\" DevicePath \"\"" Feb 20 11:00:03 crc kubenswrapper[4962]: I0220 11:00:03.117781 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xb2cc\" (UniqueName: \"kubernetes.io/projected/444fe6ec-91a1-4572-9cc7-59f9848bd957-kube-api-access-xb2cc\") on node \"crc\" DevicePath \"\"" Feb 20 11:00:03 crc kubenswrapper[4962]: I0220 11:00:03.117797 4962 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/444fe6ec-91a1-4572-9cc7-59f9848bd957-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 20 11:00:03 crc kubenswrapper[4962]: I0220 11:00:03.545747 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526420-vmnfd" event={"ID":"444fe6ec-91a1-4572-9cc7-59f9848bd957","Type":"ContainerDied","Data":"9d2297082c21d3fe2a03b7b950f9cb27dd771fe868117807120a4e92bf8369ec"} Feb 20 11:00:03 crc kubenswrapper[4962]: I0220 11:00:03.545822 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9d2297082c21d3fe2a03b7b950f9cb27dd771fe868117807120a4e92bf8369ec" Feb 20 11:00:03 crc kubenswrapper[4962]: I0220 11:00:03.545858 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-cq2nm" podUID="84d46def-f006-4801-a633-a88796c6dc6b" containerName="registry-server" containerID="cri-o://186cd446df2a0012baab13ab6615e8ba7905da41da7f996619013fd2760f1e12" gracePeriod=2 Feb 20 11:00:03 crc kubenswrapper[4962]: I0220 11:00:03.545957 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526420-vmnfd" Feb 20 11:00:03 crc kubenswrapper[4962]: I0220 11:00:03.721443 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mr5wb" Feb 20 11:00:03 crc kubenswrapper[4962]: I0220 11:00:03.721523 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mr5wb" Feb 20 11:00:03 crc kubenswrapper[4962]: I0220 11:00:03.788518 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mr5wb" Feb 20 11:00:03 crc kubenswrapper[4962]: I0220 11:00:03.965344 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526375-bzqgc"] Feb 20 11:00:03 crc kubenswrapper[4962]: I0220 11:00:03.972033 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526375-bzqgc"] Feb 20 11:00:04 crc kubenswrapper[4962]: I0220 11:00:04.019843 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cq2nm" Feb 20 11:00:04 crc kubenswrapper[4962]: I0220 11:00:04.143851 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84d46def-f006-4801-a633-a88796c6dc6b-utilities\") pod \"84d46def-f006-4801-a633-a88796c6dc6b\" (UID: \"84d46def-f006-4801-a633-a88796c6dc6b\") " Feb 20 11:00:04 crc kubenswrapper[4962]: I0220 11:00:04.145117 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84d46def-f006-4801-a633-a88796c6dc6b-utilities" (OuterVolumeSpecName: "utilities") pod "84d46def-f006-4801-a633-a88796c6dc6b" (UID: "84d46def-f006-4801-a633-a88796c6dc6b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 11:00:04 crc kubenswrapper[4962]: I0220 11:00:04.145350 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pkzch\" (UniqueName: \"kubernetes.io/projected/84d46def-f006-4801-a633-a88796c6dc6b-kube-api-access-pkzch\") pod \"84d46def-f006-4801-a633-a88796c6dc6b\" (UID: \"84d46def-f006-4801-a633-a88796c6dc6b\") " Feb 20 11:00:04 crc kubenswrapper[4962]: I0220 11:00:04.145420 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84d46def-f006-4801-a633-a88796c6dc6b-catalog-content\") pod \"84d46def-f006-4801-a633-a88796c6dc6b\" (UID: \"84d46def-f006-4801-a633-a88796c6dc6b\") " Feb 20 11:00:04 crc kubenswrapper[4962]: I0220 11:00:04.150284 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84d46def-f006-4801-a633-a88796c6dc6b-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 11:00:04 crc kubenswrapper[4962]: I0220 11:00:04.150852 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84d46def-f006-4801-a633-a88796c6dc6b-kube-api-access-pkzch" (OuterVolumeSpecName: "kube-api-access-pkzch") pod "84d46def-f006-4801-a633-a88796c6dc6b" (UID: "84d46def-f006-4801-a633-a88796c6dc6b"). InnerVolumeSpecName "kube-api-access-pkzch". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 11:00:04 crc kubenswrapper[4962]: I0220 11:00:04.251751 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pkzch\" (UniqueName: \"kubernetes.io/projected/84d46def-f006-4801-a633-a88796c6dc6b-kube-api-access-pkzch\") on node \"crc\" DevicePath \"\"" Feb 20 11:00:04 crc kubenswrapper[4962]: I0220 11:00:04.364152 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84d46def-f006-4801-a633-a88796c6dc6b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "84d46def-f006-4801-a633-a88796c6dc6b" (UID: "84d46def-f006-4801-a633-a88796c6dc6b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 11:00:04 crc kubenswrapper[4962]: I0220 11:00:04.454818 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84d46def-f006-4801-a633-a88796c6dc6b-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 11:00:04 crc kubenswrapper[4962]: I0220 11:00:04.562783 4962 generic.go:334] "Generic (PLEG): container finished" podID="84d46def-f006-4801-a633-a88796c6dc6b" containerID="186cd446df2a0012baab13ab6615e8ba7905da41da7f996619013fd2760f1e12" exitCode=0 Feb 20 11:00:04 crc kubenswrapper[4962]: I0220 11:00:04.562863 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cq2nm" event={"ID":"84d46def-f006-4801-a633-a88796c6dc6b","Type":"ContainerDied","Data":"186cd446df2a0012baab13ab6615e8ba7905da41da7f996619013fd2760f1e12"} Feb 20 11:00:04 crc kubenswrapper[4962]: I0220 11:00:04.562946 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cq2nm" event={"ID":"84d46def-f006-4801-a633-a88796c6dc6b","Type":"ContainerDied","Data":"d786add5122e41426cb69b4c43033a6b7bc34c4cb08c9a68988cdfc0850b6c3b"} Feb 20 11:00:04 crc kubenswrapper[4962]: I0220 11:00:04.562979 4962 scope.go:117] "RemoveContainer" containerID="186cd446df2a0012baab13ab6615e8ba7905da41da7f996619013fd2760f1e12" Feb 20 11:00:04 crc kubenswrapper[4962]: I0220 11:00:04.562888 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cq2nm" Feb 20 11:00:04 crc kubenswrapper[4962]: I0220 11:00:04.592882 4962 scope.go:117] "RemoveContainer" containerID="189ba151fb0668f00b7b94bc392c686ef62f6c29dd9e464a53119429ac037caa" Feb 20 11:00:04 crc kubenswrapper[4962]: I0220 11:00:04.623477 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cq2nm"] Feb 20 11:00:04 crc kubenswrapper[4962]: I0220 11:00:04.633186 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-cq2nm"] Feb 20 11:00:04 crc kubenswrapper[4962]: I0220 11:00:04.648355 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mr5wb" Feb 20 11:00:04 crc kubenswrapper[4962]: I0220 11:00:04.664250 4962 scope.go:117] "RemoveContainer" containerID="bb24dad9128abc8a732cdcebb8543b69537d7687731766c3edb5952a3716d966" Feb 20 11:00:04 crc kubenswrapper[4962]: I0220 11:00:04.697655 4962 scope.go:117] "RemoveContainer" containerID="186cd446df2a0012baab13ab6615e8ba7905da41da7f996619013fd2760f1e12" Feb 20 11:00:04 crc kubenswrapper[4962]: E0220 11:00:04.698312 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"186cd446df2a0012baab13ab6615e8ba7905da41da7f996619013fd2760f1e12\": container with ID starting with 186cd446df2a0012baab13ab6615e8ba7905da41da7f996619013fd2760f1e12 not found: ID does not exist" containerID="186cd446df2a0012baab13ab6615e8ba7905da41da7f996619013fd2760f1e12" Feb 20 11:00:04 crc kubenswrapper[4962]: I0220 11:00:04.698389 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"186cd446df2a0012baab13ab6615e8ba7905da41da7f996619013fd2760f1e12"} err="failed to get container status \"186cd446df2a0012baab13ab6615e8ba7905da41da7f996619013fd2760f1e12\": rpc error: code = NotFound desc = could not find container 
\"186cd446df2a0012baab13ab6615e8ba7905da41da7f996619013fd2760f1e12\": container with ID starting with 186cd446df2a0012baab13ab6615e8ba7905da41da7f996619013fd2760f1e12 not found: ID does not exist" Feb 20 11:00:04 crc kubenswrapper[4962]: I0220 11:00:04.698433 4962 scope.go:117] "RemoveContainer" containerID="189ba151fb0668f00b7b94bc392c686ef62f6c29dd9e464a53119429ac037caa" Feb 20 11:00:04 crc kubenswrapper[4962]: E0220 11:00:04.698890 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"189ba151fb0668f00b7b94bc392c686ef62f6c29dd9e464a53119429ac037caa\": container with ID starting with 189ba151fb0668f00b7b94bc392c686ef62f6c29dd9e464a53119429ac037caa not found: ID does not exist" containerID="189ba151fb0668f00b7b94bc392c686ef62f6c29dd9e464a53119429ac037caa" Feb 20 11:00:04 crc kubenswrapper[4962]: I0220 11:00:04.698933 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"189ba151fb0668f00b7b94bc392c686ef62f6c29dd9e464a53119429ac037caa"} err="failed to get container status \"189ba151fb0668f00b7b94bc392c686ef62f6c29dd9e464a53119429ac037caa\": rpc error: code = NotFound desc = could not find container \"189ba151fb0668f00b7b94bc392c686ef62f6c29dd9e464a53119429ac037caa\": container with ID starting with 189ba151fb0668f00b7b94bc392c686ef62f6c29dd9e464a53119429ac037caa not found: ID does not exist" Feb 20 11:00:04 crc kubenswrapper[4962]: I0220 11:00:04.698988 4962 scope.go:117] "RemoveContainer" containerID="bb24dad9128abc8a732cdcebb8543b69537d7687731766c3edb5952a3716d966" Feb 20 11:00:04 crc kubenswrapper[4962]: E0220 11:00:04.699442 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb24dad9128abc8a732cdcebb8543b69537d7687731766c3edb5952a3716d966\": container with ID starting with bb24dad9128abc8a732cdcebb8543b69537d7687731766c3edb5952a3716d966 not found: ID does not exist" containerID="bb24dad9128abc8a732cdcebb8543b69537d7687731766c3edb5952a3716d966" Feb 20 11:00:04 crc kubenswrapper[4962]: I0220 11:00:04.699492 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb24dad9128abc8a732cdcebb8543b69537d7687731766c3edb5952a3716d966"} err="failed to get container status \"bb24dad9128abc8a732cdcebb8543b69537d7687731766c3edb5952a3716d966\": rpc error: code = NotFound desc = could not find container \"bb24dad9128abc8a732cdcebb8543b69537d7687731766c3edb5952a3716d966\": container with ID starting with bb24dad9128abc8a732cdcebb8543b69537d7687731766c3edb5952a3716d966 not found: ID does not exist" Feb 20 11:00:05 crc kubenswrapper[4962]: I0220 11:00:05.156857 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84d46def-f006-4801-a633-a88796c6dc6b" path="/var/lib/kubelet/pods/84d46def-f006-4801-a633-a88796c6dc6b/volumes" Feb 20 11:00:05 crc kubenswrapper[4962]: I0220 11:00:05.159142 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc9c6a80-7747-461e-8f29-f371984a8c95" path="/var/lib/kubelet/pods/fc9c6a80-7747-461e-8f29-f371984a8c95/volumes" Feb 20 11:00:05 crc kubenswrapper[4962]: I0220 11:00:05.822260 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mr5wb"] Feb 20 11:00:06 crc kubenswrapper[4962]: I0220 11:00:06.587340 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mr5wb" 
podUID="4e53575b-37ab-4c46-be3f-6ac873a2a9d0" containerName="registry-server" containerID="cri-o://1f35394b405a7932de6ea65a40b90b4cfc16077b45269a93e08d860ad8ab92d6" gracePeriod=2 Feb 20 11:00:07 crc kubenswrapper[4962]: I0220 11:00:07.603847 4962 generic.go:334] "Generic (PLEG): container finished" podID="4e53575b-37ab-4c46-be3f-6ac873a2a9d0" containerID="1f35394b405a7932de6ea65a40b90b4cfc16077b45269a93e08d860ad8ab92d6" exitCode=0 Feb 20 11:00:07 crc kubenswrapper[4962]: I0220 11:00:07.603949 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mr5wb" event={"ID":"4e53575b-37ab-4c46-be3f-6ac873a2a9d0","Type":"ContainerDied","Data":"1f35394b405a7932de6ea65a40b90b4cfc16077b45269a93e08d860ad8ab92d6"} Feb 20 11:00:07 crc kubenswrapper[4962]: I0220 11:00:07.604256 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mr5wb" event={"ID":"4e53575b-37ab-4c46-be3f-6ac873a2a9d0","Type":"ContainerDied","Data":"bba85929d1fbcecf2cd28f3b0ee87d31d833e938d84158e0076aab5955fa74ff"} Feb 20 11:00:07 crc kubenswrapper[4962]: I0220 11:00:07.604282 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bba85929d1fbcecf2cd28f3b0ee87d31d833e938d84158e0076aab5955fa74ff" Feb 20 11:00:07 crc kubenswrapper[4962]: I0220 11:00:07.617941 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mr5wb" Feb 20 11:00:07 crc kubenswrapper[4962]: I0220 11:00:07.626951 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e53575b-37ab-4c46-be3f-6ac873a2a9d0-utilities\") pod \"4e53575b-37ab-4c46-be3f-6ac873a2a9d0\" (UID: \"4e53575b-37ab-4c46-be3f-6ac873a2a9d0\") " Feb 20 11:00:07 crc kubenswrapper[4962]: I0220 11:00:07.627039 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nvq4q\" (UniqueName: \"kubernetes.io/projected/4e53575b-37ab-4c46-be3f-6ac873a2a9d0-kube-api-access-nvq4q\") pod \"4e53575b-37ab-4c46-be3f-6ac873a2a9d0\" (UID: \"4e53575b-37ab-4c46-be3f-6ac873a2a9d0\") " Feb 20 11:00:07 crc kubenswrapper[4962]: I0220 11:00:07.627075 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e53575b-37ab-4c46-be3f-6ac873a2a9d0-catalog-content\") pod \"4e53575b-37ab-4c46-be3f-6ac873a2a9d0\" (UID: \"4e53575b-37ab-4c46-be3f-6ac873a2a9d0\") " Feb 20 11:00:07 crc kubenswrapper[4962]: I0220 11:00:07.629273 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e53575b-37ab-4c46-be3f-6ac873a2a9d0-utilities" (OuterVolumeSpecName: "utilities") pod "4e53575b-37ab-4c46-be3f-6ac873a2a9d0" (UID: "4e53575b-37ab-4c46-be3f-6ac873a2a9d0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 11:00:07 crc kubenswrapper[4962]: I0220 11:00:07.635737 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e53575b-37ab-4c46-be3f-6ac873a2a9d0-kube-api-access-nvq4q" (OuterVolumeSpecName: "kube-api-access-nvq4q") pod "4e53575b-37ab-4c46-be3f-6ac873a2a9d0" (UID: "4e53575b-37ab-4c46-be3f-6ac873a2a9d0"). InnerVolumeSpecName "kube-api-access-nvq4q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 11:00:07 crc kubenswrapper[4962]: I0220 11:00:07.723951 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e53575b-37ab-4c46-be3f-6ac873a2a9d0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4e53575b-37ab-4c46-be3f-6ac873a2a9d0" (UID: "4e53575b-37ab-4c46-be3f-6ac873a2a9d0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 11:00:07 crc kubenswrapper[4962]: I0220 11:00:07.729347 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e53575b-37ab-4c46-be3f-6ac873a2a9d0-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 11:00:07 crc kubenswrapper[4962]: I0220 11:00:07.729423 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nvq4q\" (UniqueName: \"kubernetes.io/projected/4e53575b-37ab-4c46-be3f-6ac873a2a9d0-kube-api-access-nvq4q\") on node \"crc\" DevicePath \"\"" Feb 20 11:00:07 crc kubenswrapper[4962]: I0220 11:00:07.729465 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e53575b-37ab-4c46-be3f-6ac873a2a9d0-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 11:00:08 crc kubenswrapper[4962]: I0220 11:00:08.612098 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mr5wb" Feb 20 11:00:08 crc kubenswrapper[4962]: I0220 11:00:08.664053 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mr5wb"] Feb 20 11:00:08 crc kubenswrapper[4962]: I0220 11:00:08.692175 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mr5wb"] Feb 20 11:00:09 crc kubenswrapper[4962]: I0220 11:00:09.162504 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e53575b-37ab-4c46-be3f-6ac873a2a9d0" path="/var/lib/kubelet/pods/4e53575b-37ab-4c46-be3f-6ac873a2a9d0/volumes" Feb 20 11:00:16 crc kubenswrapper[4962]: I0220 11:00:16.681438 4962 scope.go:117] "RemoveContainer" containerID="c54639681debdeffda54130d89e4883eb7658c42414168fa95eba4479a2f093f" Feb 20 11:02:11 crc kubenswrapper[4962]: I0220 11:02:11.508285 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 11:02:11 crc kubenswrapper[4962]: I0220 11:02:11.508995 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 11:02:41 crc kubenswrapper[4962]: I0220 11:02:41.507984 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 11:02:41 crc kubenswrapper[4962]: I0220 11:02:41.508667 4962 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 11:03:11 crc kubenswrapper[4962]: I0220 11:03:11.508288 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 11:03:11 crc kubenswrapper[4962]: I0220 11:03:11.509038 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 11:03:11 crc kubenswrapper[4962]: I0220 11:03:11.509112 4962 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" Feb 20 11:03:11 crc kubenswrapper[4962]: I0220 11:03:11.510018 4962 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b13da16a083c998b7c489b8e1d193fb0bd68b351ea17cf6f4129bf436fc9bf7b"} pod="openshift-machine-config-operator/machine-config-daemon-m9d46" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 11:03:11 crc kubenswrapper[4962]: I0220 11:03:11.510144 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" containerID="cri-o://b13da16a083c998b7c489b8e1d193fb0bd68b351ea17cf6f4129bf436fc9bf7b" gracePeriod=600 Feb 20 11:03:12 crc kubenswrapper[4962]: I0220 11:03:12.299559 4962 generic.go:334] "Generic (PLEG): container finished" podID="751d5e0b-919c-4777-8475-ed7214f7647f" containerID="b13da16a083c998b7c489b8e1d193fb0bd68b351ea17cf6f4129bf436fc9bf7b" exitCode=0 Feb 20 11:03:12 crc kubenswrapper[4962]: I0220 11:03:12.299693 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" event={"ID":"751d5e0b-919c-4777-8475-ed7214f7647f","Type":"ContainerDied","Data":"b13da16a083c998b7c489b8e1d193fb0bd68b351ea17cf6f4129bf436fc9bf7b"} Feb 20 11:03:12 crc kubenswrapper[4962]: I0220 11:03:12.299991 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" event={"ID":"751d5e0b-919c-4777-8475-ed7214f7647f","Type":"ContainerStarted","Data":"e9330f612fad194ce74c43dcd96cb17f9c60e761905737e1f0f1b3362edac70b"} Feb 20 11:03:12 crc kubenswrapper[4962]: I0220 11:03:12.300016 4962 scope.go:117] "RemoveContainer" containerID="352c11f5a21be22d0a2a8db4a2977485f1b5ed88f19194a79937d4da6776e7b5" Feb 20 11:05:11 crc kubenswrapper[4962]: I0220 11:05:11.507891 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 11:05:11 crc 
kubenswrapper[4962]: I0220 11:05:11.508880 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 11:05:31 crc kubenswrapper[4962]: I0220 11:05:31.383729 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-tzxjg"] Feb 20 11:05:31 crc kubenswrapper[4962]: E0220 11:05:31.385154 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84d46def-f006-4801-a633-a88796c6dc6b" containerName="registry-server" Feb 20 11:05:31 crc kubenswrapper[4962]: I0220 11:05:31.385186 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="84d46def-f006-4801-a633-a88796c6dc6b" containerName="registry-server" Feb 20 11:05:31 crc kubenswrapper[4962]: E0220 11:05:31.385212 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e53575b-37ab-4c46-be3f-6ac873a2a9d0" containerName="extract-utilities" Feb 20 11:05:31 crc kubenswrapper[4962]: I0220 11:05:31.385229 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e53575b-37ab-4c46-be3f-6ac873a2a9d0" containerName="extract-utilities" Feb 20 11:05:31 crc kubenswrapper[4962]: E0220 11:05:31.385417 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84d46def-f006-4801-a633-a88796c6dc6b" containerName="extract-content" Feb 20 11:05:31 crc kubenswrapper[4962]: I0220 11:05:31.385435 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="84d46def-f006-4801-a633-a88796c6dc6b" containerName="extract-content" Feb 20 11:05:31 crc kubenswrapper[4962]: E0220 11:05:31.385467 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e53575b-37ab-4c46-be3f-6ac873a2a9d0" containerName="registry-server" Feb 20 11:05:31 crc kubenswrapper[4962]: I0220 11:05:31.385482 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e53575b-37ab-4c46-be3f-6ac873a2a9d0" containerName="registry-server" Feb 20 11:05:31 crc kubenswrapper[4962]: E0220 11:05:31.385509 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="444fe6ec-91a1-4572-9cc7-59f9848bd957" containerName="collect-profiles" Feb 20 11:05:31 crc kubenswrapper[4962]: I0220 11:05:31.385525 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="444fe6ec-91a1-4572-9cc7-59f9848bd957" containerName="collect-profiles" Feb 20 11:05:31 crc kubenswrapper[4962]: E0220 11:05:31.385553 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84d46def-f006-4801-a633-a88796c6dc6b" containerName="extract-utilities" Feb 20 11:05:31 crc kubenswrapper[4962]: I0220 11:05:31.385569 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="84d46def-f006-4801-a633-a88796c6dc6b" containerName="extract-utilities" Feb 20 11:05:31 crc kubenswrapper[4962]: E0220 11:05:31.385639 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e53575b-37ab-4c46-be3f-6ac873a2a9d0" containerName="extract-content" Feb 20 11:05:31 crc kubenswrapper[4962]: I0220 11:05:31.385656 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e53575b-37ab-4c46-be3f-6ac873a2a9d0" containerName="extract-content" Feb 20 11:05:31 crc kubenswrapper[4962]: I0220 11:05:31.385981 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="444fe6ec-91a1-4572-9cc7-59f9848bd957" containerName="collect-profiles" Feb 20 11:05:31 
crc kubenswrapper[4962]: I0220 11:05:31.386027 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="84d46def-f006-4801-a633-a88796c6dc6b" containerName="registry-server" Feb 20 11:05:31 crc kubenswrapper[4962]: I0220 11:05:31.386052 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e53575b-37ab-4c46-be3f-6ac873a2a9d0" containerName="registry-server" Feb 20 11:05:31 crc kubenswrapper[4962]: I0220 11:05:31.388342 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tzxjg" Feb 20 11:05:31 crc kubenswrapper[4962]: I0220 11:05:31.403941 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tzxjg"] Feb 20 11:05:31 crc kubenswrapper[4962]: I0220 11:05:31.484925 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4mjm\" (UniqueName: \"kubernetes.io/projected/f21d4aaf-2f5d-4576-a1e1-b8c233e285f1-kube-api-access-p4mjm\") pod \"certified-operators-tzxjg\" (UID: \"f21d4aaf-2f5d-4576-a1e1-b8c233e285f1\") " pod="openshift-marketplace/certified-operators-tzxjg" Feb 20 11:05:31 crc kubenswrapper[4962]: I0220 11:05:31.485001 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f21d4aaf-2f5d-4576-a1e1-b8c233e285f1-catalog-content\") pod \"certified-operators-tzxjg\" (UID: \"f21d4aaf-2f5d-4576-a1e1-b8c233e285f1\") " pod="openshift-marketplace/certified-operators-tzxjg" Feb 20 11:05:31 crc kubenswrapper[4962]: I0220 11:05:31.485073 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f21d4aaf-2f5d-4576-a1e1-b8c233e285f1-utilities\") pod \"certified-operators-tzxjg\" (UID: \"f21d4aaf-2f5d-4576-a1e1-b8c233e285f1\") " pod="openshift-marketplace/certified-operators-tzxjg" Feb 20 11:05:31 crc kubenswrapper[4962]: I0220 11:05:31.586973 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4mjm\" (UniqueName: \"kubernetes.io/projected/f21d4aaf-2f5d-4576-a1e1-b8c233e285f1-kube-api-access-p4mjm\") pod \"certified-operators-tzxjg\" (UID: \"f21d4aaf-2f5d-4576-a1e1-b8c233e285f1\") " pod="openshift-marketplace/certified-operators-tzxjg" Feb 20 11:05:31 crc kubenswrapper[4962]: I0220 11:05:31.587029 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f21d4aaf-2f5d-4576-a1e1-b8c233e285f1-catalog-content\") pod \"certified-operators-tzxjg\" (UID: \"f21d4aaf-2f5d-4576-a1e1-b8c233e285f1\") " pod="openshift-marketplace/certified-operators-tzxjg" Feb 20 11:05:31 crc kubenswrapper[4962]: I0220 11:05:31.587057 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f21d4aaf-2f5d-4576-a1e1-b8c233e285f1-utilities\") pod \"certified-operators-tzxjg\" (UID: \"f21d4aaf-2f5d-4576-a1e1-b8c233e285f1\") " pod="openshift-marketplace/certified-operators-tzxjg" Feb 20 11:05:31 crc kubenswrapper[4962]: I0220 11:05:31.587719 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f21d4aaf-2f5d-4576-a1e1-b8c233e285f1-utilities\") pod \"certified-operators-tzxjg\" (UID: \"f21d4aaf-2f5d-4576-a1e1-b8c233e285f1\") " 
pod="openshift-marketplace/certified-operators-tzxjg" Feb 20 11:05:31 crc kubenswrapper[4962]: I0220 11:05:31.587799 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f21d4aaf-2f5d-4576-a1e1-b8c233e285f1-catalog-content\") pod \"certified-operators-tzxjg\" (UID: \"f21d4aaf-2f5d-4576-a1e1-b8c233e285f1\") " pod="openshift-marketplace/certified-operators-tzxjg" Feb 20 11:05:31 crc kubenswrapper[4962]: I0220 11:05:31.609910 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4mjm\" (UniqueName: \"kubernetes.io/projected/f21d4aaf-2f5d-4576-a1e1-b8c233e285f1-kube-api-access-p4mjm\") pod \"certified-operators-tzxjg\" (UID: \"f21d4aaf-2f5d-4576-a1e1-b8c233e285f1\") " pod="openshift-marketplace/certified-operators-tzxjg" Feb 20 11:05:31 crc kubenswrapper[4962]: I0220 11:05:31.720455 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tzxjg" Feb 20 11:05:32 crc kubenswrapper[4962]: I0220 11:05:32.234628 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tzxjg"] Feb 20 11:05:32 crc kubenswrapper[4962]: I0220 11:05:32.562220 4962 generic.go:334] "Generic (PLEG): container finished" podID="f21d4aaf-2f5d-4576-a1e1-b8c233e285f1" containerID="dcb86e3eec5ec0c3d8c242ccea873e40fc4246d7d95f6d3de2d2989d0bc5d1d9" exitCode=0 Feb 20 11:05:32 crc kubenswrapper[4962]: I0220 11:05:32.562313 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tzxjg" event={"ID":"f21d4aaf-2f5d-4576-a1e1-b8c233e285f1","Type":"ContainerDied","Data":"dcb86e3eec5ec0c3d8c242ccea873e40fc4246d7d95f6d3de2d2989d0bc5d1d9"} Feb 20 11:05:32 crc kubenswrapper[4962]: I0220 11:05:32.562422 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tzxjg" event={"ID":"f21d4aaf-2f5d-4576-a1e1-b8c233e285f1","Type":"ContainerStarted","Data":"a2e4d3ebe9ffa77afb7c06a23d865e75f7c83845a11917eeb9f33f455fc0e38b"} Feb 20 11:05:32 crc kubenswrapper[4962]: I0220 11:05:32.564661 4962 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 20 11:05:36 crc kubenswrapper[4962]: I0220 11:05:36.596908 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tzxjg" event={"ID":"f21d4aaf-2f5d-4576-a1e1-b8c233e285f1","Type":"ContainerStarted","Data":"52b89b420f7ad1845f0c6551a2469ff5f038736c139b94a10b318273d8f78738"} Feb 20 11:05:37 crc kubenswrapper[4962]: I0220 11:05:37.608990 4962 generic.go:334] "Generic (PLEG): container finished" podID="f21d4aaf-2f5d-4576-a1e1-b8c233e285f1" containerID="52b89b420f7ad1845f0c6551a2469ff5f038736c139b94a10b318273d8f78738" exitCode=0 Feb 20 11:05:37 crc kubenswrapper[4962]: I0220 11:05:37.609117 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tzxjg" event={"ID":"f21d4aaf-2f5d-4576-a1e1-b8c233e285f1","Type":"ContainerDied","Data":"52b89b420f7ad1845f0c6551a2469ff5f038736c139b94a10b318273d8f78738"} Feb 20 11:05:38 crc kubenswrapper[4962]: I0220 11:05:38.621436 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tzxjg" event={"ID":"f21d4aaf-2f5d-4576-a1e1-b8c233e285f1","Type":"ContainerStarted","Data":"9d52fba61158f0e1cb7806757e0716b4eac9df35c05b2133602134586966576a"} Feb 20 11:05:38 crc kubenswrapper[4962]: I0220 
11:05:38.649131 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-tzxjg" podStartSLOduration=2.172155068 podStartE2EDuration="7.649110077s" podCreationTimestamp="2026-02-20 11:05:31 +0000 UTC" firstStartedPulling="2026-02-20 11:05:32.564208867 +0000 UTC m=+4224.146680753" lastFinishedPulling="2026-02-20 11:05:38.041163876 +0000 UTC m=+4229.623635762" observedRunningTime="2026-02-20 11:05:38.647433566 +0000 UTC m=+4230.229905432" watchObservedRunningTime="2026-02-20 11:05:38.649110077 +0000 UTC m=+4230.231581943" Feb 20 11:05:41 crc kubenswrapper[4962]: I0220 11:05:41.507717 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 11:05:41 crc kubenswrapper[4962]: I0220 11:05:41.508668 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 11:05:41 crc kubenswrapper[4962]: I0220 11:05:41.721051 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-tzxjg" Feb 20 11:05:41 crc kubenswrapper[4962]: I0220 11:05:41.721120 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-tzxjg" Feb 20 11:05:41 crc kubenswrapper[4962]: I0220 11:05:41.790332 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-tzxjg" Feb 20 11:05:51 crc kubenswrapper[4962]: I0220 11:05:51.790540 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-tzxjg" Feb 20 11:05:51 crc kubenswrapper[4962]: I0220 11:05:51.904498 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tzxjg"] Feb 20 11:05:51 crc kubenswrapper[4962]: I0220 11:05:51.966707 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-v7zlm"] Feb 20 11:05:51 crc kubenswrapper[4962]: I0220 11:05:51.967084 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-v7zlm" podUID="a63e8904-d4b9-405f-94a1-f44cb565b3e7" containerName="registry-server" containerID="cri-o://89e68b25becf346a76d704d66f2fa088754a410ce947cc372fb175cd6ff921ab" gracePeriod=2 Feb 20 11:05:52 crc kubenswrapper[4962]: I0220 11:05:52.758037 4962 generic.go:334] "Generic (PLEG): container finished" podID="a63e8904-d4b9-405f-94a1-f44cb565b3e7" containerID="89e68b25becf346a76d704d66f2fa088754a410ce947cc372fb175cd6ff921ab" exitCode=0 Feb 20 11:05:52 crc kubenswrapper[4962]: I0220 11:05:52.758729 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v7zlm" event={"ID":"a63e8904-d4b9-405f-94a1-f44cb565b3e7","Type":"ContainerDied","Data":"89e68b25becf346a76d704d66f2fa088754a410ce947cc372fb175cd6ff921ab"} Feb 20 11:05:52 crc kubenswrapper[4962]: I0220 11:05:52.859428 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-v7zlm" Feb 20 11:05:53 crc kubenswrapper[4962]: I0220 11:05:53.059911 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-66gmr\" (UniqueName: \"kubernetes.io/projected/a63e8904-d4b9-405f-94a1-f44cb565b3e7-kube-api-access-66gmr\") pod \"a63e8904-d4b9-405f-94a1-f44cb565b3e7\" (UID: \"a63e8904-d4b9-405f-94a1-f44cb565b3e7\") " Feb 20 11:05:53 crc kubenswrapper[4962]: I0220 11:05:53.059972 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a63e8904-d4b9-405f-94a1-f44cb565b3e7-utilities\") pod \"a63e8904-d4b9-405f-94a1-f44cb565b3e7\" (UID: \"a63e8904-d4b9-405f-94a1-f44cb565b3e7\") " Feb 20 11:05:53 crc kubenswrapper[4962]: I0220 11:05:53.060040 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a63e8904-d4b9-405f-94a1-f44cb565b3e7-catalog-content\") pod \"a63e8904-d4b9-405f-94a1-f44cb565b3e7\" (UID: \"a63e8904-d4b9-405f-94a1-f44cb565b3e7\") " Feb 20 11:05:53 crc kubenswrapper[4962]: I0220 11:05:53.060606 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a63e8904-d4b9-405f-94a1-f44cb565b3e7-utilities" (OuterVolumeSpecName: "utilities") pod "a63e8904-d4b9-405f-94a1-f44cb565b3e7" (UID: "a63e8904-d4b9-405f-94a1-f44cb565b3e7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 11:05:53 crc kubenswrapper[4962]: I0220 11:05:53.066918 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a63e8904-d4b9-405f-94a1-f44cb565b3e7-kube-api-access-66gmr" (OuterVolumeSpecName: "kube-api-access-66gmr") pod "a63e8904-d4b9-405f-94a1-f44cb565b3e7" (UID: "a63e8904-d4b9-405f-94a1-f44cb565b3e7"). InnerVolumeSpecName "kube-api-access-66gmr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 11:05:53 crc kubenswrapper[4962]: I0220 11:05:53.102926 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a63e8904-d4b9-405f-94a1-f44cb565b3e7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a63e8904-d4b9-405f-94a1-f44cb565b3e7" (UID: "a63e8904-d4b9-405f-94a1-f44cb565b3e7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 11:05:53 crc kubenswrapper[4962]: I0220 11:05:53.161828 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-66gmr\" (UniqueName: \"kubernetes.io/projected/a63e8904-d4b9-405f-94a1-f44cb565b3e7-kube-api-access-66gmr\") on node \"crc\" DevicePath \"\"" Feb 20 11:05:53 crc kubenswrapper[4962]: I0220 11:05:53.162122 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a63e8904-d4b9-405f-94a1-f44cb565b3e7-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 11:05:53 crc kubenswrapper[4962]: I0220 11:05:53.162213 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a63e8904-d4b9-405f-94a1-f44cb565b3e7-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 11:05:53 crc kubenswrapper[4962]: I0220 11:05:53.776513 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v7zlm" event={"ID":"a63e8904-d4b9-405f-94a1-f44cb565b3e7","Type":"ContainerDied","Data":"8a31164fd2d499337255d6b6c8ff41059d39c8ca1c0d9b36dc4180dcc63a2f70"} Feb 20 11:05:53 crc kubenswrapper[4962]: I0220 11:05:53.776636 4962 scope.go:117] "RemoveContainer" containerID="89e68b25becf346a76d704d66f2fa088754a410ce947cc372fb175cd6ff921ab" Feb 20 11:05:53 crc kubenswrapper[4962]: I0220 11:05:53.776870 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-v7zlm" Feb 20 11:05:53 crc kubenswrapper[4962]: I0220 11:05:53.814321 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-v7zlm"] Feb 20 11:05:53 crc kubenswrapper[4962]: I0220 11:05:53.822330 4962 scope.go:117] "RemoveContainer" containerID="4a3a18f226977c3365d615b122597c40567fbf4037342852763a1652d9c44e94" Feb 20 11:05:53 crc kubenswrapper[4962]: I0220 11:05:53.847034 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-v7zlm"] Feb 20 11:05:53 crc kubenswrapper[4962]: I0220 11:05:53.866887 4962 scope.go:117] "RemoveContainer" containerID="0dadcfbc9540e03300cad39a3785cea06d52c326b71fdd2b10314f009df918de" Feb 20 11:05:55 crc kubenswrapper[4962]: I0220 11:05:55.158466 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a63e8904-d4b9-405f-94a1-f44cb565b3e7" path="/var/lib/kubelet/pods/a63e8904-d4b9-405f-94a1-f44cb565b3e7/volumes" Feb 20 11:06:11 crc kubenswrapper[4962]: I0220 11:06:11.507860 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 11:06:11 crc kubenswrapper[4962]: I0220 11:06:11.508714 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 11:06:11 crc kubenswrapper[4962]: I0220 11:06:11.508802 4962 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" Feb 20 11:06:11 crc kubenswrapper[4962]: I0220 11:06:11.509894 4962 
kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e9330f612fad194ce74c43dcd96cb17f9c60e761905737e1f0f1b3362edac70b"} pod="openshift-machine-config-operator/machine-config-daemon-m9d46" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 11:06:11 crc kubenswrapper[4962]: I0220 11:06:11.510004 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" containerID="cri-o://e9330f612fad194ce74c43dcd96cb17f9c60e761905737e1f0f1b3362edac70b" gracePeriod=600 Feb 20 11:06:11 crc kubenswrapper[4962]: E0220 11:06:11.783839 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 11:06:11 crc kubenswrapper[4962]: I0220 11:06:11.962976 4962 generic.go:334] "Generic (PLEG): container finished" podID="751d5e0b-919c-4777-8475-ed7214f7647f" containerID="e9330f612fad194ce74c43dcd96cb17f9c60e761905737e1f0f1b3362edac70b" exitCode=0 Feb 20 11:06:11 crc kubenswrapper[4962]: I0220 11:06:11.963046 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" event={"ID":"751d5e0b-919c-4777-8475-ed7214f7647f","Type":"ContainerDied","Data":"e9330f612fad194ce74c43dcd96cb17f9c60e761905737e1f0f1b3362edac70b"} Feb 20 11:06:11 crc kubenswrapper[4962]: I0220 11:06:11.963097 4962 scope.go:117] "RemoveContainer" containerID="b13da16a083c998b7c489b8e1d193fb0bd68b351ea17cf6f4129bf436fc9bf7b" Feb 20 11:06:11 crc kubenswrapper[4962]: I0220 11:06:11.963986 4962 scope.go:117] "RemoveContainer" containerID="e9330f612fad194ce74c43dcd96cb17f9c60e761905737e1f0f1b3362edac70b" Feb 20 11:06:11 crc kubenswrapper[4962]: E0220 11:06:11.964452 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 11:06:16 crc kubenswrapper[4962]: I0220 11:06:16.863413 4962 scope.go:117] "RemoveContainer" containerID="1f35394b405a7932de6ea65a40b90b4cfc16077b45269a93e08d860ad8ab92d6" Feb 20 11:06:16 crc kubenswrapper[4962]: I0220 11:06:16.890871 4962 scope.go:117] "RemoveContainer" containerID="9c0af593b4b43f8781c7a5aba922acad47038b25f5521623f3ea1762a03d3532" Feb 20 11:06:16 crc kubenswrapper[4962]: I0220 11:06:16.918039 4962 scope.go:117] "RemoveContainer" containerID="02a91fcfd5ea9a6b32532b3d0e1f6e6d142e678ce53b20f778f4d99dcb76e30e" Feb 20 11:06:23 crc kubenswrapper[4962]: I0220 11:06:23.138860 4962 scope.go:117] "RemoveContainer" containerID="e9330f612fad194ce74c43dcd96cb17f9c60e761905737e1f0f1b3362edac70b" Feb 20 11:06:23 crc kubenswrapper[4962]: E0220 11:06:23.139689 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 11:06:34 crc kubenswrapper[4962]: I0220 11:06:34.139115 4962 scope.go:117] "RemoveContainer" containerID="e9330f612fad194ce74c43dcd96cb17f9c60e761905737e1f0f1b3362edac70b" Feb 20 11:06:34 crc kubenswrapper[4962]: E0220 11:06:34.141048 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 11:06:46 crc kubenswrapper[4962]: I0220 11:06:46.139853 4962 scope.go:117] "RemoveContainer" containerID="e9330f612fad194ce74c43dcd96cb17f9c60e761905737e1f0f1b3362edac70b" Feb 20 11:06:46 crc kubenswrapper[4962]: E0220 11:06:46.140840 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 11:07:01 crc kubenswrapper[4962]: I0220 11:07:01.139994 4962 scope.go:117] "RemoveContainer" containerID="e9330f612fad194ce74c43dcd96cb17f9c60e761905737e1f0f1b3362edac70b" Feb 20 11:07:01 crc kubenswrapper[4962]: E0220 11:07:01.141332 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 11:07:16 crc kubenswrapper[4962]: I0220 11:07:16.138864 4962 scope.go:117] "RemoveContainer" containerID="e9330f612fad194ce74c43dcd96cb17f9c60e761905737e1f0f1b3362edac70b" Feb 20 11:07:16 crc kubenswrapper[4962]: E0220 11:07:16.141038 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 11:07:31 crc kubenswrapper[4962]: I0220 11:07:31.138953 4962 scope.go:117] "RemoveContainer" containerID="e9330f612fad194ce74c43dcd96cb17f9c60e761905737e1f0f1b3362edac70b" Feb 20 11:07:31 crc kubenswrapper[4962]: E0220 11:07:31.139926 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 11:07:43 crc kubenswrapper[4962]: I0220 11:07:43.139800 4962 scope.go:117] "RemoveContainer" containerID="e9330f612fad194ce74c43dcd96cb17f9c60e761905737e1f0f1b3362edac70b" Feb 20 11:07:43 crc kubenswrapper[4962]: E0220 11:07:43.141782 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 11:07:52 crc kubenswrapper[4962]: I0220 11:07:52.871162 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-78slw"] Feb 20 11:07:52 crc kubenswrapper[4962]: E0220 11:07:52.872125 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a63e8904-d4b9-405f-94a1-f44cb565b3e7" containerName="extract-utilities" Feb 20 11:07:52 crc kubenswrapper[4962]: I0220 11:07:52.872140 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="a63e8904-d4b9-405f-94a1-f44cb565b3e7" containerName="extract-utilities" Feb 20 11:07:52 crc kubenswrapper[4962]: E0220 11:07:52.872160 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a63e8904-d4b9-405f-94a1-f44cb565b3e7" containerName="registry-server" Feb 20 11:07:52 crc kubenswrapper[4962]: I0220 11:07:52.872170 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="a63e8904-d4b9-405f-94a1-f44cb565b3e7" containerName="registry-server" Feb 20 11:07:52 crc kubenswrapper[4962]: E0220 11:07:52.872181 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a63e8904-d4b9-405f-94a1-f44cb565b3e7" containerName="extract-content" Feb 20 11:07:52 crc kubenswrapper[4962]: I0220 11:07:52.872189 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="a63e8904-d4b9-405f-94a1-f44cb565b3e7" containerName="extract-content" Feb 20 11:07:52 crc kubenswrapper[4962]: I0220 11:07:52.872365 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="a63e8904-d4b9-405f-94a1-f44cb565b3e7" containerName="registry-server" Feb 20 11:07:52 crc kubenswrapper[4962]: I0220 11:07:52.873644 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-78slw" Feb 20 11:07:52 crc kubenswrapper[4962]: I0220 11:07:52.886154 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-78slw"] Feb 20 11:07:53 crc kubenswrapper[4962]: I0220 11:07:53.059785 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8f75fa9-60ea-40ba-861c-78dbce63f152-utilities\") pod \"redhat-marketplace-78slw\" (UID: \"b8f75fa9-60ea-40ba-861c-78dbce63f152\") " pod="openshift-marketplace/redhat-marketplace-78slw" Feb 20 11:07:53 crc kubenswrapper[4962]: I0220 11:07:53.059849 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5w7n\" (UniqueName: \"kubernetes.io/projected/b8f75fa9-60ea-40ba-861c-78dbce63f152-kube-api-access-m5w7n\") pod \"redhat-marketplace-78slw\" (UID: \"b8f75fa9-60ea-40ba-861c-78dbce63f152\") " pod="openshift-marketplace/redhat-marketplace-78slw" Feb 20 11:07:53 crc kubenswrapper[4962]: I0220 11:07:53.059874 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8f75fa9-60ea-40ba-861c-78dbce63f152-catalog-content\") pod \"redhat-marketplace-78slw\" (UID: \"b8f75fa9-60ea-40ba-861c-78dbce63f152\") " pod="openshift-marketplace/redhat-marketplace-78slw" Feb 20 11:07:53 crc kubenswrapper[4962]: I0220 11:07:53.160699 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8f75fa9-60ea-40ba-861c-78dbce63f152-utilities\") pod \"redhat-marketplace-78slw\" (UID: \"b8f75fa9-60ea-40ba-861c-78dbce63f152\") " pod="openshift-marketplace/redhat-marketplace-78slw" Feb 20 11:07:53 crc kubenswrapper[4962]: I0220 11:07:53.160758 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5w7n\" (UniqueName: \"kubernetes.io/projected/b8f75fa9-60ea-40ba-861c-78dbce63f152-kube-api-access-m5w7n\") pod \"redhat-marketplace-78slw\" (UID: \"b8f75fa9-60ea-40ba-861c-78dbce63f152\") " pod="openshift-marketplace/redhat-marketplace-78slw" Feb 20 11:07:53 crc kubenswrapper[4962]: I0220 11:07:53.160781 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8f75fa9-60ea-40ba-861c-78dbce63f152-catalog-content\") pod \"redhat-marketplace-78slw\" (UID: \"b8f75fa9-60ea-40ba-861c-78dbce63f152\") " pod="openshift-marketplace/redhat-marketplace-78slw" Feb 20 11:07:53 crc kubenswrapper[4962]: I0220 11:07:53.161274 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8f75fa9-60ea-40ba-861c-78dbce63f152-catalog-content\") pod \"redhat-marketplace-78slw\" (UID: \"b8f75fa9-60ea-40ba-861c-78dbce63f152\") " pod="openshift-marketplace/redhat-marketplace-78slw" Feb 20 11:07:53 crc kubenswrapper[4962]: I0220 11:07:53.161529 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8f75fa9-60ea-40ba-861c-78dbce63f152-utilities\") pod \"redhat-marketplace-78slw\" (UID: \"b8f75fa9-60ea-40ba-861c-78dbce63f152\") " pod="openshift-marketplace/redhat-marketplace-78slw" Feb 20 11:07:53 crc kubenswrapper[4962]: I0220 11:07:53.184490 4962 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-m5w7n\" (UniqueName: \"kubernetes.io/projected/b8f75fa9-60ea-40ba-861c-78dbce63f152-kube-api-access-m5w7n\") pod \"redhat-marketplace-78slw\" (UID: \"b8f75fa9-60ea-40ba-861c-78dbce63f152\") " pod="openshift-marketplace/redhat-marketplace-78slw" Feb 20 11:07:53 crc kubenswrapper[4962]: I0220 11:07:53.195404 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-78slw" Feb 20 11:07:53 crc kubenswrapper[4962]: I0220 11:07:53.478134 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-78slw"] Feb 20 11:07:53 crc kubenswrapper[4962]: I0220 11:07:53.869353 4962 generic.go:334] "Generic (PLEG): container finished" podID="b8f75fa9-60ea-40ba-861c-78dbce63f152" containerID="09b8156f1591f7f4dcc5e7e9e44dd67e63366ef86038384bea90eac7f618f53f" exitCode=0 Feb 20 11:07:53 crc kubenswrapper[4962]: I0220 11:07:53.869480 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-78slw" event={"ID":"b8f75fa9-60ea-40ba-861c-78dbce63f152","Type":"ContainerDied","Data":"09b8156f1591f7f4dcc5e7e9e44dd67e63366ef86038384bea90eac7f618f53f"} Feb 20 11:07:53 crc kubenswrapper[4962]: I0220 11:07:53.871556 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-78slw" event={"ID":"b8f75fa9-60ea-40ba-861c-78dbce63f152","Type":"ContainerStarted","Data":"b9dc7d0ae4328e3eafdc7473677bb84d1e13ba25cac490a152209bb921d94a90"} Feb 20 11:07:54 crc kubenswrapper[4962]: I0220 11:07:54.138943 4962 scope.go:117] "RemoveContainer" containerID="e9330f612fad194ce74c43dcd96cb17f9c60e761905737e1f0f1b3362edac70b" Feb 20 11:07:54 crc kubenswrapper[4962]: E0220 11:07:54.139245 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 11:07:54 crc kubenswrapper[4962]: I0220 11:07:54.883471 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-78slw" event={"ID":"b8f75fa9-60ea-40ba-861c-78dbce63f152","Type":"ContainerStarted","Data":"db97a0adb93f1896a0aaa8ef2f72f09311bf746117bf23bb114bff80418d750b"} Feb 20 11:07:55 crc kubenswrapper[4962]: I0220 11:07:55.895550 4962 generic.go:334] "Generic (PLEG): container finished" podID="b8f75fa9-60ea-40ba-861c-78dbce63f152" containerID="db97a0adb93f1896a0aaa8ef2f72f09311bf746117bf23bb114bff80418d750b" exitCode=0 Feb 20 11:07:55 crc kubenswrapper[4962]: I0220 11:07:55.895633 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-78slw" event={"ID":"b8f75fa9-60ea-40ba-861c-78dbce63f152","Type":"ContainerDied","Data":"db97a0adb93f1896a0aaa8ef2f72f09311bf746117bf23bb114bff80418d750b"} Feb 20 11:07:57 crc kubenswrapper[4962]: I0220 11:07:57.915066 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-78slw" event={"ID":"b8f75fa9-60ea-40ba-861c-78dbce63f152","Type":"ContainerStarted","Data":"1ff0c536ae6cf14ef09b27e02e79d2e4bfa9de23c1b0f79b02209ea9a0e7c4fd"} Feb 20 11:07:57 crc kubenswrapper[4962]: I0220 11:07:57.946400 4962 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-marketplace/redhat-marketplace-78slw" podStartSLOduration=3.480047705 podStartE2EDuration="5.946381481s" podCreationTimestamp="2026-02-20 11:07:52 +0000 UTC" firstStartedPulling="2026-02-20 11:07:53.871345565 +0000 UTC m=+4365.453817421" lastFinishedPulling="2026-02-20 11:07:56.337679311 +0000 UTC m=+4367.920151197" observedRunningTime="2026-02-20 11:07:57.937731837 +0000 UTC m=+4369.520203723" watchObservedRunningTime="2026-02-20 11:07:57.946381481 +0000 UTC m=+4369.528853337" Feb 20 11:08:03 crc kubenswrapper[4962]: I0220 11:08:03.196537 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-78slw" Feb 20 11:08:03 crc kubenswrapper[4962]: I0220 11:08:03.197182 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-78slw" Feb 20 11:08:03 crc kubenswrapper[4962]: I0220 11:08:03.271898 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-78slw" Feb 20 11:08:04 crc kubenswrapper[4962]: I0220 11:08:04.040096 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-78slw" Feb 20 11:08:04 crc kubenswrapper[4962]: I0220 11:08:04.104801 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-78slw"] Feb 20 11:08:05 crc kubenswrapper[4962]: I0220 11:08:05.139463 4962 scope.go:117] "RemoveContainer" containerID="e9330f612fad194ce74c43dcd96cb17f9c60e761905737e1f0f1b3362edac70b" Feb 20 11:08:05 crc kubenswrapper[4962]: E0220 11:08:05.140087 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 11:08:06 crc kubenswrapper[4962]: I0220 11:08:06.000374 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-78slw" podUID="b8f75fa9-60ea-40ba-861c-78dbce63f152" containerName="registry-server" containerID="cri-o://1ff0c536ae6cf14ef09b27e02e79d2e4bfa9de23c1b0f79b02209ea9a0e7c4fd" gracePeriod=2 Feb 20 11:08:06 crc kubenswrapper[4962]: I0220 11:08:06.411405 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-78slw" Feb 20 11:08:06 crc kubenswrapper[4962]: I0220 11:08:06.582124 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8f75fa9-60ea-40ba-861c-78dbce63f152-utilities\") pod \"b8f75fa9-60ea-40ba-861c-78dbce63f152\" (UID: \"b8f75fa9-60ea-40ba-861c-78dbce63f152\") " Feb 20 11:08:06 crc kubenswrapper[4962]: I0220 11:08:06.582465 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5w7n\" (UniqueName: \"kubernetes.io/projected/b8f75fa9-60ea-40ba-861c-78dbce63f152-kube-api-access-m5w7n\") pod \"b8f75fa9-60ea-40ba-861c-78dbce63f152\" (UID: \"b8f75fa9-60ea-40ba-861c-78dbce63f152\") " Feb 20 11:08:06 crc kubenswrapper[4962]: I0220 11:08:06.582628 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8f75fa9-60ea-40ba-861c-78dbce63f152-catalog-content\") pod \"b8f75fa9-60ea-40ba-861c-78dbce63f152\" (UID: \"b8f75fa9-60ea-40ba-861c-78dbce63f152\") " Feb 20 11:08:06 crc kubenswrapper[4962]: I0220 11:08:06.584090 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8f75fa9-60ea-40ba-861c-78dbce63f152-utilities" (OuterVolumeSpecName: "utilities") pod "b8f75fa9-60ea-40ba-861c-78dbce63f152" (UID: "b8f75fa9-60ea-40ba-861c-78dbce63f152"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 11:08:06 crc kubenswrapper[4962]: I0220 11:08:06.590230 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8f75fa9-60ea-40ba-861c-78dbce63f152-kube-api-access-m5w7n" (OuterVolumeSpecName: "kube-api-access-m5w7n") pod "b8f75fa9-60ea-40ba-861c-78dbce63f152" (UID: "b8f75fa9-60ea-40ba-861c-78dbce63f152"). InnerVolumeSpecName "kube-api-access-m5w7n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 11:08:06 crc kubenswrapper[4962]: I0220 11:08:06.604078 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8f75fa9-60ea-40ba-861c-78dbce63f152-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b8f75fa9-60ea-40ba-861c-78dbce63f152" (UID: "b8f75fa9-60ea-40ba-861c-78dbce63f152"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 11:08:06 crc kubenswrapper[4962]: I0220 11:08:06.684159 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8f75fa9-60ea-40ba-861c-78dbce63f152-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 11:08:06 crc kubenswrapper[4962]: I0220 11:08:06.684184 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8f75fa9-60ea-40ba-861c-78dbce63f152-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 11:08:06 crc kubenswrapper[4962]: I0220 11:08:06.684197 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5w7n\" (UniqueName: \"kubernetes.io/projected/b8f75fa9-60ea-40ba-861c-78dbce63f152-kube-api-access-m5w7n\") on node \"crc\" DevicePath \"\"" Feb 20 11:08:07 crc kubenswrapper[4962]: I0220 11:08:07.014133 4962 generic.go:334] "Generic (PLEG): container finished" podID="b8f75fa9-60ea-40ba-861c-78dbce63f152" containerID="1ff0c536ae6cf14ef09b27e02e79d2e4bfa9de23c1b0f79b02209ea9a0e7c4fd" exitCode=0 Feb 20 11:08:07 crc kubenswrapper[4962]: I0220 11:08:07.014190 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-78slw" event={"ID":"b8f75fa9-60ea-40ba-861c-78dbce63f152","Type":"ContainerDied","Data":"1ff0c536ae6cf14ef09b27e02e79d2e4bfa9de23c1b0f79b02209ea9a0e7c4fd"} Feb 20 11:08:07 crc kubenswrapper[4962]: I0220 11:08:07.014239 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-78slw" event={"ID":"b8f75fa9-60ea-40ba-861c-78dbce63f152","Type":"ContainerDied","Data":"b9dc7d0ae4328e3eafdc7473677bb84d1e13ba25cac490a152209bb921d94a90"} Feb 20 11:08:07 crc kubenswrapper[4962]: I0220 11:08:07.014246 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-78slw" Feb 20 11:08:07 crc kubenswrapper[4962]: I0220 11:08:07.014270 4962 scope.go:117] "RemoveContainer" containerID="1ff0c536ae6cf14ef09b27e02e79d2e4bfa9de23c1b0f79b02209ea9a0e7c4fd" Feb 20 11:08:07 crc kubenswrapper[4962]: I0220 11:08:07.045644 4962 scope.go:117] "RemoveContainer" containerID="db97a0adb93f1896a0aaa8ef2f72f09311bf746117bf23bb114bff80418d750b" Feb 20 11:08:07 crc kubenswrapper[4962]: I0220 11:08:07.071544 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-78slw"] Feb 20 11:08:07 crc kubenswrapper[4962]: I0220 11:08:07.083453 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-78slw"] Feb 20 11:08:07 crc kubenswrapper[4962]: I0220 11:08:07.085829 4962 scope.go:117] "RemoveContainer" containerID="09b8156f1591f7f4dcc5e7e9e44dd67e63366ef86038384bea90eac7f618f53f" Feb 20 11:08:07 crc kubenswrapper[4962]: I0220 11:08:07.125061 4962 scope.go:117] "RemoveContainer" containerID="1ff0c536ae6cf14ef09b27e02e79d2e4bfa9de23c1b0f79b02209ea9a0e7c4fd" Feb 20 11:08:07 crc kubenswrapper[4962]: E0220 11:08:07.125776 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ff0c536ae6cf14ef09b27e02e79d2e4bfa9de23c1b0f79b02209ea9a0e7c4fd\": container with ID starting with 1ff0c536ae6cf14ef09b27e02e79d2e4bfa9de23c1b0f79b02209ea9a0e7c4fd not found: ID does not exist" containerID="1ff0c536ae6cf14ef09b27e02e79d2e4bfa9de23c1b0f79b02209ea9a0e7c4fd" Feb 20 11:08:07 crc kubenswrapper[4962]: I0220 11:08:07.125860 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ff0c536ae6cf14ef09b27e02e79d2e4bfa9de23c1b0f79b02209ea9a0e7c4fd"} err="failed to get container status \"1ff0c536ae6cf14ef09b27e02e79d2e4bfa9de23c1b0f79b02209ea9a0e7c4fd\": rpc error: code = NotFound desc = could not find container \"1ff0c536ae6cf14ef09b27e02e79d2e4bfa9de23c1b0f79b02209ea9a0e7c4fd\": container with ID starting with 1ff0c536ae6cf14ef09b27e02e79d2e4bfa9de23c1b0f79b02209ea9a0e7c4fd not found: ID does not exist" Feb 20 11:08:07 crc kubenswrapper[4962]: I0220 11:08:07.125904 4962 scope.go:117] "RemoveContainer" containerID="db97a0adb93f1896a0aaa8ef2f72f09311bf746117bf23bb114bff80418d750b" Feb 20 11:08:07 crc kubenswrapper[4962]: E0220 11:08:07.126643 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db97a0adb93f1896a0aaa8ef2f72f09311bf746117bf23bb114bff80418d750b\": container with ID starting with db97a0adb93f1896a0aaa8ef2f72f09311bf746117bf23bb114bff80418d750b not found: ID does not exist" containerID="db97a0adb93f1896a0aaa8ef2f72f09311bf746117bf23bb114bff80418d750b" Feb 20 11:08:07 crc kubenswrapper[4962]: I0220 11:08:07.126702 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db97a0adb93f1896a0aaa8ef2f72f09311bf746117bf23bb114bff80418d750b"} err="failed to get container status \"db97a0adb93f1896a0aaa8ef2f72f09311bf746117bf23bb114bff80418d750b\": rpc error: code = NotFound desc = could not find container \"db97a0adb93f1896a0aaa8ef2f72f09311bf746117bf23bb114bff80418d750b\": container with ID starting with db97a0adb93f1896a0aaa8ef2f72f09311bf746117bf23bb114bff80418d750b not found: ID does not exist" Feb 20 11:08:07 crc kubenswrapper[4962]: I0220 11:08:07.126743 4962 scope.go:117] "RemoveContainer" 
containerID="09b8156f1591f7f4dcc5e7e9e44dd67e63366ef86038384bea90eac7f618f53f" Feb 20 11:08:07 crc kubenswrapper[4962]: E0220 11:08:07.127212 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09b8156f1591f7f4dcc5e7e9e44dd67e63366ef86038384bea90eac7f618f53f\": container with ID starting with 09b8156f1591f7f4dcc5e7e9e44dd67e63366ef86038384bea90eac7f618f53f not found: ID does not exist" containerID="09b8156f1591f7f4dcc5e7e9e44dd67e63366ef86038384bea90eac7f618f53f" Feb 20 11:08:07 crc kubenswrapper[4962]: I0220 11:08:07.127280 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09b8156f1591f7f4dcc5e7e9e44dd67e63366ef86038384bea90eac7f618f53f"} err="failed to get container status \"09b8156f1591f7f4dcc5e7e9e44dd67e63366ef86038384bea90eac7f618f53f\": rpc error: code = NotFound desc = could not find container \"09b8156f1591f7f4dcc5e7e9e44dd67e63366ef86038384bea90eac7f618f53f\": container with ID starting with 09b8156f1591f7f4dcc5e7e9e44dd67e63366ef86038384bea90eac7f618f53f not found: ID does not exist" Feb 20 11:08:07 crc kubenswrapper[4962]: I0220 11:08:07.152242 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8f75fa9-60ea-40ba-861c-78dbce63f152" path="/var/lib/kubelet/pods/b8f75fa9-60ea-40ba-861c-78dbce63f152/volumes" Feb 20 11:08:19 crc kubenswrapper[4962]: I0220 11:08:19.148663 4962 scope.go:117] "RemoveContainer" containerID="e9330f612fad194ce74c43dcd96cb17f9c60e761905737e1f0f1b3362edac70b" Feb 20 11:08:19 crc kubenswrapper[4962]: E0220 11:08:19.150126 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 11:08:31 crc kubenswrapper[4962]: I0220 11:08:31.138896 4962 scope.go:117] "RemoveContainer" containerID="e9330f612fad194ce74c43dcd96cb17f9c60e761905737e1f0f1b3362edac70b" Feb 20 11:08:31 crc kubenswrapper[4962]: E0220 11:08:31.141436 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 11:08:36 crc kubenswrapper[4962]: I0220 11:08:36.330897 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-9v9g5"] Feb 20 11:08:36 crc kubenswrapper[4962]: I0220 11:08:36.340589 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-9v9g5"] Feb 20 11:08:36 crc kubenswrapper[4962]: I0220 11:08:36.496080 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-jqlcs"] Feb 20 11:08:36 crc kubenswrapper[4962]: E0220 11:08:36.496382 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8f75fa9-60ea-40ba-861c-78dbce63f152" containerName="extract-utilities" Feb 20 11:08:36 crc kubenswrapper[4962]: I0220 11:08:36.496397 4962 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="b8f75fa9-60ea-40ba-861c-78dbce63f152" containerName="extract-utilities" Feb 20 11:08:36 crc kubenswrapper[4962]: E0220 11:08:36.496424 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8f75fa9-60ea-40ba-861c-78dbce63f152" containerName="registry-server" Feb 20 11:08:36 crc kubenswrapper[4962]: I0220 11:08:36.496434 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8f75fa9-60ea-40ba-861c-78dbce63f152" containerName="registry-server" Feb 20 11:08:36 crc kubenswrapper[4962]: E0220 11:08:36.496464 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8f75fa9-60ea-40ba-861c-78dbce63f152" containerName="extract-content" Feb 20 11:08:36 crc kubenswrapper[4962]: I0220 11:08:36.496473 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8f75fa9-60ea-40ba-861c-78dbce63f152" containerName="extract-content" Feb 20 11:08:36 crc kubenswrapper[4962]: I0220 11:08:36.496650 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8f75fa9-60ea-40ba-861c-78dbce63f152" containerName="registry-server" Feb 20 11:08:36 crc kubenswrapper[4962]: I0220 11:08:36.497186 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-jqlcs" Feb 20 11:08:36 crc kubenswrapper[4962]: I0220 11:08:36.499768 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Feb 20 11:08:36 crc kubenswrapper[4962]: I0220 11:08:36.500766 4962 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-bwxwq" Feb 20 11:08:36 crc kubenswrapper[4962]: I0220 11:08:36.505816 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Feb 20 11:08:36 crc kubenswrapper[4962]: I0220 11:08:36.506025 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Feb 20 11:08:36 crc kubenswrapper[4962]: I0220 11:08:36.517454 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-jqlcs"] Feb 20 11:08:36 crc kubenswrapper[4962]: I0220 11:08:36.597810 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/661e27e5-1795-405b-af57-a6f0901b654e-node-mnt\") pod \"crc-storage-crc-jqlcs\" (UID: \"661e27e5-1795-405b-af57-a6f0901b654e\") " pod="crc-storage/crc-storage-crc-jqlcs" Feb 20 11:08:36 crc kubenswrapper[4962]: I0220 11:08:36.597915 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/661e27e5-1795-405b-af57-a6f0901b654e-crc-storage\") pod \"crc-storage-crc-jqlcs\" (UID: \"661e27e5-1795-405b-af57-a6f0901b654e\") " pod="crc-storage/crc-storage-crc-jqlcs" Feb 20 11:08:36 crc kubenswrapper[4962]: I0220 11:08:36.597984 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmtm9\" (UniqueName: \"kubernetes.io/projected/661e27e5-1795-405b-af57-a6f0901b654e-kube-api-access-wmtm9\") pod \"crc-storage-crc-jqlcs\" (UID: \"661e27e5-1795-405b-af57-a6f0901b654e\") " pod="crc-storage/crc-storage-crc-jqlcs" Feb 20 11:08:36 crc kubenswrapper[4962]: I0220 11:08:36.699932 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/661e27e5-1795-405b-af57-a6f0901b654e-node-mnt\") pod \"crc-storage-crc-jqlcs\" (UID: 
\"661e27e5-1795-405b-af57-a6f0901b654e\") " pod="crc-storage/crc-storage-crc-jqlcs" Feb 20 11:08:36 crc kubenswrapper[4962]: I0220 11:08:36.700058 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/661e27e5-1795-405b-af57-a6f0901b654e-crc-storage\") pod \"crc-storage-crc-jqlcs\" (UID: \"661e27e5-1795-405b-af57-a6f0901b654e\") " pod="crc-storage/crc-storage-crc-jqlcs" Feb 20 11:08:36 crc kubenswrapper[4962]: I0220 11:08:36.700179 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmtm9\" (UniqueName: \"kubernetes.io/projected/661e27e5-1795-405b-af57-a6f0901b654e-kube-api-access-wmtm9\") pod \"crc-storage-crc-jqlcs\" (UID: \"661e27e5-1795-405b-af57-a6f0901b654e\") " pod="crc-storage/crc-storage-crc-jqlcs" Feb 20 11:08:36 crc kubenswrapper[4962]: I0220 11:08:36.700414 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/661e27e5-1795-405b-af57-a6f0901b654e-node-mnt\") pod \"crc-storage-crc-jqlcs\" (UID: \"661e27e5-1795-405b-af57-a6f0901b654e\") " pod="crc-storage/crc-storage-crc-jqlcs" Feb 20 11:08:36 crc kubenswrapper[4962]: I0220 11:08:36.701035 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/661e27e5-1795-405b-af57-a6f0901b654e-crc-storage\") pod \"crc-storage-crc-jqlcs\" (UID: \"661e27e5-1795-405b-af57-a6f0901b654e\") " pod="crc-storage/crc-storage-crc-jqlcs" Feb 20 11:08:36 crc kubenswrapper[4962]: I0220 11:08:36.719826 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmtm9\" (UniqueName: \"kubernetes.io/projected/661e27e5-1795-405b-af57-a6f0901b654e-kube-api-access-wmtm9\") pod \"crc-storage-crc-jqlcs\" (UID: \"661e27e5-1795-405b-af57-a6f0901b654e\") " pod="crc-storage/crc-storage-crc-jqlcs" Feb 20 11:08:36 crc kubenswrapper[4962]: I0220 11:08:36.862093 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-jqlcs" Feb 20 11:08:37 crc kubenswrapper[4962]: I0220 11:08:37.129950 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-jqlcs"] Feb 20 11:08:37 crc kubenswrapper[4962]: I0220 11:08:37.153879 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6423ea5e-20ed-4977-a842-2bc521939341" path="/var/lib/kubelet/pods/6423ea5e-20ed-4977-a842-2bc521939341/volumes" Feb 20 11:08:37 crc kubenswrapper[4962]: I0220 11:08:37.272820 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-jqlcs" event={"ID":"661e27e5-1795-405b-af57-a6f0901b654e","Type":"ContainerStarted","Data":"8b05c1d5829bd42422a14fb650178b9eaa0bcb462ebc9138f3f91ed1ce433170"} Feb 20 11:08:38 crc kubenswrapper[4962]: I0220 11:08:38.282813 4962 generic.go:334] "Generic (PLEG): container finished" podID="661e27e5-1795-405b-af57-a6f0901b654e" containerID="0fe78591e142b00ce0a1305d692098cbd58316c27a1f913e16fe16879c83db51" exitCode=0 Feb 20 11:08:38 crc kubenswrapper[4962]: I0220 11:08:38.282893 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-jqlcs" event={"ID":"661e27e5-1795-405b-af57-a6f0901b654e","Type":"ContainerDied","Data":"0fe78591e142b00ce0a1305d692098cbd58316c27a1f913e16fe16879c83db51"} Feb 20 11:08:39 crc kubenswrapper[4962]: I0220 11:08:39.664812 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-jqlcs" Feb 20 11:08:39 crc kubenswrapper[4962]: I0220 11:08:39.751254 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/661e27e5-1795-405b-af57-a6f0901b654e-crc-storage\") pod \"661e27e5-1795-405b-af57-a6f0901b654e\" (UID: \"661e27e5-1795-405b-af57-a6f0901b654e\") " Feb 20 11:08:39 crc kubenswrapper[4962]: I0220 11:08:39.751726 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wmtm9\" (UniqueName: \"kubernetes.io/projected/661e27e5-1795-405b-af57-a6f0901b654e-kube-api-access-wmtm9\") pod \"661e27e5-1795-405b-af57-a6f0901b654e\" (UID: \"661e27e5-1795-405b-af57-a6f0901b654e\") " Feb 20 11:08:39 crc kubenswrapper[4962]: I0220 11:08:39.751750 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/661e27e5-1795-405b-af57-a6f0901b654e-node-mnt\") pod \"661e27e5-1795-405b-af57-a6f0901b654e\" (UID: \"661e27e5-1795-405b-af57-a6f0901b654e\") " Feb 20 11:08:39 crc kubenswrapper[4962]: I0220 11:08:39.752016 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/661e27e5-1795-405b-af57-a6f0901b654e-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "661e27e5-1795-405b-af57-a6f0901b654e" (UID: "661e27e5-1795-405b-af57-a6f0901b654e"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 11:08:39 crc kubenswrapper[4962]: I0220 11:08:39.758875 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/661e27e5-1795-405b-af57-a6f0901b654e-kube-api-access-wmtm9" (OuterVolumeSpecName: "kube-api-access-wmtm9") pod "661e27e5-1795-405b-af57-a6f0901b654e" (UID: "661e27e5-1795-405b-af57-a6f0901b654e"). InnerVolumeSpecName "kube-api-access-wmtm9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 11:08:39 crc kubenswrapper[4962]: I0220 11:08:39.770277 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/661e27e5-1795-405b-af57-a6f0901b654e-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "661e27e5-1795-405b-af57-a6f0901b654e" (UID: "661e27e5-1795-405b-af57-a6f0901b654e"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 11:08:39 crc kubenswrapper[4962]: I0220 11:08:39.853155 4962 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/661e27e5-1795-405b-af57-a6f0901b654e-crc-storage\") on node \"crc\" DevicePath \"\"" Feb 20 11:08:39 crc kubenswrapper[4962]: I0220 11:08:39.853218 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wmtm9\" (UniqueName: \"kubernetes.io/projected/661e27e5-1795-405b-af57-a6f0901b654e-kube-api-access-wmtm9\") on node \"crc\" DevicePath \"\"" Feb 20 11:08:39 crc kubenswrapper[4962]: I0220 11:08:39.853239 4962 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/661e27e5-1795-405b-af57-a6f0901b654e-node-mnt\") on node \"crc\" DevicePath \"\"" Feb 20 11:08:40 crc kubenswrapper[4962]: I0220 11:08:40.304405 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-jqlcs" event={"ID":"661e27e5-1795-405b-af57-a6f0901b654e","Type":"ContainerDied","Data":"8b05c1d5829bd42422a14fb650178b9eaa0bcb462ebc9138f3f91ed1ce433170"} Feb 20 11:08:40 crc kubenswrapper[4962]: I0220 11:08:40.304467 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b05c1d5829bd42422a14fb650178b9eaa0bcb462ebc9138f3f91ed1ce433170" Feb 20 11:08:40 crc kubenswrapper[4962]: I0220 11:08:40.304508 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-jqlcs" Feb 20 11:08:42 crc kubenswrapper[4962]: I0220 11:08:42.059832 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-jqlcs"] Feb 20 11:08:42 crc kubenswrapper[4962]: I0220 11:08:42.070363 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-jqlcs"] Feb 20 11:08:42 crc kubenswrapper[4962]: I0220 11:08:42.139102 4962 scope.go:117] "RemoveContainer" containerID="e9330f612fad194ce74c43dcd96cb17f9c60e761905737e1f0f1b3362edac70b" Feb 20 11:08:42 crc kubenswrapper[4962]: E0220 11:08:42.139665 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 11:08:42 crc kubenswrapper[4962]: I0220 11:08:42.227827 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-ld4sw"] Feb 20 11:08:42 crc kubenswrapper[4962]: E0220 11:08:42.228133 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="661e27e5-1795-405b-af57-a6f0901b654e" containerName="storage" Feb 20 11:08:42 crc kubenswrapper[4962]: I0220 11:08:42.228147 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="661e27e5-1795-405b-af57-a6f0901b654e" containerName="storage" Feb 20 11:08:42 crc kubenswrapper[4962]: I0220 11:08:42.228309 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="661e27e5-1795-405b-af57-a6f0901b654e" containerName="storage" Feb 20 11:08:42 crc kubenswrapper[4962]: I0220 11:08:42.228807 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-ld4sw" Feb 20 11:08:42 crc kubenswrapper[4962]: I0220 11:08:42.232054 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Feb 20 11:08:42 crc kubenswrapper[4962]: I0220 11:08:42.232285 4962 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-bwxwq" Feb 20 11:08:42 crc kubenswrapper[4962]: I0220 11:08:42.233828 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Feb 20 11:08:42 crc kubenswrapper[4962]: I0220 11:08:42.233995 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Feb 20 11:08:42 crc kubenswrapper[4962]: I0220 11:08:42.238283 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-ld4sw"] Feb 20 11:08:42 crc kubenswrapper[4962]: I0220 11:08:42.400671 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/b357a94b-d688-4db2-9693-581cf3d3a650-crc-storage\") pod \"crc-storage-crc-ld4sw\" (UID: \"b357a94b-d688-4db2-9693-581cf3d3a650\") " pod="crc-storage/crc-storage-crc-ld4sw" Feb 20 11:08:42 crc kubenswrapper[4962]: I0220 11:08:42.400735 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spx96\" (UniqueName: \"kubernetes.io/projected/b357a94b-d688-4db2-9693-581cf3d3a650-kube-api-access-spx96\") pod \"crc-storage-crc-ld4sw\" (UID: \"b357a94b-d688-4db2-9693-581cf3d3a650\") " pod="crc-storage/crc-storage-crc-ld4sw" Feb 20 11:08:42 crc kubenswrapper[4962]: I0220 11:08:42.400762 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/b357a94b-d688-4db2-9693-581cf3d3a650-node-mnt\") pod \"crc-storage-crc-ld4sw\" (UID: \"b357a94b-d688-4db2-9693-581cf3d3a650\") " pod="crc-storage/crc-storage-crc-ld4sw" Feb 20 11:08:42 crc kubenswrapper[4962]: I0220 11:08:42.501707 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/b357a94b-d688-4db2-9693-581cf3d3a650-node-mnt\") pod \"crc-storage-crc-ld4sw\" (UID: \"b357a94b-d688-4db2-9693-581cf3d3a650\") " pod="crc-storage/crc-storage-crc-ld4sw" Feb 20 11:08:42 crc kubenswrapper[4962]: I0220 11:08:42.501948 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/b357a94b-d688-4db2-9693-581cf3d3a650-crc-storage\") pod \"crc-storage-crc-ld4sw\" (UID: \"b357a94b-d688-4db2-9693-581cf3d3a650\") " pod="crc-storage/crc-storage-crc-ld4sw" Feb 20 11:08:42 crc kubenswrapper[4962]: I0220 11:08:42.502001 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spx96\" (UniqueName: \"kubernetes.io/projected/b357a94b-d688-4db2-9693-581cf3d3a650-kube-api-access-spx96\") pod \"crc-storage-crc-ld4sw\" (UID: \"b357a94b-d688-4db2-9693-581cf3d3a650\") " pod="crc-storage/crc-storage-crc-ld4sw" Feb 20 11:08:42 crc kubenswrapper[4962]: I0220 11:08:42.502441 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/b357a94b-d688-4db2-9693-581cf3d3a650-node-mnt\") pod \"crc-storage-crc-ld4sw\" (UID: \"b357a94b-d688-4db2-9693-581cf3d3a650\") " 
pod="crc-storage/crc-storage-crc-ld4sw" Feb 20 11:08:42 crc kubenswrapper[4962]: I0220 11:08:42.503019 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/b357a94b-d688-4db2-9693-581cf3d3a650-crc-storage\") pod \"crc-storage-crc-ld4sw\" (UID: \"b357a94b-d688-4db2-9693-581cf3d3a650\") " pod="crc-storage/crc-storage-crc-ld4sw" Feb 20 11:08:42 crc kubenswrapper[4962]: I0220 11:08:42.530727 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spx96\" (UniqueName: \"kubernetes.io/projected/b357a94b-d688-4db2-9693-581cf3d3a650-kube-api-access-spx96\") pod \"crc-storage-crc-ld4sw\" (UID: \"b357a94b-d688-4db2-9693-581cf3d3a650\") " pod="crc-storage/crc-storage-crc-ld4sw" Feb 20 11:08:42 crc kubenswrapper[4962]: I0220 11:08:42.545937 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-ld4sw" Feb 20 11:08:42 crc kubenswrapper[4962]: I0220 11:08:42.858312 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-ld4sw"] Feb 20 11:08:43 crc kubenswrapper[4962]: I0220 11:08:43.153844 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="661e27e5-1795-405b-af57-a6f0901b654e" path="/var/lib/kubelet/pods/661e27e5-1795-405b-af57-a6f0901b654e/volumes" Feb 20 11:08:43 crc kubenswrapper[4962]: I0220 11:08:43.329513 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-ld4sw" event={"ID":"b357a94b-d688-4db2-9693-581cf3d3a650","Type":"ContainerStarted","Data":"19b7abe0d5962436e029d44248d9735517a68c8b28538521d0159ccd5a176372"} Feb 20 11:08:44 crc kubenswrapper[4962]: I0220 11:08:44.341484 4962 generic.go:334] "Generic (PLEG): container finished" podID="b357a94b-d688-4db2-9693-581cf3d3a650" containerID="a3e551b0efed20f0f0350f061d20e398a0e2cf1b98045b42cb931ca25aaedfe9" exitCode=0 Feb 20 11:08:44 crc kubenswrapper[4962]: I0220 11:08:44.341575 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-ld4sw" event={"ID":"b357a94b-d688-4db2-9693-581cf3d3a650","Type":"ContainerDied","Data":"a3e551b0efed20f0f0350f061d20e398a0e2cf1b98045b42cb931ca25aaedfe9"} Feb 20 11:08:45 crc kubenswrapper[4962]: I0220 11:08:45.742303 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-ld4sw" Feb 20 11:08:45 crc kubenswrapper[4962]: I0220 11:08:45.786416 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-spx96\" (UniqueName: \"kubernetes.io/projected/b357a94b-d688-4db2-9693-581cf3d3a650-kube-api-access-spx96\") pod \"b357a94b-d688-4db2-9693-581cf3d3a650\" (UID: \"b357a94b-d688-4db2-9693-581cf3d3a650\") " Feb 20 11:08:45 crc kubenswrapper[4962]: I0220 11:08:45.786526 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/b357a94b-d688-4db2-9693-581cf3d3a650-node-mnt\") pod \"b357a94b-d688-4db2-9693-581cf3d3a650\" (UID: \"b357a94b-d688-4db2-9693-581cf3d3a650\") " Feb 20 11:08:45 crc kubenswrapper[4962]: I0220 11:08:45.786568 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/b357a94b-d688-4db2-9693-581cf3d3a650-crc-storage\") pod \"b357a94b-d688-4db2-9693-581cf3d3a650\" (UID: \"b357a94b-d688-4db2-9693-581cf3d3a650\") " Feb 20 11:08:45 crc kubenswrapper[4962]: I0220 11:08:45.787220 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b357a94b-d688-4db2-9693-581cf3d3a650-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "b357a94b-d688-4db2-9693-581cf3d3a650" (UID: "b357a94b-d688-4db2-9693-581cf3d3a650"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 11:08:45 crc kubenswrapper[4962]: I0220 11:08:45.794801 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b357a94b-d688-4db2-9693-581cf3d3a650-kube-api-access-spx96" (OuterVolumeSpecName: "kube-api-access-spx96") pod "b357a94b-d688-4db2-9693-581cf3d3a650" (UID: "b357a94b-d688-4db2-9693-581cf3d3a650"). InnerVolumeSpecName "kube-api-access-spx96". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 11:08:45 crc kubenswrapper[4962]: I0220 11:08:45.809541 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b357a94b-d688-4db2-9693-581cf3d3a650-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "b357a94b-d688-4db2-9693-581cf3d3a650" (UID: "b357a94b-d688-4db2-9693-581cf3d3a650"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 11:08:45 crc kubenswrapper[4962]: I0220 11:08:45.889001 4962 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/b357a94b-d688-4db2-9693-581cf3d3a650-node-mnt\") on node \"crc\" DevicePath \"\"" Feb 20 11:08:45 crc kubenswrapper[4962]: I0220 11:08:45.889046 4962 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/b357a94b-d688-4db2-9693-581cf3d3a650-crc-storage\") on node \"crc\" DevicePath \"\"" Feb 20 11:08:45 crc kubenswrapper[4962]: I0220 11:08:45.889071 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-spx96\" (UniqueName: \"kubernetes.io/projected/b357a94b-d688-4db2-9693-581cf3d3a650-kube-api-access-spx96\") on node \"crc\" DevicePath \"\"" Feb 20 11:08:46 crc kubenswrapper[4962]: I0220 11:08:46.362968 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-ld4sw" event={"ID":"b357a94b-d688-4db2-9693-581cf3d3a650","Type":"ContainerDied","Data":"19b7abe0d5962436e029d44248d9735517a68c8b28538521d0159ccd5a176372"} Feb 20 11:08:46 crc kubenswrapper[4962]: I0220 11:08:46.363025 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="19b7abe0d5962436e029d44248d9735517a68c8b28538521d0159ccd5a176372" Feb 20 11:08:46 crc kubenswrapper[4962]: I0220 11:08:46.363064 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-ld4sw" Feb 20 11:08:53 crc kubenswrapper[4962]: I0220 11:08:53.148127 4962 scope.go:117] "RemoveContainer" containerID="e9330f612fad194ce74c43dcd96cb17f9c60e761905737e1f0f1b3362edac70b" Feb 20 11:08:53 crc kubenswrapper[4962]: E0220 11:08:53.149238 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 11:09:05 crc kubenswrapper[4962]: I0220 11:09:05.139490 4962 scope.go:117] "RemoveContainer" containerID="e9330f612fad194ce74c43dcd96cb17f9c60e761905737e1f0f1b3362edac70b" Feb 20 11:09:05 crc kubenswrapper[4962]: E0220 11:09:05.140851 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 11:09:17 crc kubenswrapper[4962]: I0220 11:09:17.056101 4962 scope.go:117] "RemoveContainer" containerID="dd81866a8883595a9a43e5321d2a1e397058906782cb6839d22126aa9d907feb" Feb 20 11:09:17 crc kubenswrapper[4962]: I0220 11:09:17.140363 4962 scope.go:117] "RemoveContainer" containerID="e9330f612fad194ce74c43dcd96cb17f9c60e761905737e1f0f1b3362edac70b" Feb 20 11:09:17 crc kubenswrapper[4962]: E0220 11:09:17.140742 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 11:09:32 crc kubenswrapper[4962]: I0220 11:09:32.138842 4962 scope.go:117] "RemoveContainer" containerID="e9330f612fad194ce74c43dcd96cb17f9c60e761905737e1f0f1b3362edac70b" Feb 20 11:09:32 crc kubenswrapper[4962]: E0220 11:09:32.139873 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 11:09:43 crc kubenswrapper[4962]: I0220 11:09:43.138935 4962 scope.go:117] "RemoveContainer" containerID="e9330f612fad194ce74c43dcd96cb17f9c60e761905737e1f0f1b3362edac70b" Feb 20 11:09:43 crc kubenswrapper[4962]: E0220 11:09:43.140019 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 11:09:56 crc kubenswrapper[4962]: I0220 11:09:56.140458 4962 scope.go:117] "RemoveContainer" containerID="e9330f612fad194ce74c43dcd96cb17f9c60e761905737e1f0f1b3362edac70b" Feb 20 11:09:56 crc kubenswrapper[4962]: E0220 11:09:56.143848 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 11:10:11 crc kubenswrapper[4962]: I0220 11:10:11.140127 4962 scope.go:117] "RemoveContainer" containerID="e9330f612fad194ce74c43dcd96cb17f9c60e761905737e1f0f1b3362edac70b" Feb 20 11:10:11 crc kubenswrapper[4962]: E0220 11:10:11.141145 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 11:10:22 crc kubenswrapper[4962]: I0220 11:10:22.139013 4962 scope.go:117] "RemoveContainer" containerID="e9330f612fad194ce74c43dcd96cb17f9c60e761905737e1f0f1b3362edac70b" Feb 20 11:10:22 crc kubenswrapper[4962]: E0220 11:10:22.139843 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" 
podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 11:10:34 crc kubenswrapper[4962]: I0220 11:10:34.139740 4962 scope.go:117] "RemoveContainer" containerID="e9330f612fad194ce74c43dcd96cb17f9c60e761905737e1f0f1b3362edac70b" Feb 20 11:10:34 crc kubenswrapper[4962]: E0220 11:10:34.141088 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 11:10:45 crc kubenswrapper[4962]: I0220 11:10:45.139498 4962 scope.go:117] "RemoveContainer" containerID="e9330f612fad194ce74c43dcd96cb17f9c60e761905737e1f0f1b3362edac70b" Feb 20 11:10:45 crc kubenswrapper[4962]: E0220 11:10:45.140496 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 11:11:00 crc kubenswrapper[4962]: I0220 11:11:00.140010 4962 scope.go:117] "RemoveContainer" containerID="e9330f612fad194ce74c43dcd96cb17f9c60e761905737e1f0f1b3362edac70b" Feb 20 11:11:00 crc kubenswrapper[4962]: E0220 11:11:00.141088 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 11:11:13 crc kubenswrapper[4962]: I0220 11:11:13.139832 4962 scope.go:117] "RemoveContainer" containerID="e9330f612fad194ce74c43dcd96cb17f9c60e761905737e1f0f1b3362edac70b" Feb 20 11:11:13 crc kubenswrapper[4962]: I0220 11:11:13.723247 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" event={"ID":"751d5e0b-919c-4777-8475-ed7214f7647f","Type":"ContainerStarted","Data":"dd10955f5a138a9a68bbf20164340f26aec2bb4444e31423c39b6d847050ae26"} Feb 20 11:11:33 crc kubenswrapper[4962]: I0220 11:11:33.086028 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gf65t"] Feb 20 11:11:33 crc kubenswrapper[4962]: E0220 11:11:33.089816 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b357a94b-d688-4db2-9693-581cf3d3a650" containerName="storage" Feb 20 11:11:33 crc kubenswrapper[4962]: I0220 11:11:33.089855 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="b357a94b-d688-4db2-9693-581cf3d3a650" containerName="storage" Feb 20 11:11:33 crc kubenswrapper[4962]: I0220 11:11:33.090153 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="b357a94b-d688-4db2-9693-581cf3d3a650" containerName="storage" Feb 20 11:11:33 crc kubenswrapper[4962]: I0220 11:11:33.091978 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gf65t" Feb 20 11:11:33 crc kubenswrapper[4962]: I0220 11:11:33.103111 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gf65t"] Feb 20 11:11:33 crc kubenswrapper[4962]: I0220 11:11:33.260165 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8123f297-3029-4d9d-a922-2b771aed43c0-catalog-content\") pod \"redhat-operators-gf65t\" (UID: \"8123f297-3029-4d9d-a922-2b771aed43c0\") " pod="openshift-marketplace/redhat-operators-gf65t" Feb 20 11:11:33 crc kubenswrapper[4962]: I0220 11:11:33.260258 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7w7p\" (UniqueName: \"kubernetes.io/projected/8123f297-3029-4d9d-a922-2b771aed43c0-kube-api-access-g7w7p\") pod \"redhat-operators-gf65t\" (UID: \"8123f297-3029-4d9d-a922-2b771aed43c0\") " pod="openshift-marketplace/redhat-operators-gf65t" Feb 20 11:11:33 crc kubenswrapper[4962]: I0220 11:11:33.260373 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8123f297-3029-4d9d-a922-2b771aed43c0-utilities\") pod \"redhat-operators-gf65t\" (UID: \"8123f297-3029-4d9d-a922-2b771aed43c0\") " pod="openshift-marketplace/redhat-operators-gf65t" Feb 20 11:11:33 crc kubenswrapper[4962]: I0220 11:11:33.362198 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8123f297-3029-4d9d-a922-2b771aed43c0-utilities\") pod \"redhat-operators-gf65t\" (UID: \"8123f297-3029-4d9d-a922-2b771aed43c0\") " pod="openshift-marketplace/redhat-operators-gf65t" Feb 20 11:11:33 crc kubenswrapper[4962]: I0220 11:11:33.362823 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8123f297-3029-4d9d-a922-2b771aed43c0-catalog-content\") pod \"redhat-operators-gf65t\" (UID: \"8123f297-3029-4d9d-a922-2b771aed43c0\") " pod="openshift-marketplace/redhat-operators-gf65t" Feb 20 11:11:33 crc kubenswrapper[4962]: I0220 11:11:33.363543 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7w7p\" (UniqueName: \"kubernetes.io/projected/8123f297-3029-4d9d-a922-2b771aed43c0-kube-api-access-g7w7p\") pod \"redhat-operators-gf65t\" (UID: \"8123f297-3029-4d9d-a922-2b771aed43c0\") " pod="openshift-marketplace/redhat-operators-gf65t" Feb 20 11:11:33 crc kubenswrapper[4962]: I0220 11:11:33.363157 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8123f297-3029-4d9d-a922-2b771aed43c0-catalog-content\") pod \"redhat-operators-gf65t\" (UID: \"8123f297-3029-4d9d-a922-2b771aed43c0\") " pod="openshift-marketplace/redhat-operators-gf65t" Feb 20 11:11:33 crc kubenswrapper[4962]: I0220 11:11:33.362859 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8123f297-3029-4d9d-a922-2b771aed43c0-utilities\") pod \"redhat-operators-gf65t\" (UID: \"8123f297-3029-4d9d-a922-2b771aed43c0\") " pod="openshift-marketplace/redhat-operators-gf65t" Feb 20 11:11:33 crc kubenswrapper[4962]: I0220 11:11:33.389195 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-g7w7p\" (UniqueName: \"kubernetes.io/projected/8123f297-3029-4d9d-a922-2b771aed43c0-kube-api-access-g7w7p\") pod \"redhat-operators-gf65t\" (UID: \"8123f297-3029-4d9d-a922-2b771aed43c0\") " pod="openshift-marketplace/redhat-operators-gf65t" Feb 20 11:11:33 crc kubenswrapper[4962]: I0220 11:11:33.427311 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gf65t" Feb 20 11:11:33 crc kubenswrapper[4962]: I0220 11:11:33.678402 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gf65t"] Feb 20 11:11:33 crc kubenswrapper[4962]: W0220 11:11:33.682844 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8123f297_3029_4d9d_a922_2b771aed43c0.slice/crio-f0b7812f45f8eea28efbe319212752e44998e182a92f8aef3e652098743c2480 WatchSource:0}: Error finding container f0b7812f45f8eea28efbe319212752e44998e182a92f8aef3e652098743c2480: Status 404 returned error can't find the container with id f0b7812f45f8eea28efbe319212752e44998e182a92f8aef3e652098743c2480 Feb 20 11:11:33 crc kubenswrapper[4962]: I0220 11:11:33.895844 4962 generic.go:334] "Generic (PLEG): container finished" podID="8123f297-3029-4d9d-a922-2b771aed43c0" containerID="feabc8c4cf26db6fa376158de7d8e770dde59ef8215527032dd3ebcbccf03e3a" exitCode=0 Feb 20 11:11:33 crc kubenswrapper[4962]: I0220 11:11:33.895891 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gf65t" event={"ID":"8123f297-3029-4d9d-a922-2b771aed43c0","Type":"ContainerDied","Data":"feabc8c4cf26db6fa376158de7d8e770dde59ef8215527032dd3ebcbccf03e3a"} Feb 20 11:11:33 crc kubenswrapper[4962]: I0220 11:11:33.895914 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gf65t" event={"ID":"8123f297-3029-4d9d-a922-2b771aed43c0","Type":"ContainerStarted","Data":"f0b7812f45f8eea28efbe319212752e44998e182a92f8aef3e652098743c2480"} Feb 20 11:11:33 crc kubenswrapper[4962]: I0220 11:11:33.897500 4962 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 20 11:11:34 crc kubenswrapper[4962]: I0220 11:11:34.907638 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gf65t" event={"ID":"8123f297-3029-4d9d-a922-2b771aed43c0","Type":"ContainerStarted","Data":"236bd57b1e0c8740564a579c282fdc2b610fbbfad7b4be819e1c8f8557919d31"} Feb 20 11:11:35 crc kubenswrapper[4962]: I0220 11:11:35.920396 4962 generic.go:334] "Generic (PLEG): container finished" podID="8123f297-3029-4d9d-a922-2b771aed43c0" containerID="236bd57b1e0c8740564a579c282fdc2b610fbbfad7b4be819e1c8f8557919d31" exitCode=0 Feb 20 11:11:35 crc kubenswrapper[4962]: I0220 11:11:35.920470 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gf65t" event={"ID":"8123f297-3029-4d9d-a922-2b771aed43c0","Type":"ContainerDied","Data":"236bd57b1e0c8740564a579c282fdc2b610fbbfad7b4be819e1c8f8557919d31"} Feb 20 11:11:36 crc kubenswrapper[4962]: I0220 11:11:36.934422 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gf65t" event={"ID":"8123f297-3029-4d9d-a922-2b771aed43c0","Type":"ContainerStarted","Data":"46b0d85e7bc6514ce0e6a1314fb4afc736066f8b657041944635fdc15ec9b17d"} Feb 20 11:11:36 crc kubenswrapper[4962]: I0220 11:11:36.958400 4962 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-marketplace/redhat-operators-gf65t" podStartSLOduration=1.5143869250000002 podStartE2EDuration="3.958377457s" podCreationTimestamp="2026-02-20 11:11:33 +0000 UTC" firstStartedPulling="2026-02-20 11:11:33.897294193 +0000 UTC m=+4585.479766039" lastFinishedPulling="2026-02-20 11:11:36.341284685 +0000 UTC m=+4587.923756571" observedRunningTime="2026-02-20 11:11:36.954085586 +0000 UTC m=+4588.536557462" watchObservedRunningTime="2026-02-20 11:11:36.958377457 +0000 UTC m=+4588.540849343" Feb 20 11:11:37 crc kubenswrapper[4962]: I0220 11:11:37.479848 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-n2k4b"] Feb 20 11:11:37 crc kubenswrapper[4962]: I0220 11:11:37.481949 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-n2k4b" Feb 20 11:11:37 crc kubenswrapper[4962]: I0220 11:11:37.497938 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-n2k4b"] Feb 20 11:11:37 crc kubenswrapper[4962]: I0220 11:11:37.633159 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96f2df87-1a45-4012-9cb8-cfe7722350d6-utilities\") pod \"community-operators-n2k4b\" (UID: \"96f2df87-1a45-4012-9cb8-cfe7722350d6\") " pod="openshift-marketplace/community-operators-n2k4b" Feb 20 11:11:37 crc kubenswrapper[4962]: I0220 11:11:37.633463 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96f2df87-1a45-4012-9cb8-cfe7722350d6-catalog-content\") pod \"community-operators-n2k4b\" (UID: \"96f2df87-1a45-4012-9cb8-cfe7722350d6\") " pod="openshift-marketplace/community-operators-n2k4b" Feb 20 11:11:37 crc kubenswrapper[4962]: I0220 11:11:37.633668 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xk4gh\" (UniqueName: \"kubernetes.io/projected/96f2df87-1a45-4012-9cb8-cfe7722350d6-kube-api-access-xk4gh\") pod \"community-operators-n2k4b\" (UID: \"96f2df87-1a45-4012-9cb8-cfe7722350d6\") " pod="openshift-marketplace/community-operators-n2k4b" Feb 20 11:11:37 crc kubenswrapper[4962]: I0220 11:11:37.735629 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xk4gh\" (UniqueName: \"kubernetes.io/projected/96f2df87-1a45-4012-9cb8-cfe7722350d6-kube-api-access-xk4gh\") pod \"community-operators-n2k4b\" (UID: \"96f2df87-1a45-4012-9cb8-cfe7722350d6\") " pod="openshift-marketplace/community-operators-n2k4b" Feb 20 11:11:37 crc kubenswrapper[4962]: I0220 11:11:37.735746 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96f2df87-1a45-4012-9cb8-cfe7722350d6-utilities\") pod \"community-operators-n2k4b\" (UID: \"96f2df87-1a45-4012-9cb8-cfe7722350d6\") " pod="openshift-marketplace/community-operators-n2k4b" Feb 20 11:11:37 crc kubenswrapper[4962]: I0220 11:11:37.735797 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96f2df87-1a45-4012-9cb8-cfe7722350d6-catalog-content\") pod \"community-operators-n2k4b\" (UID: \"96f2df87-1a45-4012-9cb8-cfe7722350d6\") " pod="openshift-marketplace/community-operators-n2k4b" Feb 20 11:11:37 crc kubenswrapper[4962]: I0220 
11:11:37.736460 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96f2df87-1a45-4012-9cb8-cfe7722350d6-catalog-content\") pod \"community-operators-n2k4b\" (UID: \"96f2df87-1a45-4012-9cb8-cfe7722350d6\") " pod="openshift-marketplace/community-operators-n2k4b" Feb 20 11:11:37 crc kubenswrapper[4962]: I0220 11:11:37.736790 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96f2df87-1a45-4012-9cb8-cfe7722350d6-utilities\") pod \"community-operators-n2k4b\" (UID: \"96f2df87-1a45-4012-9cb8-cfe7722350d6\") " pod="openshift-marketplace/community-operators-n2k4b" Feb 20 11:11:37 crc kubenswrapper[4962]: I0220 11:11:37.776516 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xk4gh\" (UniqueName: \"kubernetes.io/projected/96f2df87-1a45-4012-9cb8-cfe7722350d6-kube-api-access-xk4gh\") pod \"community-operators-n2k4b\" (UID: \"96f2df87-1a45-4012-9cb8-cfe7722350d6\") " pod="openshift-marketplace/community-operators-n2k4b" Feb 20 11:11:37 crc kubenswrapper[4962]: I0220 11:11:37.798037 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-n2k4b" Feb 20 11:11:38 crc kubenswrapper[4962]: I0220 11:11:38.112690 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-n2k4b"] Feb 20 11:11:38 crc kubenswrapper[4962]: I0220 11:11:38.957038 4962 generic.go:334] "Generic (PLEG): container finished" podID="96f2df87-1a45-4012-9cb8-cfe7722350d6" containerID="9ee74111c1ba86e5709005fcbe78e4bf5aa89be27bca01ef0b469ef1b5c60efd" exitCode=0 Feb 20 11:11:38 crc kubenswrapper[4962]: I0220 11:11:38.957143 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n2k4b" event={"ID":"96f2df87-1a45-4012-9cb8-cfe7722350d6","Type":"ContainerDied","Data":"9ee74111c1ba86e5709005fcbe78e4bf5aa89be27bca01ef0b469ef1b5c60efd"} Feb 20 11:11:38 crc kubenswrapper[4962]: I0220 11:11:38.957340 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n2k4b" event={"ID":"96f2df87-1a45-4012-9cb8-cfe7722350d6","Type":"ContainerStarted","Data":"044ddc12238627b5579928a6941e216bc98eccf5658fc5ebe3202f9d66cede0e"} Feb 20 11:11:39 crc kubenswrapper[4962]: I0220 11:11:39.966764 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n2k4b" event={"ID":"96f2df87-1a45-4012-9cb8-cfe7722350d6","Type":"ContainerStarted","Data":"a2c53e38bc22bb8389498260263ad69325ab45f3c976482fce0fcee721e543fa"} Feb 20 11:11:40 crc kubenswrapper[4962]: I0220 11:11:40.976424 4962 generic.go:334] "Generic (PLEG): container finished" podID="96f2df87-1a45-4012-9cb8-cfe7722350d6" containerID="a2c53e38bc22bb8389498260263ad69325ab45f3c976482fce0fcee721e543fa" exitCode=0 Feb 20 11:11:40 crc kubenswrapper[4962]: I0220 11:11:40.976489 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n2k4b" event={"ID":"96f2df87-1a45-4012-9cb8-cfe7722350d6","Type":"ContainerDied","Data":"a2c53e38bc22bb8389498260263ad69325ab45f3c976482fce0fcee721e543fa"} Feb 20 11:11:41 crc kubenswrapper[4962]: I0220 11:11:41.990076 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n2k4b" 
event={"ID":"96f2df87-1a45-4012-9cb8-cfe7722350d6","Type":"ContainerStarted","Data":"4ccbd6d45940c4b1ae7e0e1f68c265065827d64d2c523d3f5bea75e59a5d57b0"} Feb 20 11:11:42 crc kubenswrapper[4962]: I0220 11:11:42.023735 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-n2k4b" podStartSLOduration=2.422794915 podStartE2EDuration="5.023710711s" podCreationTimestamp="2026-02-20 11:11:37 +0000 UTC" firstStartedPulling="2026-02-20 11:11:38.959951134 +0000 UTC m=+4590.542422980" lastFinishedPulling="2026-02-20 11:11:41.56086692 +0000 UTC m=+4593.143338776" observedRunningTime="2026-02-20 11:11:42.015423427 +0000 UTC m=+4593.597895313" watchObservedRunningTime="2026-02-20 11:11:42.023710711 +0000 UTC m=+4593.606182597" Feb 20 11:11:43 crc kubenswrapper[4962]: I0220 11:11:43.428350 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gf65t" Feb 20 11:11:43 crc kubenswrapper[4962]: I0220 11:11:43.428526 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gf65t" Feb 20 11:11:44 crc kubenswrapper[4962]: I0220 11:11:44.484904 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gf65t" podUID="8123f297-3029-4d9d-a922-2b771aed43c0" containerName="registry-server" probeResult="failure" output=< Feb 20 11:11:44 crc kubenswrapper[4962]: timeout: failed to connect service ":50051" within 1s Feb 20 11:11:44 crc kubenswrapper[4962]: > Feb 20 11:11:47 crc kubenswrapper[4962]: I0220 11:11:47.798878 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-n2k4b" Feb 20 11:11:47 crc kubenswrapper[4962]: I0220 11:11:47.799126 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-n2k4b" Feb 20 11:11:47 crc kubenswrapper[4962]: I0220 11:11:47.875595 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-n2k4b" Feb 20 11:11:48 crc kubenswrapper[4962]: I0220 11:11:48.116316 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-n2k4b" Feb 20 11:11:50 crc kubenswrapper[4962]: I0220 11:11:50.665814 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-n2k4b"] Feb 20 11:11:50 crc kubenswrapper[4962]: I0220 11:11:50.666446 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-n2k4b" podUID="96f2df87-1a45-4012-9cb8-cfe7722350d6" containerName="registry-server" containerID="cri-o://4ccbd6d45940c4b1ae7e0e1f68c265065827d64d2c523d3f5bea75e59a5d57b0" gracePeriod=2 Feb 20 11:11:52 crc kubenswrapper[4962]: I0220 11:11:52.075409 4962 generic.go:334] "Generic (PLEG): container finished" podID="96f2df87-1a45-4012-9cb8-cfe7722350d6" containerID="4ccbd6d45940c4b1ae7e0e1f68c265065827d64d2c523d3f5bea75e59a5d57b0" exitCode=0 Feb 20 11:11:52 crc kubenswrapper[4962]: I0220 11:11:52.075914 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n2k4b" event={"ID":"96f2df87-1a45-4012-9cb8-cfe7722350d6","Type":"ContainerDied","Data":"4ccbd6d45940c4b1ae7e0e1f68c265065827d64d2c523d3f5bea75e59a5d57b0"} Feb 20 11:11:52 crc kubenswrapper[4962]: I0220 11:11:52.075943 4962 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/community-operators-n2k4b" event={"ID":"96f2df87-1a45-4012-9cb8-cfe7722350d6","Type":"ContainerDied","Data":"044ddc12238627b5579928a6941e216bc98eccf5658fc5ebe3202f9d66cede0e"} Feb 20 11:11:52 crc kubenswrapper[4962]: I0220 11:11:52.075957 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="044ddc12238627b5579928a6941e216bc98eccf5658fc5ebe3202f9d66cede0e" Feb 20 11:11:52 crc kubenswrapper[4962]: I0220 11:11:52.103229 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-n2k4b" Feb 20 11:11:52 crc kubenswrapper[4962]: I0220 11:11:52.256660 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96f2df87-1a45-4012-9cb8-cfe7722350d6-catalog-content\") pod \"96f2df87-1a45-4012-9cb8-cfe7722350d6\" (UID: \"96f2df87-1a45-4012-9cb8-cfe7722350d6\") " Feb 20 11:11:52 crc kubenswrapper[4962]: I0220 11:11:52.256711 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96f2df87-1a45-4012-9cb8-cfe7722350d6-utilities\") pod \"96f2df87-1a45-4012-9cb8-cfe7722350d6\" (UID: \"96f2df87-1a45-4012-9cb8-cfe7722350d6\") " Feb 20 11:11:52 crc kubenswrapper[4962]: I0220 11:11:52.256736 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xk4gh\" (UniqueName: \"kubernetes.io/projected/96f2df87-1a45-4012-9cb8-cfe7722350d6-kube-api-access-xk4gh\") pod \"96f2df87-1a45-4012-9cb8-cfe7722350d6\" (UID: \"96f2df87-1a45-4012-9cb8-cfe7722350d6\") " Feb 20 11:11:52 crc kubenswrapper[4962]: I0220 11:11:52.258374 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96f2df87-1a45-4012-9cb8-cfe7722350d6-utilities" (OuterVolumeSpecName: "utilities") pod "96f2df87-1a45-4012-9cb8-cfe7722350d6" (UID: "96f2df87-1a45-4012-9cb8-cfe7722350d6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 11:11:52 crc kubenswrapper[4962]: I0220 11:11:52.267022 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96f2df87-1a45-4012-9cb8-cfe7722350d6-kube-api-access-xk4gh" (OuterVolumeSpecName: "kube-api-access-xk4gh") pod "96f2df87-1a45-4012-9cb8-cfe7722350d6" (UID: "96f2df87-1a45-4012-9cb8-cfe7722350d6"). InnerVolumeSpecName "kube-api-access-xk4gh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 11:11:52 crc kubenswrapper[4962]: I0220 11:11:52.315173 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96f2df87-1a45-4012-9cb8-cfe7722350d6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "96f2df87-1a45-4012-9cb8-cfe7722350d6" (UID: "96f2df87-1a45-4012-9cb8-cfe7722350d6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 11:11:52 crc kubenswrapper[4962]: I0220 11:11:52.358045 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96f2df87-1a45-4012-9cb8-cfe7722350d6-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 11:11:52 crc kubenswrapper[4962]: I0220 11:11:52.358089 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96f2df87-1a45-4012-9cb8-cfe7722350d6-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 11:11:52 crc kubenswrapper[4962]: I0220 11:11:52.358108 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xk4gh\" (UniqueName: \"kubernetes.io/projected/96f2df87-1a45-4012-9cb8-cfe7722350d6-kube-api-access-xk4gh\") on node \"crc\" DevicePath \"\"" Feb 20 11:11:53 crc kubenswrapper[4962]: I0220 11:11:53.085568 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-n2k4b" Feb 20 11:11:53 crc kubenswrapper[4962]: I0220 11:11:53.152084 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-n2k4b"] Feb 20 11:11:53 crc kubenswrapper[4962]: I0220 11:11:53.159748 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-n2k4b"] Feb 20 11:11:53 crc kubenswrapper[4962]: I0220 11:11:53.505615 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gf65t" Feb 20 11:11:53 crc kubenswrapper[4962]: I0220 11:11:53.553020 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gf65t" Feb 20 11:11:55 crc kubenswrapper[4962]: I0220 11:11:55.149725 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96f2df87-1a45-4012-9cb8-cfe7722350d6" path="/var/lib/kubelet/pods/96f2df87-1a45-4012-9cb8-cfe7722350d6/volumes" Feb 20 11:11:55 crc kubenswrapper[4962]: I0220 11:11:55.868361 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gf65t"] Feb 20 11:11:55 crc kubenswrapper[4962]: I0220 11:11:55.868793 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gf65t" podUID="8123f297-3029-4d9d-a922-2b771aed43c0" containerName="registry-server" containerID="cri-o://46b0d85e7bc6514ce0e6a1314fb4afc736066f8b657041944635fdc15ec9b17d" gracePeriod=2 Feb 20 11:11:56 crc kubenswrapper[4962]: I0220 11:11:56.115037 4962 generic.go:334] "Generic (PLEG): container finished" podID="8123f297-3029-4d9d-a922-2b771aed43c0" containerID="46b0d85e7bc6514ce0e6a1314fb4afc736066f8b657041944635fdc15ec9b17d" exitCode=0 Feb 20 11:11:56 crc kubenswrapper[4962]: I0220 11:11:56.115105 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gf65t" event={"ID":"8123f297-3029-4d9d-a922-2b771aed43c0","Type":"ContainerDied","Data":"46b0d85e7bc6514ce0e6a1314fb4afc736066f8b657041944635fdc15ec9b17d"} Feb 20 11:11:56 crc kubenswrapper[4962]: I0220 11:11:56.400258 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gf65t" Feb 20 11:11:56 crc kubenswrapper[4962]: I0220 11:11:56.520992 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8123f297-3029-4d9d-a922-2b771aed43c0-utilities\") pod \"8123f297-3029-4d9d-a922-2b771aed43c0\" (UID: \"8123f297-3029-4d9d-a922-2b771aed43c0\") " Feb 20 11:11:56 crc kubenswrapper[4962]: I0220 11:11:56.521484 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g7w7p\" (UniqueName: \"kubernetes.io/projected/8123f297-3029-4d9d-a922-2b771aed43c0-kube-api-access-g7w7p\") pod \"8123f297-3029-4d9d-a922-2b771aed43c0\" (UID: \"8123f297-3029-4d9d-a922-2b771aed43c0\") " Feb 20 11:11:56 crc kubenswrapper[4962]: I0220 11:11:56.521796 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8123f297-3029-4d9d-a922-2b771aed43c0-catalog-content\") pod \"8123f297-3029-4d9d-a922-2b771aed43c0\" (UID: \"8123f297-3029-4d9d-a922-2b771aed43c0\") " Feb 20 11:11:56 crc kubenswrapper[4962]: I0220 11:11:56.532186 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8123f297-3029-4d9d-a922-2b771aed43c0-kube-api-access-g7w7p" (OuterVolumeSpecName: "kube-api-access-g7w7p") pod "8123f297-3029-4d9d-a922-2b771aed43c0" (UID: "8123f297-3029-4d9d-a922-2b771aed43c0"). InnerVolumeSpecName "kube-api-access-g7w7p". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 11:11:56 crc kubenswrapper[4962]: I0220 11:11:56.532915 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8123f297-3029-4d9d-a922-2b771aed43c0-utilities" (OuterVolumeSpecName: "utilities") pod "8123f297-3029-4d9d-a922-2b771aed43c0" (UID: "8123f297-3029-4d9d-a922-2b771aed43c0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 11:11:56 crc kubenswrapper[4962]: I0220 11:11:56.623803 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g7w7p\" (UniqueName: \"kubernetes.io/projected/8123f297-3029-4d9d-a922-2b771aed43c0-kube-api-access-g7w7p\") on node \"crc\" DevicePath \"\"" Feb 20 11:11:56 crc kubenswrapper[4962]: I0220 11:11:56.623879 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8123f297-3029-4d9d-a922-2b771aed43c0-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 11:11:56 crc kubenswrapper[4962]: I0220 11:11:56.670022 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8123f297-3029-4d9d-a922-2b771aed43c0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8123f297-3029-4d9d-a922-2b771aed43c0" (UID: "8123f297-3029-4d9d-a922-2b771aed43c0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 11:11:56 crc kubenswrapper[4962]: I0220 11:11:56.725380 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8123f297-3029-4d9d-a922-2b771aed43c0-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 11:11:57 crc kubenswrapper[4962]: I0220 11:11:57.127566 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gf65t" event={"ID":"8123f297-3029-4d9d-a922-2b771aed43c0","Type":"ContainerDied","Data":"f0b7812f45f8eea28efbe319212752e44998e182a92f8aef3e652098743c2480"} Feb 20 11:11:57 crc kubenswrapper[4962]: I0220 11:11:57.127648 4962 scope.go:117] "RemoveContainer" containerID="46b0d85e7bc6514ce0e6a1314fb4afc736066f8b657041944635fdc15ec9b17d" Feb 20 11:11:57 crc kubenswrapper[4962]: I0220 11:11:57.128236 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gf65t" Feb 20 11:11:57 crc kubenswrapper[4962]: I0220 11:11:57.154626 4962 scope.go:117] "RemoveContainer" containerID="236bd57b1e0c8740564a579c282fdc2b610fbbfad7b4be819e1c8f8557919d31" Feb 20 11:11:57 crc kubenswrapper[4962]: I0220 11:11:57.182014 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gf65t"] Feb 20 11:11:57 crc kubenswrapper[4962]: I0220 11:11:57.192678 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gf65t"] Feb 20 11:11:57 crc kubenswrapper[4962]: I0220 11:11:57.208398 4962 scope.go:117] "RemoveContainer" containerID="feabc8c4cf26db6fa376158de7d8e770dde59ef8215527032dd3ebcbccf03e3a" Feb 20 11:11:59 crc kubenswrapper[4962]: I0220 11:11:59.156967 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8123f297-3029-4d9d-a922-2b771aed43c0" path="/var/lib/kubelet/pods/8123f297-3029-4d9d-a922-2b771aed43c0/volumes" Feb 20 11:12:06 crc kubenswrapper[4962]: I0220 11:12:06.978001 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7c4c8f55b5-mg9lz"] Feb 20 11:12:06 crc kubenswrapper[4962]: E0220 11:12:06.978495 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8123f297-3029-4d9d-a922-2b771aed43c0" containerName="extract-utilities" Feb 20 11:12:06 crc kubenswrapper[4962]: I0220 11:12:06.978507 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="8123f297-3029-4d9d-a922-2b771aed43c0" containerName="extract-utilities" Feb 20 11:12:06 crc kubenswrapper[4962]: E0220 11:12:06.978520 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8123f297-3029-4d9d-a922-2b771aed43c0" containerName="extract-content" Feb 20 11:12:06 crc kubenswrapper[4962]: I0220 11:12:06.978526 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="8123f297-3029-4d9d-a922-2b771aed43c0" containerName="extract-content" Feb 20 11:12:06 crc kubenswrapper[4962]: E0220 11:12:06.978539 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96f2df87-1a45-4012-9cb8-cfe7722350d6" containerName="registry-server" Feb 20 11:12:06 crc kubenswrapper[4962]: I0220 11:12:06.978545 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="96f2df87-1a45-4012-9cb8-cfe7722350d6" containerName="registry-server" Feb 20 11:12:06 crc kubenswrapper[4962]: E0220 11:12:06.978554 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96f2df87-1a45-4012-9cb8-cfe7722350d6" containerName="extract-content" Feb 20 11:12:06 crc 
kubenswrapper[4962]: I0220 11:12:06.978560 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="96f2df87-1a45-4012-9cb8-cfe7722350d6" containerName="extract-content" Feb 20 11:12:06 crc kubenswrapper[4962]: E0220 11:12:06.978572 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8123f297-3029-4d9d-a922-2b771aed43c0" containerName="registry-server" Feb 20 11:12:06 crc kubenswrapper[4962]: I0220 11:12:06.978579 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="8123f297-3029-4d9d-a922-2b771aed43c0" containerName="registry-server" Feb 20 11:12:06 crc kubenswrapper[4962]: E0220 11:12:06.978601 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96f2df87-1a45-4012-9cb8-cfe7722350d6" containerName="extract-utilities" Feb 20 11:12:06 crc kubenswrapper[4962]: I0220 11:12:06.978607 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="96f2df87-1a45-4012-9cb8-cfe7722350d6" containerName="extract-utilities" Feb 20 11:12:06 crc kubenswrapper[4962]: I0220 11:12:06.978723 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="8123f297-3029-4d9d-a922-2b771aed43c0" containerName="registry-server" Feb 20 11:12:06 crc kubenswrapper[4962]: I0220 11:12:06.978741 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="96f2df87-1a45-4012-9cb8-cfe7722350d6" containerName="registry-server" Feb 20 11:12:06 crc kubenswrapper[4962]: I0220 11:12:06.979380 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c4c8f55b5-mg9lz" Feb 20 11:12:06 crc kubenswrapper[4962]: I0220 11:12:06.981371 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Feb 20 11:12:06 crc kubenswrapper[4962]: I0220 11:12:06.981467 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-7q92k" Feb 20 11:12:06 crc kubenswrapper[4962]: I0220 11:12:06.981468 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Feb 20 11:12:06 crc kubenswrapper[4962]: I0220 11:12:06.982296 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Feb 20 11:12:06 crc kubenswrapper[4962]: I0220 11:12:06.983097 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Feb 20 11:12:07 crc kubenswrapper[4962]: I0220 11:12:07.014497 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-589cf688cc-62bkq"] Feb 20 11:12:07 crc kubenswrapper[4962]: I0220 11:12:07.016169 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-589cf688cc-62bkq" Feb 20 11:12:07 crc kubenswrapper[4962]: I0220 11:12:07.025954 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-589cf688cc-62bkq"] Feb 20 11:12:07 crc kubenswrapper[4962]: I0220 11:12:07.039529 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c4c8f55b5-mg9lz"] Feb 20 11:12:07 crc kubenswrapper[4962]: I0220 11:12:07.081723 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ch6zx\" (UniqueName: \"kubernetes.io/projected/10021bed-f80b-491c-8326-88df1a07c1f7-kube-api-access-ch6zx\") pod \"dnsmasq-dns-589cf688cc-62bkq\" (UID: \"10021bed-f80b-491c-8326-88df1a07c1f7\") " pod="openstack/dnsmasq-dns-589cf688cc-62bkq" Feb 20 11:12:07 crc kubenswrapper[4962]: I0220 11:12:07.081781 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a26573eb-419d-4ead-b747-2cc004252564-dns-svc\") pod \"dnsmasq-dns-7c4c8f55b5-mg9lz\" (UID: \"a26573eb-419d-4ead-b747-2cc004252564\") " pod="openstack/dnsmasq-dns-7c4c8f55b5-mg9lz" Feb 20 11:12:07 crc kubenswrapper[4962]: I0220 11:12:07.081809 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/10021bed-f80b-491c-8326-88df1a07c1f7-dns-svc\") pod \"dnsmasq-dns-589cf688cc-62bkq\" (UID: \"10021bed-f80b-491c-8326-88df1a07c1f7\") " pod="openstack/dnsmasq-dns-589cf688cc-62bkq" Feb 20 11:12:07 crc kubenswrapper[4962]: I0220 11:12:07.081852 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rv2pl\" (UniqueName: \"kubernetes.io/projected/a26573eb-419d-4ead-b747-2cc004252564-kube-api-access-rv2pl\") pod \"dnsmasq-dns-7c4c8f55b5-mg9lz\" (UID: \"a26573eb-419d-4ead-b747-2cc004252564\") " pod="openstack/dnsmasq-dns-7c4c8f55b5-mg9lz" Feb 20 11:12:07 crc kubenswrapper[4962]: I0220 11:12:07.081914 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10021bed-f80b-491c-8326-88df1a07c1f7-config\") pod \"dnsmasq-dns-589cf688cc-62bkq\" (UID: \"10021bed-f80b-491c-8326-88df1a07c1f7\") " pod="openstack/dnsmasq-dns-589cf688cc-62bkq" Feb 20 11:12:07 crc kubenswrapper[4962]: I0220 11:12:07.081953 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a26573eb-419d-4ead-b747-2cc004252564-config\") pod \"dnsmasq-dns-7c4c8f55b5-mg9lz\" (UID: \"a26573eb-419d-4ead-b747-2cc004252564\") " pod="openstack/dnsmasq-dns-7c4c8f55b5-mg9lz" Feb 20 11:12:07 crc kubenswrapper[4962]: I0220 11:12:07.182812 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a26573eb-419d-4ead-b747-2cc004252564-config\") pod \"dnsmasq-dns-7c4c8f55b5-mg9lz\" (UID: \"a26573eb-419d-4ead-b747-2cc004252564\") " pod="openstack/dnsmasq-dns-7c4c8f55b5-mg9lz" Feb 20 11:12:07 crc kubenswrapper[4962]: I0220 11:12:07.182875 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ch6zx\" (UniqueName: \"kubernetes.io/projected/10021bed-f80b-491c-8326-88df1a07c1f7-kube-api-access-ch6zx\") pod \"dnsmasq-dns-589cf688cc-62bkq\" (UID: \"10021bed-f80b-491c-8326-88df1a07c1f7\") " 
pod="openstack/dnsmasq-dns-589cf688cc-62bkq" Feb 20 11:12:07 crc kubenswrapper[4962]: I0220 11:12:07.182895 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a26573eb-419d-4ead-b747-2cc004252564-dns-svc\") pod \"dnsmasq-dns-7c4c8f55b5-mg9lz\" (UID: \"a26573eb-419d-4ead-b747-2cc004252564\") " pod="openstack/dnsmasq-dns-7c4c8f55b5-mg9lz" Feb 20 11:12:07 crc kubenswrapper[4962]: I0220 11:12:07.182914 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/10021bed-f80b-491c-8326-88df1a07c1f7-dns-svc\") pod \"dnsmasq-dns-589cf688cc-62bkq\" (UID: \"10021bed-f80b-491c-8326-88df1a07c1f7\") " pod="openstack/dnsmasq-dns-589cf688cc-62bkq" Feb 20 11:12:07 crc kubenswrapper[4962]: I0220 11:12:07.182954 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rv2pl\" (UniqueName: \"kubernetes.io/projected/a26573eb-419d-4ead-b747-2cc004252564-kube-api-access-rv2pl\") pod \"dnsmasq-dns-7c4c8f55b5-mg9lz\" (UID: \"a26573eb-419d-4ead-b747-2cc004252564\") " pod="openstack/dnsmasq-dns-7c4c8f55b5-mg9lz" Feb 20 11:12:07 crc kubenswrapper[4962]: I0220 11:12:07.182993 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10021bed-f80b-491c-8326-88df1a07c1f7-config\") pod \"dnsmasq-dns-589cf688cc-62bkq\" (UID: \"10021bed-f80b-491c-8326-88df1a07c1f7\") " pod="openstack/dnsmasq-dns-589cf688cc-62bkq" Feb 20 11:12:07 crc kubenswrapper[4962]: I0220 11:12:07.183681 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a26573eb-419d-4ead-b747-2cc004252564-config\") pod \"dnsmasq-dns-7c4c8f55b5-mg9lz\" (UID: \"a26573eb-419d-4ead-b747-2cc004252564\") " pod="openstack/dnsmasq-dns-7c4c8f55b5-mg9lz" Feb 20 11:12:07 crc kubenswrapper[4962]: I0220 11:12:07.183838 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10021bed-f80b-491c-8326-88df1a07c1f7-config\") pod \"dnsmasq-dns-589cf688cc-62bkq\" (UID: \"10021bed-f80b-491c-8326-88df1a07c1f7\") " pod="openstack/dnsmasq-dns-589cf688cc-62bkq" Feb 20 11:12:07 crc kubenswrapper[4962]: I0220 11:12:07.184047 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a26573eb-419d-4ead-b747-2cc004252564-dns-svc\") pod \"dnsmasq-dns-7c4c8f55b5-mg9lz\" (UID: \"a26573eb-419d-4ead-b747-2cc004252564\") " pod="openstack/dnsmasq-dns-7c4c8f55b5-mg9lz" Feb 20 11:12:07 crc kubenswrapper[4962]: I0220 11:12:07.184741 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/10021bed-f80b-491c-8326-88df1a07c1f7-dns-svc\") pod \"dnsmasq-dns-589cf688cc-62bkq\" (UID: \"10021bed-f80b-491c-8326-88df1a07c1f7\") " pod="openstack/dnsmasq-dns-589cf688cc-62bkq" Feb 20 11:12:07 crc kubenswrapper[4962]: I0220 11:12:07.219771 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ch6zx\" (UniqueName: \"kubernetes.io/projected/10021bed-f80b-491c-8326-88df1a07c1f7-kube-api-access-ch6zx\") pod \"dnsmasq-dns-589cf688cc-62bkq\" (UID: \"10021bed-f80b-491c-8326-88df1a07c1f7\") " pod="openstack/dnsmasq-dns-589cf688cc-62bkq" Feb 20 11:12:07 crc kubenswrapper[4962]: I0220 11:12:07.226341 4962 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-rv2pl\" (UniqueName: \"kubernetes.io/projected/a26573eb-419d-4ead-b747-2cc004252564-kube-api-access-rv2pl\") pod \"dnsmasq-dns-7c4c8f55b5-mg9lz\" (UID: \"a26573eb-419d-4ead-b747-2cc004252564\") " pod="openstack/dnsmasq-dns-7c4c8f55b5-mg9lz" Feb 20 11:12:07 crc kubenswrapper[4962]: I0220 11:12:07.294908 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c4c8f55b5-mg9lz" Feb 20 11:12:07 crc kubenswrapper[4962]: I0220 11:12:07.335138 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-589cf688cc-62bkq" Feb 20 11:12:07 crc kubenswrapper[4962]: I0220 11:12:07.721054 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c4c8f55b5-mg9lz"] Feb 20 11:12:07 crc kubenswrapper[4962]: I0220 11:12:07.787156 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-589cf688cc-62bkq"] Feb 20 11:12:07 crc kubenswrapper[4962]: W0220 11:12:07.797123 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod10021bed_f80b_491c_8326_88df1a07c1f7.slice/crio-e47db261c2bfbeaecffbda8de857bedb2294d41db3b322ddd15f80e8ba9d53f1 WatchSource:0}: Error finding container e47db261c2bfbeaecffbda8de857bedb2294d41db3b322ddd15f80e8ba9d53f1: Status 404 returned error can't find the container with id e47db261c2bfbeaecffbda8de857bedb2294d41db3b322ddd15f80e8ba9d53f1 Feb 20 11:12:08 crc kubenswrapper[4962]: I0220 11:12:08.136137 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 20 11:12:08 crc kubenswrapper[4962]: I0220 11:12:08.137923 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 20 11:12:08 crc kubenswrapper[4962]: I0220 11:12:08.140681 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Feb 20 11:12:08 crc kubenswrapper[4962]: I0220 11:12:08.140692 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Feb 20 11:12:08 crc kubenswrapper[4962]: I0220 11:12:08.141367 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Feb 20 11:12:08 crc kubenswrapper[4962]: I0220 11:12:08.141920 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-gjq7z" Feb 20 11:12:08 crc kubenswrapper[4962]: I0220 11:12:08.143682 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Feb 20 11:12:08 crc kubenswrapper[4962]: I0220 11:12:08.164674 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 20 11:12:08 crc kubenswrapper[4962]: I0220 11:12:08.203056 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhgp7\" (UniqueName: \"kubernetes.io/projected/4f1374d6-d1c8-4b28-a524-485ced8ec7b9-kube-api-access-mhgp7\") pod \"rabbitmq-cell1-server-0\" (UID: \"4f1374d6-d1c8-4b28-a524-485ced8ec7b9\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 11:12:08 crc kubenswrapper[4962]: I0220 11:12:08.203109 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/4f1374d6-d1c8-4b28-a524-485ced8ec7b9-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"4f1374d6-d1c8-4b28-a524-485ced8ec7b9\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 11:12:08 crc kubenswrapper[4962]: I0220 11:12:08.203190 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4f1374d6-d1c8-4b28-a524-485ced8ec7b9-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4f1374d6-d1c8-4b28-a524-485ced8ec7b9\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 11:12:08 crc kubenswrapper[4962]: I0220 11:12:08.203344 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-1ca3fa62-a0d4-4c0d-9b96-a93f8a0cedc2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1ca3fa62-a0d4-4c0d-9b96-a93f8a0cedc2\") pod \"rabbitmq-cell1-server-0\" (UID: \"4f1374d6-d1c8-4b28-a524-485ced8ec7b9\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 11:12:08 crc kubenswrapper[4962]: I0220 11:12:08.203427 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4f1374d6-d1c8-4b28-a524-485ced8ec7b9-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"4f1374d6-d1c8-4b28-a524-485ced8ec7b9\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 11:12:08 crc kubenswrapper[4962]: I0220 11:12:08.203498 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4f1374d6-d1c8-4b28-a524-485ced8ec7b9-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"4f1374d6-d1c8-4b28-a524-485ced8ec7b9\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 11:12:08 crc kubenswrapper[4962]: I0220 11:12:08.203528 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4f1374d6-d1c8-4b28-a524-485ced8ec7b9-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"4f1374d6-d1c8-4b28-a524-485ced8ec7b9\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 11:12:08 crc kubenswrapper[4962]: I0220 11:12:08.203622 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4f1374d6-d1c8-4b28-a524-485ced8ec7b9-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"4f1374d6-d1c8-4b28-a524-485ced8ec7b9\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 11:12:08 crc kubenswrapper[4962]: I0220 11:12:08.203653 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4f1374d6-d1c8-4b28-a524-485ced8ec7b9-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4f1374d6-d1c8-4b28-a524-485ced8ec7b9\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 11:12:08 crc kubenswrapper[4962]: I0220 11:12:08.239269 4962 generic.go:334] "Generic (PLEG): container finished" podID="a26573eb-419d-4ead-b747-2cc004252564" containerID="6689f5e10dfc12c1ac744c290572285d70a93d46570a4d4716bc2776fb56e570" exitCode=0 Feb 20 11:12:08 crc kubenswrapper[4962]: I0220 11:12:08.239337 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c4c8f55b5-mg9lz" 
event={"ID":"a26573eb-419d-4ead-b747-2cc004252564","Type":"ContainerDied","Data":"6689f5e10dfc12c1ac744c290572285d70a93d46570a4d4716bc2776fb56e570"} Feb 20 11:12:08 crc kubenswrapper[4962]: I0220 11:12:08.239577 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c4c8f55b5-mg9lz" event={"ID":"a26573eb-419d-4ead-b747-2cc004252564","Type":"ContainerStarted","Data":"588013732c3631c15df27205354a3f5d50e66c6de81609620c23fca9c83b06f2"} Feb 20 11:12:08 crc kubenswrapper[4962]: I0220 11:12:08.241491 4962 generic.go:334] "Generic (PLEG): container finished" podID="10021bed-f80b-491c-8326-88df1a07c1f7" containerID="d7bfd7bc38a8dbbd8ca5b64b18b808624fe45d14c18059530aa5d5534f381ec4" exitCode=0 Feb 20 11:12:08 crc kubenswrapper[4962]: I0220 11:12:08.241532 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589cf688cc-62bkq" event={"ID":"10021bed-f80b-491c-8326-88df1a07c1f7","Type":"ContainerDied","Data":"d7bfd7bc38a8dbbd8ca5b64b18b808624fe45d14c18059530aa5d5534f381ec4"} Feb 20 11:12:08 crc kubenswrapper[4962]: I0220 11:12:08.241570 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589cf688cc-62bkq" event={"ID":"10021bed-f80b-491c-8326-88df1a07c1f7","Type":"ContainerStarted","Data":"e47db261c2bfbeaecffbda8de857bedb2294d41db3b322ddd15f80e8ba9d53f1"} Feb 20 11:12:08 crc kubenswrapper[4962]: I0220 11:12:08.305046 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhgp7\" (UniqueName: \"kubernetes.io/projected/4f1374d6-d1c8-4b28-a524-485ced8ec7b9-kube-api-access-mhgp7\") pod \"rabbitmq-cell1-server-0\" (UID: \"4f1374d6-d1c8-4b28-a524-485ced8ec7b9\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 11:12:08 crc kubenswrapper[4962]: I0220 11:12:08.305232 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4f1374d6-d1c8-4b28-a524-485ced8ec7b9-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"4f1374d6-d1c8-4b28-a524-485ced8ec7b9\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 11:12:08 crc kubenswrapper[4962]: I0220 11:12:08.305333 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4f1374d6-d1c8-4b28-a524-485ced8ec7b9-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4f1374d6-d1c8-4b28-a524-485ced8ec7b9\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 11:12:08 crc kubenswrapper[4962]: I0220 11:12:08.305639 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-1ca3fa62-a0d4-4c0d-9b96-a93f8a0cedc2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1ca3fa62-a0d4-4c0d-9b96-a93f8a0cedc2\") pod \"rabbitmq-cell1-server-0\" (UID: \"4f1374d6-d1c8-4b28-a524-485ced8ec7b9\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 11:12:08 crc kubenswrapper[4962]: I0220 11:12:08.305911 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4f1374d6-d1c8-4b28-a524-485ced8ec7b9-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"4f1374d6-d1c8-4b28-a524-485ced8ec7b9\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 11:12:08 crc kubenswrapper[4962]: I0220 11:12:08.306025 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/4f1374d6-d1c8-4b28-a524-485ced8ec7b9-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"4f1374d6-d1c8-4b28-a524-485ced8ec7b9\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 11:12:08 crc kubenswrapper[4962]: I0220 11:12:08.306098 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4f1374d6-d1c8-4b28-a524-485ced8ec7b9-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"4f1374d6-d1c8-4b28-a524-485ced8ec7b9\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 11:12:08 crc kubenswrapper[4962]: I0220 11:12:08.306186 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4f1374d6-d1c8-4b28-a524-485ced8ec7b9-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"4f1374d6-d1c8-4b28-a524-485ced8ec7b9\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 11:12:08 crc kubenswrapper[4962]: I0220 11:12:08.306250 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4f1374d6-d1c8-4b28-a524-485ced8ec7b9-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4f1374d6-d1c8-4b28-a524-485ced8ec7b9\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 11:12:08 crc kubenswrapper[4962]: I0220 11:12:08.309198 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4f1374d6-d1c8-4b28-a524-485ced8ec7b9-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"4f1374d6-d1c8-4b28-a524-485ced8ec7b9\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 11:12:08 crc kubenswrapper[4962]: I0220 11:12:08.309484 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4f1374d6-d1c8-4b28-a524-485ced8ec7b9-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"4f1374d6-d1c8-4b28-a524-485ced8ec7b9\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 11:12:08 crc kubenswrapper[4962]: I0220 11:12:08.310010 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4f1374d6-d1c8-4b28-a524-485ced8ec7b9-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4f1374d6-d1c8-4b28-a524-485ced8ec7b9\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 11:12:08 crc kubenswrapper[4962]: I0220 11:12:08.310126 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4f1374d6-d1c8-4b28-a524-485ced8ec7b9-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4f1374d6-d1c8-4b28-a524-485ced8ec7b9\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 11:12:08 crc kubenswrapper[4962]: I0220 11:12:08.314521 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4f1374d6-d1c8-4b28-a524-485ced8ec7b9-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"4f1374d6-d1c8-4b28-a524-485ced8ec7b9\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 11:12:08 crc kubenswrapper[4962]: I0220 11:12:08.316219 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4f1374d6-d1c8-4b28-a524-485ced8ec7b9-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"4f1374d6-d1c8-4b28-a524-485ced8ec7b9\") " 
pod="openstack/rabbitmq-cell1-server-0" Feb 20 11:12:08 crc kubenswrapper[4962]: I0220 11:12:08.318324 4962 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 20 11:12:08 crc kubenswrapper[4962]: I0220 11:12:08.318370 4962 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-1ca3fa62-a0d4-4c0d-9b96-a93f8a0cedc2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1ca3fa62-a0d4-4c0d-9b96-a93f8a0cedc2\") pod \"rabbitmq-cell1-server-0\" (UID: \"4f1374d6-d1c8-4b28-a524-485ced8ec7b9\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/07545b54426f40b00bb4c13cc1f9fe59b7b4a09fd52fcbea77ec3ff6291e7b54/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Feb 20 11:12:08 crc kubenswrapper[4962]: I0220 11:12:08.322311 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhgp7\" (UniqueName: \"kubernetes.io/projected/4f1374d6-d1c8-4b28-a524-485ced8ec7b9-kube-api-access-mhgp7\") pod \"rabbitmq-cell1-server-0\" (UID: \"4f1374d6-d1c8-4b28-a524-485ced8ec7b9\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 11:12:08 crc kubenswrapper[4962]: I0220 11:12:08.323235 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4f1374d6-d1c8-4b28-a524-485ced8ec7b9-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"4f1374d6-d1c8-4b28-a524-485ced8ec7b9\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 11:12:08 crc kubenswrapper[4962]: I0220 11:12:08.371191 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-1ca3fa62-a0d4-4c0d-9b96-a93f8a0cedc2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1ca3fa62-a0d4-4c0d-9b96-a93f8a0cedc2\") pod \"rabbitmq-cell1-server-0\" (UID: \"4f1374d6-d1c8-4b28-a524-485ced8ec7b9\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 11:12:08 crc kubenswrapper[4962]: I0220 11:12:08.467905 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 20 11:12:08 crc kubenswrapper[4962]: I0220 11:12:08.897404 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 20 11:12:08 crc kubenswrapper[4962]: W0220 11:12:08.904747 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f1374d6_d1c8_4b28_a524_485ced8ec7b9.slice/crio-e1056d0a030531abe426fcf2534e980a4f222a18abda0f16601c5fd299837136 WatchSource:0}: Error finding container e1056d0a030531abe426fcf2534e980a4f222a18abda0f16601c5fd299837136: Status 404 returned error can't find the container with id e1056d0a030531abe426fcf2534e980a4f222a18abda0f16601c5fd299837136 Feb 20 11:12:09 crc kubenswrapper[4962]: I0220 11:12:09.253155 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c4c8f55b5-mg9lz" event={"ID":"a26573eb-419d-4ead-b747-2cc004252564","Type":"ContainerStarted","Data":"fa284a0334c381e440549c57de6765ba5295cf19c660d26d36e6197ccacc88fa"} Feb 20 11:12:09 crc kubenswrapper[4962]: I0220 11:12:09.255617 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7c4c8f55b5-mg9lz" Feb 20 11:12:09 crc kubenswrapper[4962]: I0220 11:12:09.255814 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589cf688cc-62bkq" event={"ID":"10021bed-f80b-491c-8326-88df1a07c1f7","Type":"ContainerStarted","Data":"afbd6f1bea126f3c4967c55326ee4f10ece6c77611d27a88e91672c7cb7e01b8"} Feb 20 11:12:09 crc kubenswrapper[4962]: I0220 11:12:09.256029 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-589cf688cc-62bkq" Feb 20 11:12:09 crc kubenswrapper[4962]: I0220 11:12:09.259263 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4f1374d6-d1c8-4b28-a524-485ced8ec7b9","Type":"ContainerStarted","Data":"e1056d0a030531abe426fcf2534e980a4f222a18abda0f16601c5fd299837136"} Feb 20 11:12:09 crc kubenswrapper[4962]: I0220 11:12:09.272332 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7c4c8f55b5-mg9lz" podStartSLOduration=3.272311217 podStartE2EDuration="3.272311217s" podCreationTimestamp="2026-02-20 11:12:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 11:12:09.270817561 +0000 UTC m=+4620.853289447" watchObservedRunningTime="2026-02-20 11:12:09.272311217 +0000 UTC m=+4620.854783063" Feb 20 11:12:09 crc kubenswrapper[4962]: I0220 11:12:09.289035 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-589cf688cc-62bkq" podStartSLOduration=3.289019749 podStartE2EDuration="3.289019749s" podCreationTimestamp="2026-02-20 11:12:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 11:12:09.287420839 +0000 UTC m=+4620.869892685" watchObservedRunningTime="2026-02-20 11:12:09.289019749 +0000 UTC m=+4620.871491595" Feb 20 11:12:09 crc kubenswrapper[4962]: I0220 11:12:09.481189 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Feb 20 11:12:09 crc kubenswrapper[4962]: I0220 11:12:09.483130 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Feb 20 11:12:09 crc kubenswrapper[4962]: I0220 11:12:09.487354 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-s7pjk" Feb 20 11:12:09 crc kubenswrapper[4962]: I0220 11:12:09.487419 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Feb 20 11:12:09 crc kubenswrapper[4962]: I0220 11:12:09.488647 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Feb 20 11:12:09 crc kubenswrapper[4962]: I0220 11:12:09.490298 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Feb 20 11:12:09 crc kubenswrapper[4962]: I0220 11:12:09.505270 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Feb 20 11:12:09 crc kubenswrapper[4962]: I0220 11:12:09.511162 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 20 11:12:09 crc kubenswrapper[4962]: I0220 11:12:09.642772 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f2ffa3bc-ffbe-4a42-b14f-48aa20546210-config-data-generated\") pod \"openstack-galera-0\" (UID: \"f2ffa3bc-ffbe-4a42-b14f-48aa20546210\") " pod="openstack/openstack-galera-0" Feb 20 11:12:09 crc kubenswrapper[4962]: I0220 11:12:09.642822 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2ffa3bc-ffbe-4a42-b14f-48aa20546210-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"f2ffa3bc-ffbe-4a42-b14f-48aa20546210\") " pod="openstack/openstack-galera-0" Feb 20 11:12:09 crc kubenswrapper[4962]: I0220 11:12:09.642851 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgx9s\" (UniqueName: \"kubernetes.io/projected/f2ffa3bc-ffbe-4a42-b14f-48aa20546210-kube-api-access-hgx9s\") pod \"openstack-galera-0\" (UID: \"f2ffa3bc-ffbe-4a42-b14f-48aa20546210\") " pod="openstack/openstack-galera-0" Feb 20 11:12:09 crc kubenswrapper[4962]: I0220 11:12:09.642959 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2ffa3bc-ffbe-4a42-b14f-48aa20546210-operator-scripts\") pod \"openstack-galera-0\" (UID: \"f2ffa3bc-ffbe-4a42-b14f-48aa20546210\") " pod="openstack/openstack-galera-0" Feb 20 11:12:09 crc kubenswrapper[4962]: I0220 11:12:09.642983 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-f3c901b9-447f-4c6e-ac29-576dc096ce25\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f3c901b9-447f-4c6e-ac29-576dc096ce25\") pod \"openstack-galera-0\" (UID: \"f2ffa3bc-ffbe-4a42-b14f-48aa20546210\") " pod="openstack/openstack-galera-0" Feb 20 11:12:09 crc kubenswrapper[4962]: I0220 11:12:09.642999 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f2ffa3bc-ffbe-4a42-b14f-48aa20546210-config-data-default\") pod \"openstack-galera-0\" (UID: \"f2ffa3bc-ffbe-4a42-b14f-48aa20546210\") " pod="openstack/openstack-galera-0" Feb 20 11:12:09 crc kubenswrapper[4962]: I0220 11:12:09.643019 4962 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2ffa3bc-ffbe-4a42-b14f-48aa20546210-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"f2ffa3bc-ffbe-4a42-b14f-48aa20546210\") " pod="openstack/openstack-galera-0" Feb 20 11:12:09 crc kubenswrapper[4962]: I0220 11:12:09.643054 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f2ffa3bc-ffbe-4a42-b14f-48aa20546210-kolla-config\") pod \"openstack-galera-0\" (UID: \"f2ffa3bc-ffbe-4a42-b14f-48aa20546210\") " pod="openstack/openstack-galera-0" Feb 20 11:12:09 crc kubenswrapper[4962]: I0220 11:12:09.745891 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-f3c901b9-447f-4c6e-ac29-576dc096ce25\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f3c901b9-447f-4c6e-ac29-576dc096ce25\") pod \"openstack-galera-0\" (UID: \"f2ffa3bc-ffbe-4a42-b14f-48aa20546210\") " pod="openstack/openstack-galera-0" Feb 20 11:12:09 crc kubenswrapper[4962]: I0220 11:12:09.746313 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f2ffa3bc-ffbe-4a42-b14f-48aa20546210-config-data-default\") pod \"openstack-galera-0\" (UID: \"f2ffa3bc-ffbe-4a42-b14f-48aa20546210\") " pod="openstack/openstack-galera-0" Feb 20 11:12:09 crc kubenswrapper[4962]: I0220 11:12:09.746353 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2ffa3bc-ffbe-4a42-b14f-48aa20546210-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"f2ffa3bc-ffbe-4a42-b14f-48aa20546210\") " pod="openstack/openstack-galera-0" Feb 20 11:12:09 crc kubenswrapper[4962]: I0220 11:12:09.746388 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f2ffa3bc-ffbe-4a42-b14f-48aa20546210-kolla-config\") pod \"openstack-galera-0\" (UID: \"f2ffa3bc-ffbe-4a42-b14f-48aa20546210\") " pod="openstack/openstack-galera-0" Feb 20 11:12:09 crc kubenswrapper[4962]: I0220 11:12:09.746417 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f2ffa3bc-ffbe-4a42-b14f-48aa20546210-config-data-generated\") pod \"openstack-galera-0\" (UID: \"f2ffa3bc-ffbe-4a42-b14f-48aa20546210\") " pod="openstack/openstack-galera-0" Feb 20 11:12:09 crc kubenswrapper[4962]: I0220 11:12:09.746439 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2ffa3bc-ffbe-4a42-b14f-48aa20546210-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"f2ffa3bc-ffbe-4a42-b14f-48aa20546210\") " pod="openstack/openstack-galera-0" Feb 20 11:12:09 crc kubenswrapper[4962]: I0220 11:12:09.746461 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgx9s\" (UniqueName: \"kubernetes.io/projected/f2ffa3bc-ffbe-4a42-b14f-48aa20546210-kube-api-access-hgx9s\") pod \"openstack-galera-0\" (UID: \"f2ffa3bc-ffbe-4a42-b14f-48aa20546210\") " pod="openstack/openstack-galera-0" Feb 20 11:12:09 crc kubenswrapper[4962]: I0220 11:12:09.746524 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/f2ffa3bc-ffbe-4a42-b14f-48aa20546210-operator-scripts\") pod \"openstack-galera-0\" (UID: \"f2ffa3bc-ffbe-4a42-b14f-48aa20546210\") " pod="openstack/openstack-galera-0" Feb 20 11:12:09 crc kubenswrapper[4962]: I0220 11:12:09.749268 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f2ffa3bc-ffbe-4a42-b14f-48aa20546210-config-data-default\") pod \"openstack-galera-0\" (UID: \"f2ffa3bc-ffbe-4a42-b14f-48aa20546210\") " pod="openstack/openstack-galera-0" Feb 20 11:12:09 crc kubenswrapper[4962]: I0220 11:12:09.749521 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f2ffa3bc-ffbe-4a42-b14f-48aa20546210-config-data-generated\") pod \"openstack-galera-0\" (UID: \"f2ffa3bc-ffbe-4a42-b14f-48aa20546210\") " pod="openstack/openstack-galera-0" Feb 20 11:12:09 crc kubenswrapper[4962]: I0220 11:12:09.749732 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2ffa3bc-ffbe-4a42-b14f-48aa20546210-operator-scripts\") pod \"openstack-galera-0\" (UID: \"f2ffa3bc-ffbe-4a42-b14f-48aa20546210\") " pod="openstack/openstack-galera-0" Feb 20 11:12:09 crc kubenswrapper[4962]: I0220 11:12:09.749849 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f2ffa3bc-ffbe-4a42-b14f-48aa20546210-kolla-config\") pod \"openstack-galera-0\" (UID: \"f2ffa3bc-ffbe-4a42-b14f-48aa20546210\") " pod="openstack/openstack-galera-0" Feb 20 11:12:09 crc kubenswrapper[4962]: I0220 11:12:09.751459 4962 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 20 11:12:09 crc kubenswrapper[4962]: I0220 11:12:09.751489 4962 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-f3c901b9-447f-4c6e-ac29-576dc096ce25\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f3c901b9-447f-4c6e-ac29-576dc096ce25\") pod \"openstack-galera-0\" (UID: \"f2ffa3bc-ffbe-4a42-b14f-48aa20546210\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/82af7e16ae6a26f5ab26562d44f8c6d5bb94bc45afb9982e337b9181f1f053f8/globalmount\"" pod="openstack/openstack-galera-0" Feb 20 11:12:09 crc kubenswrapper[4962]: I0220 11:12:09.766997 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2ffa3bc-ffbe-4a42-b14f-48aa20546210-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"f2ffa3bc-ffbe-4a42-b14f-48aa20546210\") " pod="openstack/openstack-galera-0" Feb 20 11:12:09 crc kubenswrapper[4962]: I0220 11:12:09.772820 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2ffa3bc-ffbe-4a42-b14f-48aa20546210-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"f2ffa3bc-ffbe-4a42-b14f-48aa20546210\") " pod="openstack/openstack-galera-0" Feb 20 11:12:09 crc kubenswrapper[4962]: I0220 11:12:09.788743 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgx9s\" (UniqueName: \"kubernetes.io/projected/f2ffa3bc-ffbe-4a42-b14f-48aa20546210-kube-api-access-hgx9s\") pod \"openstack-galera-0\" (UID: \"f2ffa3bc-ffbe-4a42-b14f-48aa20546210\") " pod="openstack/openstack-galera-0" Feb 20 11:12:09 crc kubenswrapper[4962]: I0220 11:12:09.793765 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-f3c901b9-447f-4c6e-ac29-576dc096ce25\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f3c901b9-447f-4c6e-ac29-576dc096ce25\") pod \"openstack-galera-0\" (UID: \"f2ffa3bc-ffbe-4a42-b14f-48aa20546210\") " pod="openstack/openstack-galera-0" Feb 20 11:12:09 crc kubenswrapper[4962]: I0220 11:12:09.815545 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Feb 20 11:12:10 crc kubenswrapper[4962]: I0220 11:12:10.070664 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Feb 20 11:12:10 crc kubenswrapper[4962]: I0220 11:12:10.082326 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Feb 20 11:12:10 crc kubenswrapper[4962]: I0220 11:12:10.084622 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Feb 20 11:12:10 crc kubenswrapper[4962]: I0220 11:12:10.085105 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-q28bt" Feb 20 11:12:10 crc kubenswrapper[4962]: I0220 11:12:10.090448 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 20 11:12:10 crc kubenswrapper[4962]: I0220 11:12:10.200181 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 20 11:12:10 crc kubenswrapper[4962]: I0220 11:12:10.252815 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8374d0f9-f4be-4f6b-88eb-4849a2be49e9-kolla-config\") pod \"memcached-0\" (UID: \"8374d0f9-f4be-4f6b-88eb-4849a2be49e9\") " pod="openstack/memcached-0" Feb 20 11:12:10 crc kubenswrapper[4962]: I0220 11:12:10.252851 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4gn5\" (UniqueName: \"kubernetes.io/projected/8374d0f9-f4be-4f6b-88eb-4849a2be49e9-kube-api-access-m4gn5\") pod \"memcached-0\" (UID: \"8374d0f9-f4be-4f6b-88eb-4849a2be49e9\") " pod="openstack/memcached-0" Feb 20 11:12:10 crc kubenswrapper[4962]: I0220 11:12:10.252876 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8374d0f9-f4be-4f6b-88eb-4849a2be49e9-config-data\") pod \"memcached-0\" (UID: \"8374d0f9-f4be-4f6b-88eb-4849a2be49e9\") " pod="openstack/memcached-0" Feb 20 11:12:10 crc kubenswrapper[4962]: I0220 11:12:10.268201 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"f2ffa3bc-ffbe-4a42-b14f-48aa20546210","Type":"ContainerStarted","Data":"185492e6f361b7a6ca14d565e54e7d3405c3d4ed383de858a246ebc0de2df704"} Feb 20 11:12:10 crc kubenswrapper[4962]: I0220 11:12:10.356130 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8374d0f9-f4be-4f6b-88eb-4849a2be49e9-kolla-config\") pod \"memcached-0\" (UID: \"8374d0f9-f4be-4f6b-88eb-4849a2be49e9\") " pod="openstack/memcached-0" Feb 20 11:12:10 crc kubenswrapper[4962]: I0220 11:12:10.356249 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4gn5\" (UniqueName: \"kubernetes.io/projected/8374d0f9-f4be-4f6b-88eb-4849a2be49e9-kube-api-access-m4gn5\") pod \"memcached-0\" (UID: \"8374d0f9-f4be-4f6b-88eb-4849a2be49e9\") " pod="openstack/memcached-0" Feb 20 11:12:10 crc kubenswrapper[4962]: I0220 11:12:10.356291 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8374d0f9-f4be-4f6b-88eb-4849a2be49e9-config-data\") pod \"memcached-0\" (UID: \"8374d0f9-f4be-4f6b-88eb-4849a2be49e9\") " pod="openstack/memcached-0" Feb 20 11:12:10 crc kubenswrapper[4962]: I0220 11:12:10.358128 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8374d0f9-f4be-4f6b-88eb-4849a2be49e9-config-data\") pod \"memcached-0\" (UID: \"8374d0f9-f4be-4f6b-88eb-4849a2be49e9\") " pod="openstack/memcached-0" Feb 20 
11:12:10 crc kubenswrapper[4962]: I0220 11:12:10.358485 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8374d0f9-f4be-4f6b-88eb-4849a2be49e9-kolla-config\") pod \"memcached-0\" (UID: \"8374d0f9-f4be-4f6b-88eb-4849a2be49e9\") " pod="openstack/memcached-0" Feb 20 11:12:10 crc kubenswrapper[4962]: I0220 11:12:10.373083 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4gn5\" (UniqueName: \"kubernetes.io/projected/8374d0f9-f4be-4f6b-88eb-4849a2be49e9-kube-api-access-m4gn5\") pod \"memcached-0\" (UID: \"8374d0f9-f4be-4f6b-88eb-4849a2be49e9\") " pod="openstack/memcached-0" Feb 20 11:12:10 crc kubenswrapper[4962]: I0220 11:12:10.418173 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 20 11:12:10 crc kubenswrapper[4962]: I0220 11:12:10.865947 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 20 11:12:10 crc kubenswrapper[4962]: W0220 11:12:10.867990 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8374d0f9_f4be_4f6b_88eb_4849a2be49e9.slice/crio-87ca5265edccf24fde85ac2152d83a73395c38eb59fdfed628e92b1693c91747 WatchSource:0}: Error finding container 87ca5265edccf24fde85ac2152d83a73395c38eb59fdfed628e92b1693c91747: Status 404 returned error can't find the container with id 87ca5265edccf24fde85ac2152d83a73395c38eb59fdfed628e92b1693c91747 Feb 20 11:12:11 crc kubenswrapper[4962]: I0220 11:12:11.094070 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 20 11:12:11 crc kubenswrapper[4962]: I0220 11:12:11.095490 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 20 11:12:11 crc kubenswrapper[4962]: I0220 11:12:11.100332 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Feb 20 11:12:11 crc kubenswrapper[4962]: I0220 11:12:11.100546 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Feb 20 11:12:11 crc kubenswrapper[4962]: I0220 11:12:11.100725 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Feb 20 11:12:11 crc kubenswrapper[4962]: I0220 11:12:11.100922 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-mpsfv" Feb 20 11:12:11 crc kubenswrapper[4962]: I0220 11:12:11.108514 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 20 11:12:11 crc kubenswrapper[4962]: I0220 11:12:11.269440 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/97ae547e-e977-4b15-a979-38415ee77885-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"97ae547e-e977-4b15-a979-38415ee77885\") " pod="openstack/openstack-cell1-galera-0" Feb 20 11:12:11 crc kubenswrapper[4962]: I0220 11:12:11.270145 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/97ae547e-e977-4b15-a979-38415ee77885-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"97ae547e-e977-4b15-a979-38415ee77885\") " pod="openstack/openstack-cell1-galera-0" Feb 20 11:12:11 crc kubenswrapper[4962]: I0220 11:12:11.270219 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xtvz\" (UniqueName: \"kubernetes.io/projected/97ae547e-e977-4b15-a979-38415ee77885-kube-api-access-7xtvz\") pod \"openstack-cell1-galera-0\" (UID: \"97ae547e-e977-4b15-a979-38415ee77885\") " pod="openstack/openstack-cell1-galera-0" Feb 20 11:12:11 crc kubenswrapper[4962]: I0220 11:12:11.270253 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/97ae547e-e977-4b15-a979-38415ee77885-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"97ae547e-e977-4b15-a979-38415ee77885\") " pod="openstack/openstack-cell1-galera-0" Feb 20 11:12:11 crc kubenswrapper[4962]: I0220 11:12:11.270292 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/97ae547e-e977-4b15-a979-38415ee77885-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"97ae547e-e977-4b15-a979-38415ee77885\") " pod="openstack/openstack-cell1-galera-0" Feb 20 11:12:11 crc kubenswrapper[4962]: I0220 11:12:11.270340 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/97ae547e-e977-4b15-a979-38415ee77885-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"97ae547e-e977-4b15-a979-38415ee77885\") " pod="openstack/openstack-cell1-galera-0" Feb 20 11:12:11 crc kubenswrapper[4962]: I0220 11:12:11.270393 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"pvc-4212cde7-983e-4a29-b847-21233e7ce523\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4212cde7-983e-4a29-b847-21233e7ce523\") pod \"openstack-cell1-galera-0\" (UID: \"97ae547e-e977-4b15-a979-38415ee77885\") " pod="openstack/openstack-cell1-galera-0" Feb 20 11:12:11 crc kubenswrapper[4962]: I0220 11:12:11.270435 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97ae547e-e977-4b15-a979-38415ee77885-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"97ae547e-e977-4b15-a979-38415ee77885\") " pod="openstack/openstack-cell1-galera-0" Feb 20 11:12:11 crc kubenswrapper[4962]: I0220 11:12:11.278750 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4f1374d6-d1c8-4b28-a524-485ced8ec7b9","Type":"ContainerStarted","Data":"ea699931b3b6d1154f7563dc0e7c455f7597120eb0a07bf4c657255d11998dcb"} Feb 20 11:12:11 crc kubenswrapper[4962]: I0220 11:12:11.280510 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"f2ffa3bc-ffbe-4a42-b14f-48aa20546210","Type":"ContainerStarted","Data":"2b549c0faa264bbd60d3323fce20c7bbdb66db97b9e039a78354cbddfca9fde0"} Feb 20 11:12:11 crc kubenswrapper[4962]: I0220 11:12:11.284490 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"8374d0f9-f4be-4f6b-88eb-4849a2be49e9","Type":"ContainerStarted","Data":"f9580210b6730fa5555dcbac945cd9238c8dea64942334e413e20cbfb558a662"} Feb 20 11:12:11 crc kubenswrapper[4962]: I0220 11:12:11.284543 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"8374d0f9-f4be-4f6b-88eb-4849a2be49e9","Type":"ContainerStarted","Data":"87ca5265edccf24fde85ac2152d83a73395c38eb59fdfed628e92b1693c91747"} Feb 20 11:12:11 crc kubenswrapper[4962]: I0220 11:12:11.285168 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Feb 20 11:12:11 crc kubenswrapper[4962]: I0220 11:12:11.351018 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=1.350992115 podStartE2EDuration="1.350992115s" podCreationTimestamp="2026-02-20 11:12:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 11:12:11.344259409 +0000 UTC m=+4622.926731255" watchObservedRunningTime="2026-02-20 11:12:11.350992115 +0000 UTC m=+4622.933463961" Feb 20 11:12:11 crc kubenswrapper[4962]: I0220 11:12:11.372286 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/97ae547e-e977-4b15-a979-38415ee77885-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"97ae547e-e977-4b15-a979-38415ee77885\") " pod="openstack/openstack-cell1-galera-0" Feb 20 11:12:11 crc kubenswrapper[4962]: I0220 11:12:11.372392 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/97ae547e-e977-4b15-a979-38415ee77885-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"97ae547e-e977-4b15-a979-38415ee77885\") " pod="openstack/openstack-cell1-galera-0" Feb 20 11:12:11 crc kubenswrapper[4962]: I0220 11:12:11.372447 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xtvz\" 
(UniqueName: \"kubernetes.io/projected/97ae547e-e977-4b15-a979-38415ee77885-kube-api-access-7xtvz\") pod \"openstack-cell1-galera-0\" (UID: \"97ae547e-e977-4b15-a979-38415ee77885\") " pod="openstack/openstack-cell1-galera-0" Feb 20 11:12:11 crc kubenswrapper[4962]: I0220 11:12:11.372485 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/97ae547e-e977-4b15-a979-38415ee77885-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"97ae547e-e977-4b15-a979-38415ee77885\") " pod="openstack/openstack-cell1-galera-0" Feb 20 11:12:11 crc kubenswrapper[4962]: I0220 11:12:11.372526 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/97ae547e-e977-4b15-a979-38415ee77885-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"97ae547e-e977-4b15-a979-38415ee77885\") " pod="openstack/openstack-cell1-galera-0" Feb 20 11:12:11 crc kubenswrapper[4962]: I0220 11:12:11.372559 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/97ae547e-e977-4b15-a979-38415ee77885-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"97ae547e-e977-4b15-a979-38415ee77885\") " pod="openstack/openstack-cell1-galera-0" Feb 20 11:12:11 crc kubenswrapper[4962]: I0220 11:12:11.372645 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-4212cde7-983e-4a29-b847-21233e7ce523\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4212cde7-983e-4a29-b847-21233e7ce523\") pod \"openstack-cell1-galera-0\" (UID: \"97ae547e-e977-4b15-a979-38415ee77885\") " pod="openstack/openstack-cell1-galera-0" Feb 20 11:12:11 crc kubenswrapper[4962]: I0220 11:12:11.372688 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97ae547e-e977-4b15-a979-38415ee77885-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"97ae547e-e977-4b15-a979-38415ee77885\") " pod="openstack/openstack-cell1-galera-0" Feb 20 11:12:11 crc kubenswrapper[4962]: I0220 11:12:11.377242 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/97ae547e-e977-4b15-a979-38415ee77885-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"97ae547e-e977-4b15-a979-38415ee77885\") " pod="openstack/openstack-cell1-galera-0" Feb 20 11:12:11 crc kubenswrapper[4962]: I0220 11:12:11.377399 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/97ae547e-e977-4b15-a979-38415ee77885-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"97ae547e-e977-4b15-a979-38415ee77885\") " pod="openstack/openstack-cell1-galera-0" Feb 20 11:12:11 crc kubenswrapper[4962]: I0220 11:12:11.377429 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/97ae547e-e977-4b15-a979-38415ee77885-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"97ae547e-e977-4b15-a979-38415ee77885\") " pod="openstack/openstack-cell1-galera-0" Feb 20 11:12:11 crc kubenswrapper[4962]: I0220 11:12:11.378171 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/97ae547e-e977-4b15-a979-38415ee77885-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"97ae547e-e977-4b15-a979-38415ee77885\") " pod="openstack/openstack-cell1-galera-0" Feb 20 11:12:11 crc kubenswrapper[4962]: I0220 11:12:11.379883 4962 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 20 11:12:11 crc kubenswrapper[4962]: I0220 11:12:11.380256 4962 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-4212cde7-983e-4a29-b847-21233e7ce523\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4212cde7-983e-4a29-b847-21233e7ce523\") pod \"openstack-cell1-galera-0\" (UID: \"97ae547e-e977-4b15-a979-38415ee77885\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/9baade218bdce1757add9c2b3a768cfc65cf332cf4ef6807977bf89c1521c62b/globalmount\"" pod="openstack/openstack-cell1-galera-0" Feb 20 11:12:11 crc kubenswrapper[4962]: I0220 11:12:11.384398 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/97ae547e-e977-4b15-a979-38415ee77885-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"97ae547e-e977-4b15-a979-38415ee77885\") " pod="openstack/openstack-cell1-galera-0" Feb 20 11:12:11 crc kubenswrapper[4962]: I0220 11:12:11.385251 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97ae547e-e977-4b15-a979-38415ee77885-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"97ae547e-e977-4b15-a979-38415ee77885\") " pod="openstack/openstack-cell1-galera-0" Feb 20 11:12:11 crc kubenswrapper[4962]: I0220 11:12:11.410281 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xtvz\" (UniqueName: \"kubernetes.io/projected/97ae547e-e977-4b15-a979-38415ee77885-kube-api-access-7xtvz\") pod \"openstack-cell1-galera-0\" (UID: \"97ae547e-e977-4b15-a979-38415ee77885\") " pod="openstack/openstack-cell1-galera-0" Feb 20 11:12:11 crc kubenswrapper[4962]: I0220 11:12:11.414528 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-4212cde7-983e-4a29-b847-21233e7ce523\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4212cde7-983e-4a29-b847-21233e7ce523\") pod \"openstack-cell1-galera-0\" (UID: \"97ae547e-e977-4b15-a979-38415ee77885\") " pod="openstack/openstack-cell1-galera-0" Feb 20 11:12:11 crc kubenswrapper[4962]: I0220 11:12:11.712867 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 20 11:12:12 crc kubenswrapper[4962]: I0220 11:12:12.184648 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 20 11:12:12 crc kubenswrapper[4962]: I0220 11:12:12.298398 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"97ae547e-e977-4b15-a979-38415ee77885","Type":"ContainerStarted","Data":"6411c1a34832bdf05020f42bd380f9bcf0fb50e356efafde3c7279008c8478c1"} Feb 20 11:12:13 crc kubenswrapper[4962]: I0220 11:12:13.309286 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"97ae547e-e977-4b15-a979-38415ee77885","Type":"ContainerStarted","Data":"12ab0d1167bdad63132aac820b88ba53b430b6ab81fb79e02b3560ce10a49d27"} Feb 20 11:12:14 crc kubenswrapper[4962]: I0220 11:12:14.326642 4962 generic.go:334] "Generic (PLEG): container finished" podID="f2ffa3bc-ffbe-4a42-b14f-48aa20546210" containerID="2b549c0faa264bbd60d3323fce20c7bbdb66db97b9e039a78354cbddfca9fde0" exitCode=0 Feb 20 11:12:14 crc kubenswrapper[4962]: I0220 11:12:14.326712 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"f2ffa3bc-ffbe-4a42-b14f-48aa20546210","Type":"ContainerDied","Data":"2b549c0faa264bbd60d3323fce20c7bbdb66db97b9e039a78354cbddfca9fde0"} Feb 20 11:12:15 crc kubenswrapper[4962]: I0220 11:12:15.340734 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"f2ffa3bc-ffbe-4a42-b14f-48aa20546210","Type":"ContainerStarted","Data":"71b411cb31dee0e1edef65607970ddb4ab5d4d6ff0b65b7389774037c8b712f7"} Feb 20 11:12:15 crc kubenswrapper[4962]: I0220 11:12:15.382670 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=7.382637163 podStartE2EDuration="7.382637163s" podCreationTimestamp="2026-02-20 11:12:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 11:12:15.374354819 +0000 UTC m=+4626.956826725" watchObservedRunningTime="2026-02-20 11:12:15.382637163 +0000 UTC m=+4626.965109069" Feb 20 11:12:15 crc kubenswrapper[4962]: I0220 11:12:15.420149 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Feb 20 11:12:16 crc kubenswrapper[4962]: I0220 11:12:16.351520 4962 generic.go:334] "Generic (PLEG): container finished" podID="97ae547e-e977-4b15-a979-38415ee77885" containerID="12ab0d1167bdad63132aac820b88ba53b430b6ab81fb79e02b3560ce10a49d27" exitCode=0 Feb 20 11:12:16 crc kubenswrapper[4962]: I0220 11:12:16.351652 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"97ae547e-e977-4b15-a979-38415ee77885","Type":"ContainerDied","Data":"12ab0d1167bdad63132aac820b88ba53b430b6ab81fb79e02b3560ce10a49d27"} Feb 20 11:12:17 crc kubenswrapper[4962]: I0220 11:12:17.296875 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7c4c8f55b5-mg9lz" Feb 20 11:12:17 crc kubenswrapper[4962]: I0220 11:12:17.337284 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-589cf688cc-62bkq" Feb 20 11:12:17 crc kubenswrapper[4962]: I0220 11:12:17.377708 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"97ae547e-e977-4b15-a979-38415ee77885","Type":"ContainerStarted","Data":"7c8d74176f301a3b06823f2556b09390524ebb2cd8385ab222b25d929d3aae70"} Feb 20 11:12:17 crc kubenswrapper[4962]: I0220 11:12:17.409554 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c4c8f55b5-mg9lz"] Feb 20 11:12:17 crc kubenswrapper[4962]: I0220 11:12:17.409810 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7c4c8f55b5-mg9lz" podUID="a26573eb-419d-4ead-b747-2cc004252564" containerName="dnsmasq-dns" containerID="cri-o://fa284a0334c381e440549c57de6765ba5295cf19c660d26d36e6197ccacc88fa" gracePeriod=10 Feb 20 11:12:17 crc kubenswrapper[4962]: I0220 11:12:17.420019 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=7.419995626 podStartE2EDuration="7.419995626s" podCreationTimestamp="2026-02-20 11:12:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 11:12:17.414801207 +0000 UTC m=+4628.997273093" watchObservedRunningTime="2026-02-20 11:12:17.419995626 +0000 UTC m=+4629.002467482" Feb 20 11:12:17 crc kubenswrapper[4962]: I0220 11:12:17.890070 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c4c8f55b5-mg9lz" Feb 20 11:12:18 crc kubenswrapper[4962]: I0220 11:12:18.013993 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a26573eb-419d-4ead-b747-2cc004252564-config\") pod \"a26573eb-419d-4ead-b747-2cc004252564\" (UID: \"a26573eb-419d-4ead-b747-2cc004252564\") " Feb 20 11:12:18 crc kubenswrapper[4962]: I0220 11:12:18.014127 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a26573eb-419d-4ead-b747-2cc004252564-dns-svc\") pod \"a26573eb-419d-4ead-b747-2cc004252564\" (UID: \"a26573eb-419d-4ead-b747-2cc004252564\") " Feb 20 11:12:18 crc kubenswrapper[4962]: I0220 11:12:18.014201 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rv2pl\" (UniqueName: \"kubernetes.io/projected/a26573eb-419d-4ead-b747-2cc004252564-kube-api-access-rv2pl\") pod \"a26573eb-419d-4ead-b747-2cc004252564\" (UID: \"a26573eb-419d-4ead-b747-2cc004252564\") " Feb 20 11:12:18 crc kubenswrapper[4962]: I0220 11:12:18.024811 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a26573eb-419d-4ead-b747-2cc004252564-kube-api-access-rv2pl" (OuterVolumeSpecName: "kube-api-access-rv2pl") pod "a26573eb-419d-4ead-b747-2cc004252564" (UID: "a26573eb-419d-4ead-b747-2cc004252564"). InnerVolumeSpecName "kube-api-access-rv2pl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 11:12:18 crc kubenswrapper[4962]: I0220 11:12:18.065197 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a26573eb-419d-4ead-b747-2cc004252564-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a26573eb-419d-4ead-b747-2cc004252564" (UID: "a26573eb-419d-4ead-b747-2cc004252564"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 11:12:18 crc kubenswrapper[4962]: I0220 11:12:18.068444 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a26573eb-419d-4ead-b747-2cc004252564-config" (OuterVolumeSpecName: "config") pod "a26573eb-419d-4ead-b747-2cc004252564" (UID: "a26573eb-419d-4ead-b747-2cc004252564"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 11:12:18 crc kubenswrapper[4962]: I0220 11:12:18.116607 4962 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a26573eb-419d-4ead-b747-2cc004252564-config\") on node \"crc\" DevicePath \"\"" Feb 20 11:12:18 crc kubenswrapper[4962]: I0220 11:12:18.116639 4962 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a26573eb-419d-4ead-b747-2cc004252564-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 20 11:12:18 crc kubenswrapper[4962]: I0220 11:12:18.116649 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rv2pl\" (UniqueName: \"kubernetes.io/projected/a26573eb-419d-4ead-b747-2cc004252564-kube-api-access-rv2pl\") on node \"crc\" DevicePath \"\"" Feb 20 11:12:18 crc kubenswrapper[4962]: I0220 11:12:18.389960 4962 generic.go:334] "Generic (PLEG): container finished" podID="a26573eb-419d-4ead-b747-2cc004252564" containerID="fa284a0334c381e440549c57de6765ba5295cf19c660d26d36e6197ccacc88fa" exitCode=0 Feb 20 11:12:18 crc kubenswrapper[4962]: I0220 11:12:18.390050 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c4c8f55b5-mg9lz" Feb 20 11:12:18 crc kubenswrapper[4962]: I0220 11:12:18.390089 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c4c8f55b5-mg9lz" event={"ID":"a26573eb-419d-4ead-b747-2cc004252564","Type":"ContainerDied","Data":"fa284a0334c381e440549c57de6765ba5295cf19c660d26d36e6197ccacc88fa"} Feb 20 11:12:18 crc kubenswrapper[4962]: I0220 11:12:18.390219 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c4c8f55b5-mg9lz" event={"ID":"a26573eb-419d-4ead-b747-2cc004252564","Type":"ContainerDied","Data":"588013732c3631c15df27205354a3f5d50e66c6de81609620c23fca9c83b06f2"} Feb 20 11:12:18 crc kubenswrapper[4962]: I0220 11:12:18.390265 4962 scope.go:117] "RemoveContainer" containerID="fa284a0334c381e440549c57de6765ba5295cf19c660d26d36e6197ccacc88fa" Feb 20 11:12:18 crc kubenswrapper[4962]: I0220 11:12:18.421434 4962 scope.go:117] "RemoveContainer" containerID="6689f5e10dfc12c1ac744c290572285d70a93d46570a4d4716bc2776fb56e570" Feb 20 11:12:18 crc kubenswrapper[4962]: I0220 11:12:18.443201 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c4c8f55b5-mg9lz"] Feb 20 11:12:18 crc kubenswrapper[4962]: I0220 11:12:18.450782 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7c4c8f55b5-mg9lz"] Feb 20 11:12:18 crc kubenswrapper[4962]: I0220 11:12:18.659328 4962 scope.go:117] "RemoveContainer" containerID="fa284a0334c381e440549c57de6765ba5295cf19c660d26d36e6197ccacc88fa" Feb 20 11:12:18 crc kubenswrapper[4962]: E0220 11:12:18.660047 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa284a0334c381e440549c57de6765ba5295cf19c660d26d36e6197ccacc88fa\": container with ID starting with fa284a0334c381e440549c57de6765ba5295cf19c660d26d36e6197ccacc88fa 
not found: ID does not exist" containerID="fa284a0334c381e440549c57de6765ba5295cf19c660d26d36e6197ccacc88fa" Feb 20 11:12:18 crc kubenswrapper[4962]: I0220 11:12:18.660097 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa284a0334c381e440549c57de6765ba5295cf19c660d26d36e6197ccacc88fa"} err="failed to get container status \"fa284a0334c381e440549c57de6765ba5295cf19c660d26d36e6197ccacc88fa\": rpc error: code = NotFound desc = could not find container \"fa284a0334c381e440549c57de6765ba5295cf19c660d26d36e6197ccacc88fa\": container with ID starting with fa284a0334c381e440549c57de6765ba5295cf19c660d26d36e6197ccacc88fa not found: ID does not exist" Feb 20 11:12:18 crc kubenswrapper[4962]: I0220 11:12:18.660131 4962 scope.go:117] "RemoveContainer" containerID="6689f5e10dfc12c1ac744c290572285d70a93d46570a4d4716bc2776fb56e570" Feb 20 11:12:18 crc kubenswrapper[4962]: E0220 11:12:18.660483 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6689f5e10dfc12c1ac744c290572285d70a93d46570a4d4716bc2776fb56e570\": container with ID starting with 6689f5e10dfc12c1ac744c290572285d70a93d46570a4d4716bc2776fb56e570 not found: ID does not exist" containerID="6689f5e10dfc12c1ac744c290572285d70a93d46570a4d4716bc2776fb56e570" Feb 20 11:12:18 crc kubenswrapper[4962]: I0220 11:12:18.660520 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6689f5e10dfc12c1ac744c290572285d70a93d46570a4d4716bc2776fb56e570"} err="failed to get container status \"6689f5e10dfc12c1ac744c290572285d70a93d46570a4d4716bc2776fb56e570\": rpc error: code = NotFound desc = could not find container \"6689f5e10dfc12c1ac744c290572285d70a93d46570a4d4716bc2776fb56e570\": container with ID starting with 6689f5e10dfc12c1ac744c290572285d70a93d46570a4d4716bc2776fb56e570 not found: ID does not exist" Feb 20 11:12:19 crc kubenswrapper[4962]: I0220 11:12:19.152278 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a26573eb-419d-4ead-b747-2cc004252564" path="/var/lib/kubelet/pods/a26573eb-419d-4ead-b747-2cc004252564/volumes" Feb 20 11:12:19 crc kubenswrapper[4962]: I0220 11:12:19.815695 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Feb 20 11:12:19 crc kubenswrapper[4962]: I0220 11:12:19.815774 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Feb 20 11:12:21 crc kubenswrapper[4962]: I0220 11:12:21.714833 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Feb 20 11:12:21 crc kubenswrapper[4962]: I0220 11:12:21.715214 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Feb 20 11:12:22 crc kubenswrapper[4962]: I0220 11:12:22.237928 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Feb 20 11:12:22 crc kubenswrapper[4962]: I0220 11:12:22.307583 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Feb 20 11:12:24 crc kubenswrapper[4962]: I0220 11:12:24.133876 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Feb 20 11:12:24 crc kubenswrapper[4962]: I0220 11:12:24.248960 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/openstack-cell1-galera-0" Feb 20 11:12:28 crc kubenswrapper[4962]: I0220 11:12:28.461634 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-bnx24"] Feb 20 11:12:28 crc kubenswrapper[4962]: E0220 11:12:28.462565 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a26573eb-419d-4ead-b747-2cc004252564" containerName="init" Feb 20 11:12:28 crc kubenswrapper[4962]: I0220 11:12:28.462610 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="a26573eb-419d-4ead-b747-2cc004252564" containerName="init" Feb 20 11:12:28 crc kubenswrapper[4962]: E0220 11:12:28.462621 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a26573eb-419d-4ead-b747-2cc004252564" containerName="dnsmasq-dns" Feb 20 11:12:28 crc kubenswrapper[4962]: I0220 11:12:28.462628 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="a26573eb-419d-4ead-b747-2cc004252564" containerName="dnsmasq-dns" Feb 20 11:12:28 crc kubenswrapper[4962]: I0220 11:12:28.462779 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="a26573eb-419d-4ead-b747-2cc004252564" containerName="dnsmasq-dns" Feb 20 11:12:28 crc kubenswrapper[4962]: I0220 11:12:28.463521 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-bnx24" Feb 20 11:12:28 crc kubenswrapper[4962]: I0220 11:12:28.466224 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Feb 20 11:12:28 crc kubenswrapper[4962]: I0220 11:12:28.491747 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-bnx24"] Feb 20 11:12:28 crc kubenswrapper[4962]: I0220 11:12:28.613896 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kr6n\" (UniqueName: \"kubernetes.io/projected/d14b3c9f-d912-4f57-8df9-6b20338707e5-kube-api-access-7kr6n\") pod \"root-account-create-update-bnx24\" (UID: \"d14b3c9f-d912-4f57-8df9-6b20338707e5\") " pod="openstack/root-account-create-update-bnx24" Feb 20 11:12:28 crc kubenswrapper[4962]: I0220 11:12:28.614295 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d14b3c9f-d912-4f57-8df9-6b20338707e5-operator-scripts\") pod \"root-account-create-update-bnx24\" (UID: \"d14b3c9f-d912-4f57-8df9-6b20338707e5\") " pod="openstack/root-account-create-update-bnx24" Feb 20 11:12:28 crc kubenswrapper[4962]: I0220 11:12:28.716315 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7kr6n\" (UniqueName: \"kubernetes.io/projected/d14b3c9f-d912-4f57-8df9-6b20338707e5-kube-api-access-7kr6n\") pod \"root-account-create-update-bnx24\" (UID: \"d14b3c9f-d912-4f57-8df9-6b20338707e5\") " pod="openstack/root-account-create-update-bnx24" Feb 20 11:12:28 crc kubenswrapper[4962]: I0220 11:12:28.717137 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d14b3c9f-d912-4f57-8df9-6b20338707e5-operator-scripts\") pod \"root-account-create-update-bnx24\" (UID: \"d14b3c9f-d912-4f57-8df9-6b20338707e5\") " pod="openstack/root-account-create-update-bnx24" Feb 20 11:12:28 crc kubenswrapper[4962]: I0220 11:12:28.718441 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/d14b3c9f-d912-4f57-8df9-6b20338707e5-operator-scripts\") pod \"root-account-create-update-bnx24\" (UID: \"d14b3c9f-d912-4f57-8df9-6b20338707e5\") " pod="openstack/root-account-create-update-bnx24" Feb 20 11:12:28 crc kubenswrapper[4962]: I0220 11:12:28.750217 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kr6n\" (UniqueName: \"kubernetes.io/projected/d14b3c9f-d912-4f57-8df9-6b20338707e5-kube-api-access-7kr6n\") pod \"root-account-create-update-bnx24\" (UID: \"d14b3c9f-d912-4f57-8df9-6b20338707e5\") " pod="openstack/root-account-create-update-bnx24" Feb 20 11:12:28 crc kubenswrapper[4962]: I0220 11:12:28.797425 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-bnx24" Feb 20 11:12:29 crc kubenswrapper[4962]: I0220 11:12:29.322842 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-bnx24"] Feb 20 11:12:29 crc kubenswrapper[4962]: W0220 11:12:29.589771 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd14b3c9f_d912_4f57_8df9_6b20338707e5.slice/crio-719fe85849ea4372a7fe8171ec3eb50211cfeef7e8aee9dbfa2f94e6c8e96786 WatchSource:0}: Error finding container 719fe85849ea4372a7fe8171ec3eb50211cfeef7e8aee9dbfa2f94e6c8e96786: Status 404 returned error can't find the container with id 719fe85849ea4372a7fe8171ec3eb50211cfeef7e8aee9dbfa2f94e6c8e96786 Feb 20 11:12:30 crc kubenswrapper[4962]: I0220 11:12:30.536017 4962 generic.go:334] "Generic (PLEG): container finished" podID="d14b3c9f-d912-4f57-8df9-6b20338707e5" containerID="f56f77a29f18b053790ac8a764373585312883ee763879c9fe012ee4ec5c65e1" exitCode=0 Feb 20 11:12:30 crc kubenswrapper[4962]: I0220 11:12:30.536145 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-bnx24" event={"ID":"d14b3c9f-d912-4f57-8df9-6b20338707e5","Type":"ContainerDied","Data":"f56f77a29f18b053790ac8a764373585312883ee763879c9fe012ee4ec5c65e1"} Feb 20 11:12:30 crc kubenswrapper[4962]: I0220 11:12:30.536383 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-bnx24" event={"ID":"d14b3c9f-d912-4f57-8df9-6b20338707e5","Type":"ContainerStarted","Data":"719fe85849ea4372a7fe8171ec3eb50211cfeef7e8aee9dbfa2f94e6c8e96786"} Feb 20 11:12:31 crc kubenswrapper[4962]: I0220 11:12:31.936776 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-bnx24" Feb 20 11:12:32 crc kubenswrapper[4962]: I0220 11:12:32.070643 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7kr6n\" (UniqueName: \"kubernetes.io/projected/d14b3c9f-d912-4f57-8df9-6b20338707e5-kube-api-access-7kr6n\") pod \"d14b3c9f-d912-4f57-8df9-6b20338707e5\" (UID: \"d14b3c9f-d912-4f57-8df9-6b20338707e5\") " Feb 20 11:12:32 crc kubenswrapper[4962]: I0220 11:12:32.070765 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d14b3c9f-d912-4f57-8df9-6b20338707e5-operator-scripts\") pod \"d14b3c9f-d912-4f57-8df9-6b20338707e5\" (UID: \"d14b3c9f-d912-4f57-8df9-6b20338707e5\") " Feb 20 11:12:32 crc kubenswrapper[4962]: I0220 11:12:32.071821 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d14b3c9f-d912-4f57-8df9-6b20338707e5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d14b3c9f-d912-4f57-8df9-6b20338707e5" (UID: "d14b3c9f-d912-4f57-8df9-6b20338707e5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 11:12:32 crc kubenswrapper[4962]: I0220 11:12:32.080981 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d14b3c9f-d912-4f57-8df9-6b20338707e5-kube-api-access-7kr6n" (OuterVolumeSpecName: "kube-api-access-7kr6n") pod "d14b3c9f-d912-4f57-8df9-6b20338707e5" (UID: "d14b3c9f-d912-4f57-8df9-6b20338707e5"). InnerVolumeSpecName "kube-api-access-7kr6n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 11:12:32 crc kubenswrapper[4962]: I0220 11:12:32.171918 4962 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d14b3c9f-d912-4f57-8df9-6b20338707e5-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 11:12:32 crc kubenswrapper[4962]: I0220 11:12:32.171946 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7kr6n\" (UniqueName: \"kubernetes.io/projected/d14b3c9f-d912-4f57-8df9-6b20338707e5-kube-api-access-7kr6n\") on node \"crc\" DevicePath \"\"" Feb 20 11:12:32 crc kubenswrapper[4962]: I0220 11:12:32.564877 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-bnx24" event={"ID":"d14b3c9f-d912-4f57-8df9-6b20338707e5","Type":"ContainerDied","Data":"719fe85849ea4372a7fe8171ec3eb50211cfeef7e8aee9dbfa2f94e6c8e96786"} Feb 20 11:12:32 crc kubenswrapper[4962]: I0220 11:12:32.564919 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="719fe85849ea4372a7fe8171ec3eb50211cfeef7e8aee9dbfa2f94e6c8e96786" Feb 20 11:12:32 crc kubenswrapper[4962]: I0220 11:12:32.564949 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-bnx24" Feb 20 11:12:35 crc kubenswrapper[4962]: I0220 11:12:35.060056 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-bnx24"] Feb 20 11:12:35 crc kubenswrapper[4962]: I0220 11:12:35.066668 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-bnx24"] Feb 20 11:12:35 crc kubenswrapper[4962]: I0220 11:12:35.151180 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d14b3c9f-d912-4f57-8df9-6b20338707e5" path="/var/lib/kubelet/pods/d14b3c9f-d912-4f57-8df9-6b20338707e5/volumes" Feb 20 11:12:40 crc kubenswrapper[4962]: I0220 11:12:40.078048 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-2mzpw"] Feb 20 11:12:40 crc kubenswrapper[4962]: E0220 11:12:40.079123 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d14b3c9f-d912-4f57-8df9-6b20338707e5" containerName="mariadb-account-create-update" Feb 20 11:12:40 crc kubenswrapper[4962]: I0220 11:12:40.079140 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="d14b3c9f-d912-4f57-8df9-6b20338707e5" containerName="mariadb-account-create-update" Feb 20 11:12:40 crc kubenswrapper[4962]: I0220 11:12:40.079312 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="d14b3c9f-d912-4f57-8df9-6b20338707e5" containerName="mariadb-account-create-update" Feb 20 11:12:40 crc kubenswrapper[4962]: I0220 11:12:40.079964 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-2mzpw" Feb 20 11:12:40 crc kubenswrapper[4962]: I0220 11:12:40.082859 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Feb 20 11:12:40 crc kubenswrapper[4962]: I0220 11:12:40.090638 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-2mzpw"] Feb 20 11:12:40 crc kubenswrapper[4962]: I0220 11:12:40.157953 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5msxx\" (UniqueName: \"kubernetes.io/projected/7ae4e019-31b7-4826-a5ef-042faba6034d-kube-api-access-5msxx\") pod \"root-account-create-update-2mzpw\" (UID: \"7ae4e019-31b7-4826-a5ef-042faba6034d\") " pod="openstack/root-account-create-update-2mzpw" Feb 20 11:12:40 crc kubenswrapper[4962]: I0220 11:12:40.158783 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ae4e019-31b7-4826-a5ef-042faba6034d-operator-scripts\") pod \"root-account-create-update-2mzpw\" (UID: \"7ae4e019-31b7-4826-a5ef-042faba6034d\") " pod="openstack/root-account-create-update-2mzpw" Feb 20 11:12:40 crc kubenswrapper[4962]: I0220 11:12:40.260580 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5msxx\" (UniqueName: \"kubernetes.io/projected/7ae4e019-31b7-4826-a5ef-042faba6034d-kube-api-access-5msxx\") pod \"root-account-create-update-2mzpw\" (UID: \"7ae4e019-31b7-4826-a5ef-042faba6034d\") " pod="openstack/root-account-create-update-2mzpw" Feb 20 11:12:40 crc kubenswrapper[4962]: I0220 11:12:40.260767 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ae4e019-31b7-4826-a5ef-042faba6034d-operator-scripts\") pod 
\"root-account-create-update-2mzpw\" (UID: \"7ae4e019-31b7-4826-a5ef-042faba6034d\") " pod="openstack/root-account-create-update-2mzpw" Feb 20 11:12:40 crc kubenswrapper[4962]: I0220 11:12:40.262057 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ae4e019-31b7-4826-a5ef-042faba6034d-operator-scripts\") pod \"root-account-create-update-2mzpw\" (UID: \"7ae4e019-31b7-4826-a5ef-042faba6034d\") " pod="openstack/root-account-create-update-2mzpw" Feb 20 11:12:40 crc kubenswrapper[4962]: I0220 11:12:40.297767 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5msxx\" (UniqueName: \"kubernetes.io/projected/7ae4e019-31b7-4826-a5ef-042faba6034d-kube-api-access-5msxx\") pod \"root-account-create-update-2mzpw\" (UID: \"7ae4e019-31b7-4826-a5ef-042faba6034d\") " pod="openstack/root-account-create-update-2mzpw" Feb 20 11:12:40 crc kubenswrapper[4962]: I0220 11:12:40.402857 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-2mzpw" Feb 20 11:12:40 crc kubenswrapper[4962]: I0220 11:12:40.936084 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-2mzpw"] Feb 20 11:12:40 crc kubenswrapper[4962]: W0220 11:12:40.945678 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7ae4e019_31b7_4826_a5ef_042faba6034d.slice/crio-203c8b2fcebe4a3406af7761b1fe51a8a1566ed7f565b6294a208e0b9e48a60f WatchSource:0}: Error finding container 203c8b2fcebe4a3406af7761b1fe51a8a1566ed7f565b6294a208e0b9e48a60f: Status 404 returned error can't find the container with id 203c8b2fcebe4a3406af7761b1fe51a8a1566ed7f565b6294a208e0b9e48a60f Feb 20 11:12:41 crc kubenswrapper[4962]: I0220 11:12:41.672165 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-2mzpw" event={"ID":"7ae4e019-31b7-4826-a5ef-042faba6034d","Type":"ContainerDied","Data":"8b9ab6691837647b0967e622ecfd0e62f3ce7907b1cef344ccbbdd1bcb192e5e"} Feb 20 11:12:41 crc kubenswrapper[4962]: I0220 11:12:41.672202 4962 generic.go:334] "Generic (PLEG): container finished" podID="7ae4e019-31b7-4826-a5ef-042faba6034d" containerID="8b9ab6691837647b0967e622ecfd0e62f3ce7907b1cef344ccbbdd1bcb192e5e" exitCode=0 Feb 20 11:12:41 crc kubenswrapper[4962]: I0220 11:12:41.672685 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-2mzpw" event={"ID":"7ae4e019-31b7-4826-a5ef-042faba6034d","Type":"ContainerStarted","Data":"203c8b2fcebe4a3406af7761b1fe51a8a1566ed7f565b6294a208e0b9e48a60f"} Feb 20 11:12:43 crc kubenswrapper[4962]: I0220 11:12:43.095965 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-2mzpw" Feb 20 11:12:43 crc kubenswrapper[4962]: I0220 11:12:43.229293 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5msxx\" (UniqueName: \"kubernetes.io/projected/7ae4e019-31b7-4826-a5ef-042faba6034d-kube-api-access-5msxx\") pod \"7ae4e019-31b7-4826-a5ef-042faba6034d\" (UID: \"7ae4e019-31b7-4826-a5ef-042faba6034d\") " Feb 20 11:12:43 crc kubenswrapper[4962]: I0220 11:12:43.229477 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ae4e019-31b7-4826-a5ef-042faba6034d-operator-scripts\") pod \"7ae4e019-31b7-4826-a5ef-042faba6034d\" (UID: \"7ae4e019-31b7-4826-a5ef-042faba6034d\") " Feb 20 11:12:43 crc kubenswrapper[4962]: I0220 11:12:43.230298 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ae4e019-31b7-4826-a5ef-042faba6034d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7ae4e019-31b7-4826-a5ef-042faba6034d" (UID: "7ae4e019-31b7-4826-a5ef-042faba6034d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 11:12:43 crc kubenswrapper[4962]: I0220 11:12:43.235820 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ae4e019-31b7-4826-a5ef-042faba6034d-kube-api-access-5msxx" (OuterVolumeSpecName: "kube-api-access-5msxx") pod "7ae4e019-31b7-4826-a5ef-042faba6034d" (UID: "7ae4e019-31b7-4826-a5ef-042faba6034d"). InnerVolumeSpecName "kube-api-access-5msxx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 11:12:43 crc kubenswrapper[4962]: I0220 11:12:43.331734 4962 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ae4e019-31b7-4826-a5ef-042faba6034d-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 11:12:43 crc kubenswrapper[4962]: I0220 11:12:43.331782 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5msxx\" (UniqueName: \"kubernetes.io/projected/7ae4e019-31b7-4826-a5ef-042faba6034d-kube-api-access-5msxx\") on node \"crc\" DevicePath \"\"" Feb 20 11:12:43 crc kubenswrapper[4962]: I0220 11:12:43.694419 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-2mzpw" event={"ID":"7ae4e019-31b7-4826-a5ef-042faba6034d","Type":"ContainerDied","Data":"203c8b2fcebe4a3406af7761b1fe51a8a1566ed7f565b6294a208e0b9e48a60f"} Feb 20 11:12:43 crc kubenswrapper[4962]: I0220 11:12:43.694468 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="203c8b2fcebe4a3406af7761b1fe51a8a1566ed7f565b6294a208e0b9e48a60f" Feb 20 11:12:43 crc kubenswrapper[4962]: I0220 11:12:43.694994 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-2mzpw" Feb 20 11:12:43 crc kubenswrapper[4962]: I0220 11:12:43.696974 4962 generic.go:334] "Generic (PLEG): container finished" podID="4f1374d6-d1c8-4b28-a524-485ced8ec7b9" containerID="ea699931b3b6d1154f7563dc0e7c455f7597120eb0a07bf4c657255d11998dcb" exitCode=0 Feb 20 11:12:43 crc kubenswrapper[4962]: I0220 11:12:43.697010 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4f1374d6-d1c8-4b28-a524-485ced8ec7b9","Type":"ContainerDied","Data":"ea699931b3b6d1154f7563dc0e7c455f7597120eb0a07bf4c657255d11998dcb"} Feb 20 11:12:44 crc kubenswrapper[4962]: I0220 11:12:44.707463 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4f1374d6-d1c8-4b28-a524-485ced8ec7b9","Type":"ContainerStarted","Data":"601e743ff0ed3c58a890cd7e80ea114736992d584eb37952d8b0e4d680f2e7e7"} Feb 20 11:12:44 crc kubenswrapper[4962]: I0220 11:12:44.708575 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Feb 20 11:12:44 crc kubenswrapper[4962]: I0220 11:12:44.733310 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.733285392 podStartE2EDuration="37.733285392s" podCreationTimestamp="2026-02-20 11:12:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 11:12:44.727221917 +0000 UTC m=+4656.309693763" watchObservedRunningTime="2026-02-20 11:12:44.733285392 +0000 UTC m=+4656.315757278" Feb 20 11:12:58 crc kubenswrapper[4962]: I0220 11:12:58.472909 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Feb 20 11:13:41 crc kubenswrapper[4962]: I0220 11:13:41.508667 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 11:13:41 crc kubenswrapper[4962]: I0220 11:13:41.510761 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 11:14:11 crc kubenswrapper[4962]: I0220 11:14:11.507884 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 11:14:11 crc kubenswrapper[4962]: I0220 11:14:11.508718 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 11:14:41 crc kubenswrapper[4962]: I0220 11:14:41.508472 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 11:14:41 crc kubenswrapper[4962]: I0220 11:14:41.509096 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 11:14:41 crc kubenswrapper[4962]: I0220 11:14:41.509152 4962 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" Feb 20 11:14:41 crc kubenswrapper[4962]: I0220 11:14:41.509786 4962 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"dd10955f5a138a9a68bbf20164340f26aec2bb4444e31423c39b6d847050ae26"} pod="openshift-machine-config-operator/machine-config-daemon-m9d46" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 11:14:41 crc kubenswrapper[4962]: I0220 11:14:41.509873 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" containerID="cri-o://dd10955f5a138a9a68bbf20164340f26aec2bb4444e31423c39b6d847050ae26" gracePeriod=600 Feb 20 11:14:41 crc kubenswrapper[4962]: I0220 11:14:41.922258 4962 generic.go:334] "Generic (PLEG): container finished" podID="751d5e0b-919c-4777-8475-ed7214f7647f" containerID="dd10955f5a138a9a68bbf20164340f26aec2bb4444e31423c39b6d847050ae26" exitCode=0 Feb 20 11:14:41 crc kubenswrapper[4962]: I0220 11:14:41.922338 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" event={"ID":"751d5e0b-919c-4777-8475-ed7214f7647f","Type":"ContainerDied","Data":"dd10955f5a138a9a68bbf20164340f26aec2bb4444e31423c39b6d847050ae26"} Feb 20 11:14:41 crc kubenswrapper[4962]: I0220 11:14:41.922696 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" event={"ID":"751d5e0b-919c-4777-8475-ed7214f7647f","Type":"ContainerStarted","Data":"11d1cd33fef2b28672ddeedcaa2789f4c1e5d261ff35b49d9ad217e63e8e6bae"} Feb 20 11:14:41 crc kubenswrapper[4962]: I0220 11:14:41.922732 4962 scope.go:117] "RemoveContainer" containerID="e9330f612fad194ce74c43dcd96cb17f9c60e761905737e1f0f1b3362edac70b" Feb 20 11:15:00 crc kubenswrapper[4962]: I0220 11:15:00.161950 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526435-dnbhn"] Feb 20 11:15:00 crc kubenswrapper[4962]: E0220 11:15:00.163266 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ae4e019-31b7-4826-a5ef-042faba6034d" containerName="mariadb-account-create-update" Feb 20 11:15:00 crc kubenswrapper[4962]: I0220 11:15:00.163298 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ae4e019-31b7-4826-a5ef-042faba6034d" containerName="mariadb-account-create-update" Feb 20 11:15:00 crc kubenswrapper[4962]: I0220 11:15:00.165130 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ae4e019-31b7-4826-a5ef-042faba6034d" 
containerName="mariadb-account-create-update" Feb 20 11:15:00 crc kubenswrapper[4962]: I0220 11:15:00.166795 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526435-dnbhn" Feb 20 11:15:00 crc kubenswrapper[4962]: I0220 11:15:00.170569 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 20 11:15:00 crc kubenswrapper[4962]: I0220 11:15:00.172558 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 20 11:15:00 crc kubenswrapper[4962]: I0220 11:15:00.193060 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526435-dnbhn"] Feb 20 11:15:00 crc kubenswrapper[4962]: I0220 11:15:00.283570 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dn5bc\" (UniqueName: \"kubernetes.io/projected/77a85f36-7811-4d2e-86d2-84c9a8aa1e54-kube-api-access-dn5bc\") pod \"collect-profiles-29526435-dnbhn\" (UID: \"77a85f36-7811-4d2e-86d2-84c9a8aa1e54\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526435-dnbhn" Feb 20 11:15:00 crc kubenswrapper[4962]: I0220 11:15:00.284100 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/77a85f36-7811-4d2e-86d2-84c9a8aa1e54-config-volume\") pod \"collect-profiles-29526435-dnbhn\" (UID: \"77a85f36-7811-4d2e-86d2-84c9a8aa1e54\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526435-dnbhn" Feb 20 11:15:00 crc kubenswrapper[4962]: I0220 11:15:00.284279 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/77a85f36-7811-4d2e-86d2-84c9a8aa1e54-secret-volume\") pod \"collect-profiles-29526435-dnbhn\" (UID: \"77a85f36-7811-4d2e-86d2-84c9a8aa1e54\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526435-dnbhn" Feb 20 11:15:00 crc kubenswrapper[4962]: I0220 11:15:00.385575 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dn5bc\" (UniqueName: \"kubernetes.io/projected/77a85f36-7811-4d2e-86d2-84c9a8aa1e54-kube-api-access-dn5bc\") pod \"collect-profiles-29526435-dnbhn\" (UID: \"77a85f36-7811-4d2e-86d2-84c9a8aa1e54\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526435-dnbhn" Feb 20 11:15:00 crc kubenswrapper[4962]: I0220 11:15:00.385733 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/77a85f36-7811-4d2e-86d2-84c9a8aa1e54-config-volume\") pod \"collect-profiles-29526435-dnbhn\" (UID: \"77a85f36-7811-4d2e-86d2-84c9a8aa1e54\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526435-dnbhn" Feb 20 11:15:00 crc kubenswrapper[4962]: I0220 11:15:00.385860 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/77a85f36-7811-4d2e-86d2-84c9a8aa1e54-secret-volume\") pod \"collect-profiles-29526435-dnbhn\" (UID: \"77a85f36-7811-4d2e-86d2-84c9a8aa1e54\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526435-dnbhn" Feb 20 11:15:00 crc kubenswrapper[4962]: I0220 11:15:00.387219 4962 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/77a85f36-7811-4d2e-86d2-84c9a8aa1e54-config-volume\") pod \"collect-profiles-29526435-dnbhn\" (UID: \"77a85f36-7811-4d2e-86d2-84c9a8aa1e54\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526435-dnbhn" Feb 20 11:15:00 crc kubenswrapper[4962]: I0220 11:15:00.395923 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/77a85f36-7811-4d2e-86d2-84c9a8aa1e54-secret-volume\") pod \"collect-profiles-29526435-dnbhn\" (UID: \"77a85f36-7811-4d2e-86d2-84c9a8aa1e54\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526435-dnbhn" Feb 20 11:15:00 crc kubenswrapper[4962]: I0220 11:15:00.405841 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dn5bc\" (UniqueName: \"kubernetes.io/projected/77a85f36-7811-4d2e-86d2-84c9a8aa1e54-kube-api-access-dn5bc\") pod \"collect-profiles-29526435-dnbhn\" (UID: \"77a85f36-7811-4d2e-86d2-84c9a8aa1e54\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526435-dnbhn" Feb 20 11:15:00 crc kubenswrapper[4962]: I0220 11:15:00.506294 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526435-dnbhn" Feb 20 11:15:00 crc kubenswrapper[4962]: I0220 11:15:00.779673 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526435-dnbhn"] Feb 20 11:15:01 crc kubenswrapper[4962]: I0220 11:15:01.090585 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526435-dnbhn" event={"ID":"77a85f36-7811-4d2e-86d2-84c9a8aa1e54","Type":"ContainerStarted","Data":"03c92e9b0e9071dee5a479d5565338cdd2b211bcbc7dfe472fff5850387bc236"} Feb 20 11:15:01 crc kubenswrapper[4962]: I0220 11:15:01.091014 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526435-dnbhn" event={"ID":"77a85f36-7811-4d2e-86d2-84c9a8aa1e54","Type":"ContainerStarted","Data":"b4791a388c03bb6e76b408164a65c6e1448fd3bc7edcd44a80b231d64bffb4fa"} Feb 20 11:15:02 crc kubenswrapper[4962]: I0220 11:15:02.112960 4962 generic.go:334] "Generic (PLEG): container finished" podID="77a85f36-7811-4d2e-86d2-84c9a8aa1e54" containerID="03c92e9b0e9071dee5a479d5565338cdd2b211bcbc7dfe472fff5850387bc236" exitCode=0 Feb 20 11:15:02 crc kubenswrapper[4962]: I0220 11:15:02.113034 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526435-dnbhn" event={"ID":"77a85f36-7811-4d2e-86d2-84c9a8aa1e54","Type":"ContainerDied","Data":"03c92e9b0e9071dee5a479d5565338cdd2b211bcbc7dfe472fff5850387bc236"} Feb 20 11:15:03 crc kubenswrapper[4962]: I0220 11:15:03.484167 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526435-dnbhn" Feb 20 11:15:03 crc kubenswrapper[4962]: I0220 11:15:03.637133 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/77a85f36-7811-4d2e-86d2-84c9a8aa1e54-secret-volume\") pod \"77a85f36-7811-4d2e-86d2-84c9a8aa1e54\" (UID: \"77a85f36-7811-4d2e-86d2-84c9a8aa1e54\") " Feb 20 11:15:03 crc kubenswrapper[4962]: I0220 11:15:03.637343 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/77a85f36-7811-4d2e-86d2-84c9a8aa1e54-config-volume\") pod \"77a85f36-7811-4d2e-86d2-84c9a8aa1e54\" (UID: \"77a85f36-7811-4d2e-86d2-84c9a8aa1e54\") " Feb 20 11:15:03 crc kubenswrapper[4962]: I0220 11:15:03.637468 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dn5bc\" (UniqueName: \"kubernetes.io/projected/77a85f36-7811-4d2e-86d2-84c9a8aa1e54-kube-api-access-dn5bc\") pod \"77a85f36-7811-4d2e-86d2-84c9a8aa1e54\" (UID: \"77a85f36-7811-4d2e-86d2-84c9a8aa1e54\") " Feb 20 11:15:03 crc kubenswrapper[4962]: I0220 11:15:03.638382 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77a85f36-7811-4d2e-86d2-84c9a8aa1e54-config-volume" (OuterVolumeSpecName: "config-volume") pod "77a85f36-7811-4d2e-86d2-84c9a8aa1e54" (UID: "77a85f36-7811-4d2e-86d2-84c9a8aa1e54"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 11:15:03 crc kubenswrapper[4962]: I0220 11:15:03.643758 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77a85f36-7811-4d2e-86d2-84c9a8aa1e54-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "77a85f36-7811-4d2e-86d2-84c9a8aa1e54" (UID: "77a85f36-7811-4d2e-86d2-84c9a8aa1e54"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 11:15:03 crc kubenswrapper[4962]: I0220 11:15:03.645622 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77a85f36-7811-4d2e-86d2-84c9a8aa1e54-kube-api-access-dn5bc" (OuterVolumeSpecName: "kube-api-access-dn5bc") pod "77a85f36-7811-4d2e-86d2-84c9a8aa1e54" (UID: "77a85f36-7811-4d2e-86d2-84c9a8aa1e54"). InnerVolumeSpecName "kube-api-access-dn5bc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 11:15:03 crc kubenswrapper[4962]: I0220 11:15:03.740016 4962 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/77a85f36-7811-4d2e-86d2-84c9a8aa1e54-config-volume\") on node \"crc\" DevicePath \"\"" Feb 20 11:15:03 crc kubenswrapper[4962]: I0220 11:15:03.740072 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dn5bc\" (UniqueName: \"kubernetes.io/projected/77a85f36-7811-4d2e-86d2-84c9a8aa1e54-kube-api-access-dn5bc\") on node \"crc\" DevicePath \"\"" Feb 20 11:15:03 crc kubenswrapper[4962]: I0220 11:15:03.740092 4962 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/77a85f36-7811-4d2e-86d2-84c9a8aa1e54-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 20 11:15:04 crc kubenswrapper[4962]: I0220 11:15:04.136149 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526435-dnbhn" event={"ID":"77a85f36-7811-4d2e-86d2-84c9a8aa1e54","Type":"ContainerDied","Data":"b4791a388c03bb6e76b408164a65c6e1448fd3bc7edcd44a80b231d64bffb4fa"} Feb 20 11:15:04 crc kubenswrapper[4962]: I0220 11:15:04.136193 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b4791a388c03bb6e76b408164a65c6e1448fd3bc7edcd44a80b231d64bffb4fa" Feb 20 11:15:04 crc kubenswrapper[4962]: I0220 11:15:04.136278 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526435-dnbhn" Feb 20 11:15:04 crc kubenswrapper[4962]: I0220 11:15:04.597567 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526390-2v7sg"] Feb 20 11:15:04 crc kubenswrapper[4962]: I0220 11:15:04.606322 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526390-2v7sg"] Feb 20 11:15:05 crc kubenswrapper[4962]: I0220 11:15:05.157555 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a57c99f-e682-43fc-85be-d6ca9b32dd2e" path="/var/lib/kubelet/pods/6a57c99f-e682-43fc-85be-d6ca9b32dd2e/volumes" Feb 20 11:15:17 crc kubenswrapper[4962]: I0220 11:15:17.254751 4962 scope.go:117] "RemoveContainer" containerID="672fb8e4a3790f1f70ac1c9ed16383d55019a5b81bdd7e7049f12caa51ab0535" Feb 20 11:15:17 crc kubenswrapper[4962]: I0220 11:15:17.303023 4962 scope.go:117] "RemoveContainer" containerID="0fe78591e142b00ce0a1305d692098cbd58316c27a1f913e16fe16879c83db51" Feb 20 11:16:00 crc kubenswrapper[4962]: I0220 11:16:00.621407 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-c98x6"] Feb 20 11:16:00 crc kubenswrapper[4962]: E0220 11:16:00.622688 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77a85f36-7811-4d2e-86d2-84c9a8aa1e54" containerName="collect-profiles" Feb 20 11:16:00 crc kubenswrapper[4962]: I0220 11:16:00.622713 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="77a85f36-7811-4d2e-86d2-84c9a8aa1e54" containerName="collect-profiles" Feb 20 11:16:00 crc kubenswrapper[4962]: I0220 11:16:00.622971 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="77a85f36-7811-4d2e-86d2-84c9a8aa1e54" containerName="collect-profiles" Feb 20 11:16:00 crc kubenswrapper[4962]: I0220 11:16:00.625028 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-c98x6" Feb 20 11:16:00 crc kubenswrapper[4962]: I0220 11:16:00.633895 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-c98x6"] Feb 20 11:16:00 crc kubenswrapper[4962]: I0220 11:16:00.791724 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hb7wj\" (UniqueName: \"kubernetes.io/projected/f80e860a-4910-4de7-9daf-3ecf4808b002-kube-api-access-hb7wj\") pod \"certified-operators-c98x6\" (UID: \"f80e860a-4910-4de7-9daf-3ecf4808b002\") " pod="openshift-marketplace/certified-operators-c98x6" Feb 20 11:16:00 crc kubenswrapper[4962]: I0220 11:16:00.791835 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f80e860a-4910-4de7-9daf-3ecf4808b002-utilities\") pod \"certified-operators-c98x6\" (UID: \"f80e860a-4910-4de7-9daf-3ecf4808b002\") " pod="openshift-marketplace/certified-operators-c98x6" Feb 20 11:16:00 crc kubenswrapper[4962]: I0220 11:16:00.791883 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f80e860a-4910-4de7-9daf-3ecf4808b002-catalog-content\") pod \"certified-operators-c98x6\" (UID: \"f80e860a-4910-4de7-9daf-3ecf4808b002\") " pod="openshift-marketplace/certified-operators-c98x6" Feb 20 11:16:00 crc kubenswrapper[4962]: I0220 11:16:00.893182 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hb7wj\" (UniqueName: \"kubernetes.io/projected/f80e860a-4910-4de7-9daf-3ecf4808b002-kube-api-access-hb7wj\") pod \"certified-operators-c98x6\" (UID: \"f80e860a-4910-4de7-9daf-3ecf4808b002\") " pod="openshift-marketplace/certified-operators-c98x6" Feb 20 11:16:00 crc kubenswrapper[4962]: I0220 11:16:00.893496 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f80e860a-4910-4de7-9daf-3ecf4808b002-utilities\") pod \"certified-operators-c98x6\" (UID: \"f80e860a-4910-4de7-9daf-3ecf4808b002\") " pod="openshift-marketplace/certified-operators-c98x6" Feb 20 11:16:00 crc kubenswrapper[4962]: I0220 11:16:00.893582 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f80e860a-4910-4de7-9daf-3ecf4808b002-catalog-content\") pod \"certified-operators-c98x6\" (UID: \"f80e860a-4910-4de7-9daf-3ecf4808b002\") " pod="openshift-marketplace/certified-operators-c98x6" Feb 20 11:16:00 crc kubenswrapper[4962]: I0220 11:16:00.894084 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f80e860a-4910-4de7-9daf-3ecf4808b002-catalog-content\") pod \"certified-operators-c98x6\" (UID: \"f80e860a-4910-4de7-9daf-3ecf4808b002\") " pod="openshift-marketplace/certified-operators-c98x6" Feb 20 11:16:00 crc kubenswrapper[4962]: I0220 11:16:00.896216 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f80e860a-4910-4de7-9daf-3ecf4808b002-utilities\") pod \"certified-operators-c98x6\" (UID: \"f80e860a-4910-4de7-9daf-3ecf4808b002\") " pod="openshift-marketplace/certified-operators-c98x6" Feb 20 11:16:00 crc kubenswrapper[4962]: I0220 11:16:00.926428 4962 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-hb7wj\" (UniqueName: \"kubernetes.io/projected/f80e860a-4910-4de7-9daf-3ecf4808b002-kube-api-access-hb7wj\") pod \"certified-operators-c98x6\" (UID: \"f80e860a-4910-4de7-9daf-3ecf4808b002\") " pod="openshift-marketplace/certified-operators-c98x6" Feb 20 11:16:00 crc kubenswrapper[4962]: I0220 11:16:00.971621 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-c98x6" Feb 20 11:16:01 crc kubenswrapper[4962]: I0220 11:16:01.428336 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-c98x6"] Feb 20 11:16:01 crc kubenswrapper[4962]: I0220 11:16:01.701995 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c98x6" event={"ID":"f80e860a-4910-4de7-9daf-3ecf4808b002","Type":"ContainerStarted","Data":"ea600c71e0345a8043751e8b826aee7f560702337e800b5defa9d6a7fb10b53f"} Feb 20 11:16:02 crc kubenswrapper[4962]: I0220 11:16:02.712494 4962 generic.go:334] "Generic (PLEG): container finished" podID="f80e860a-4910-4de7-9daf-3ecf4808b002" containerID="b4d681ba38ab243d90aebb0cd3f5e0d964a3b05eff1c6a9189b417f8bc499f51" exitCode=0 Feb 20 11:16:02 crc kubenswrapper[4962]: I0220 11:16:02.712677 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c98x6" event={"ID":"f80e860a-4910-4de7-9daf-3ecf4808b002","Type":"ContainerDied","Data":"b4d681ba38ab243d90aebb0cd3f5e0d964a3b05eff1c6a9189b417f8bc499f51"} Feb 20 11:16:03 crc kubenswrapper[4962]: I0220 11:16:03.727958 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c98x6" event={"ID":"f80e860a-4910-4de7-9daf-3ecf4808b002","Type":"ContainerStarted","Data":"cc4df7336e0c93c42160fd50ab2c566dcfda96d76ab5ecee6e26256c4e0e35c7"} Feb 20 11:16:04 crc kubenswrapper[4962]: I0220 11:16:04.741811 4962 generic.go:334] "Generic (PLEG): container finished" podID="f80e860a-4910-4de7-9daf-3ecf4808b002" containerID="cc4df7336e0c93c42160fd50ab2c566dcfda96d76ab5ecee6e26256c4e0e35c7" exitCode=0 Feb 20 11:16:04 crc kubenswrapper[4962]: I0220 11:16:04.741934 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c98x6" event={"ID":"f80e860a-4910-4de7-9daf-3ecf4808b002","Type":"ContainerDied","Data":"cc4df7336e0c93c42160fd50ab2c566dcfda96d76ab5ecee6e26256c4e0e35c7"} Feb 20 11:16:05 crc kubenswrapper[4962]: I0220 11:16:05.757718 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c98x6" event={"ID":"f80e860a-4910-4de7-9daf-3ecf4808b002","Type":"ContainerStarted","Data":"1c894dfd10e3ea0973c4a9f38552b1c9dae05591935995fa639e6204d2604dcb"} Feb 20 11:16:05 crc kubenswrapper[4962]: I0220 11:16:05.792078 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-c98x6" podStartSLOduration=3.283329453 podStartE2EDuration="5.792052917s" podCreationTimestamp="2026-02-20 11:16:00 +0000 UTC" firstStartedPulling="2026-02-20 11:16:02.714464458 +0000 UTC m=+4854.296936314" lastFinishedPulling="2026-02-20 11:16:05.223187902 +0000 UTC m=+4856.805659778" observedRunningTime="2026-02-20 11:16:05.787823218 +0000 UTC m=+4857.370295104" watchObservedRunningTime="2026-02-20 11:16:05.792052917 +0000 UTC m=+4857.374524803" Feb 20 11:16:10 crc kubenswrapper[4962]: I0220 11:16:10.972844 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/certified-operators-c98x6" Feb 20 11:16:10 crc kubenswrapper[4962]: I0220 11:16:10.973481 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-c98x6" Feb 20 11:16:11 crc kubenswrapper[4962]: I0220 11:16:11.056948 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-c98x6" Feb 20 11:16:11 crc kubenswrapper[4962]: I0220 11:16:11.893743 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-c98x6" Feb 20 11:16:11 crc kubenswrapper[4962]: I0220 11:16:11.940388 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-c98x6"] Feb 20 11:16:13 crc kubenswrapper[4962]: I0220 11:16:13.843312 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-c98x6" podUID="f80e860a-4910-4de7-9daf-3ecf4808b002" containerName="registry-server" containerID="cri-o://1c894dfd10e3ea0973c4a9f38552b1c9dae05591935995fa639e6204d2604dcb" gracePeriod=2 Feb 20 11:16:14 crc kubenswrapper[4962]: I0220 11:16:14.855342 4962 generic.go:334] "Generic (PLEG): container finished" podID="f80e860a-4910-4de7-9daf-3ecf4808b002" containerID="1c894dfd10e3ea0973c4a9f38552b1c9dae05591935995fa639e6204d2604dcb" exitCode=0 Feb 20 11:16:14 crc kubenswrapper[4962]: I0220 11:16:14.855910 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c98x6" event={"ID":"f80e860a-4910-4de7-9daf-3ecf4808b002","Type":"ContainerDied","Data":"1c894dfd10e3ea0973c4a9f38552b1c9dae05591935995fa639e6204d2604dcb"} Feb 20 11:16:14 crc kubenswrapper[4962]: I0220 11:16:14.857004 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c98x6" event={"ID":"f80e860a-4910-4de7-9daf-3ecf4808b002","Type":"ContainerDied","Data":"ea600c71e0345a8043751e8b826aee7f560702337e800b5defa9d6a7fb10b53f"} Feb 20 11:16:14 crc kubenswrapper[4962]: I0220 11:16:14.857078 4962 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea600c71e0345a8043751e8b826aee7f560702337e800b5defa9d6a7fb10b53f" Feb 20 11:16:14 crc kubenswrapper[4962]: I0220 11:16:14.902942 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-c98x6" Feb 20 11:16:15 crc kubenswrapper[4962]: I0220 11:16:15.046501 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f80e860a-4910-4de7-9daf-3ecf4808b002-utilities\") pod \"f80e860a-4910-4de7-9daf-3ecf4808b002\" (UID: \"f80e860a-4910-4de7-9daf-3ecf4808b002\") " Feb 20 11:16:15 crc kubenswrapper[4962]: I0220 11:16:15.046685 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hb7wj\" (UniqueName: \"kubernetes.io/projected/f80e860a-4910-4de7-9daf-3ecf4808b002-kube-api-access-hb7wj\") pod \"f80e860a-4910-4de7-9daf-3ecf4808b002\" (UID: \"f80e860a-4910-4de7-9daf-3ecf4808b002\") " Feb 20 11:16:15 crc kubenswrapper[4962]: I0220 11:16:15.046716 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f80e860a-4910-4de7-9daf-3ecf4808b002-catalog-content\") pod \"f80e860a-4910-4de7-9daf-3ecf4808b002\" (UID: \"f80e860a-4910-4de7-9daf-3ecf4808b002\") " Feb 20 11:16:15 crc kubenswrapper[4962]: I0220 11:16:15.047931 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f80e860a-4910-4de7-9daf-3ecf4808b002-utilities" (OuterVolumeSpecName: "utilities") pod "f80e860a-4910-4de7-9daf-3ecf4808b002" (UID: "f80e860a-4910-4de7-9daf-3ecf4808b002"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 11:16:15 crc kubenswrapper[4962]: I0220 11:16:15.055152 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f80e860a-4910-4de7-9daf-3ecf4808b002-kube-api-access-hb7wj" (OuterVolumeSpecName: "kube-api-access-hb7wj") pod "f80e860a-4910-4de7-9daf-3ecf4808b002" (UID: "f80e860a-4910-4de7-9daf-3ecf4808b002"). InnerVolumeSpecName "kube-api-access-hb7wj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 11:16:15 crc kubenswrapper[4962]: I0220 11:16:15.136880 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f80e860a-4910-4de7-9daf-3ecf4808b002-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f80e860a-4910-4de7-9daf-3ecf4808b002" (UID: "f80e860a-4910-4de7-9daf-3ecf4808b002"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 11:16:15 crc kubenswrapper[4962]: I0220 11:16:15.148883 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hb7wj\" (UniqueName: \"kubernetes.io/projected/f80e860a-4910-4de7-9daf-3ecf4808b002-kube-api-access-hb7wj\") on node \"crc\" DevicePath \"\"" Feb 20 11:16:15 crc kubenswrapper[4962]: I0220 11:16:15.148951 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f80e860a-4910-4de7-9daf-3ecf4808b002-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 11:16:15 crc kubenswrapper[4962]: I0220 11:16:15.148977 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f80e860a-4910-4de7-9daf-3ecf4808b002-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 11:16:15 crc kubenswrapper[4962]: I0220 11:16:15.867858 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-c98x6" Feb 20 11:16:15 crc kubenswrapper[4962]: I0220 11:16:15.933673 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-c98x6"] Feb 20 11:16:15 crc kubenswrapper[4962]: I0220 11:16:15.946069 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-c98x6"] Feb 20 11:16:17 crc kubenswrapper[4962]: I0220 11:16:17.151912 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f80e860a-4910-4de7-9daf-3ecf4808b002" path="/var/lib/kubelet/pods/f80e860a-4910-4de7-9daf-3ecf4808b002/volumes" Feb 20 11:16:41 crc kubenswrapper[4962]: I0220 11:16:41.508014 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 11:16:41 crc kubenswrapper[4962]: I0220 11:16:41.508725 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 11:17:11 crc kubenswrapper[4962]: I0220 11:17:11.508478 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 11:17:11 crc kubenswrapper[4962]: I0220 11:17:11.509216 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 11:17:41 crc kubenswrapper[4962]: I0220 11:17:41.508138 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 11:17:41 crc kubenswrapper[4962]: I0220 11:17:41.508953 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 11:17:41 crc kubenswrapper[4962]: I0220 11:17:41.509021 4962 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" Feb 20 11:17:41 crc kubenswrapper[4962]: I0220 11:17:41.509962 4962 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"11d1cd33fef2b28672ddeedcaa2789f4c1e5d261ff35b49d9ad217e63e8e6bae"} pod="openshift-machine-config-operator/machine-config-daemon-m9d46" containerMessage="Container machine-config-daemon failed liveness probe, will be 
restarted" Feb 20 11:17:41 crc kubenswrapper[4962]: I0220 11:17:41.510085 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" containerID="cri-o://11d1cd33fef2b28672ddeedcaa2789f4c1e5d261ff35b49d9ad217e63e8e6bae" gracePeriod=600 Feb 20 11:17:41 crc kubenswrapper[4962]: E0220 11:17:41.641461 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 11:17:41 crc kubenswrapper[4962]: I0220 11:17:41.675902 4962 generic.go:334] "Generic (PLEG): container finished" podID="751d5e0b-919c-4777-8475-ed7214f7647f" containerID="11d1cd33fef2b28672ddeedcaa2789f4c1e5d261ff35b49d9ad217e63e8e6bae" exitCode=0 Feb 20 11:17:41 crc kubenswrapper[4962]: I0220 11:17:41.675980 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" event={"ID":"751d5e0b-919c-4777-8475-ed7214f7647f","Type":"ContainerDied","Data":"11d1cd33fef2b28672ddeedcaa2789f4c1e5d261ff35b49d9ad217e63e8e6bae"} Feb 20 11:17:41 crc kubenswrapper[4962]: I0220 11:17:41.676033 4962 scope.go:117] "RemoveContainer" containerID="dd10955f5a138a9a68bbf20164340f26aec2bb4444e31423c39b6d847050ae26" Feb 20 11:17:41 crc kubenswrapper[4962]: I0220 11:17:41.676796 4962 scope.go:117] "RemoveContainer" containerID="11d1cd33fef2b28672ddeedcaa2789f4c1e5d261ff35b49d9ad217e63e8e6bae" Feb 20 11:17:41 crc kubenswrapper[4962]: E0220 11:17:41.677173 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 11:17:52 crc kubenswrapper[4962]: I0220 11:17:52.138645 4962 scope.go:117] "RemoveContainer" containerID="11d1cd33fef2b28672ddeedcaa2789f4c1e5d261ff35b49d9ad217e63e8e6bae" Feb 20 11:17:52 crc kubenswrapper[4962]: E0220 11:17:52.139320 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 11:17:53 crc kubenswrapper[4962]: I0220 11:17:53.882671 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8zmzk"] Feb 20 11:17:53 crc kubenswrapper[4962]: E0220 11:17:53.883421 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f80e860a-4910-4de7-9daf-3ecf4808b002" containerName="extract-content" Feb 20 11:17:53 crc kubenswrapper[4962]: I0220 11:17:53.883443 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="f80e860a-4910-4de7-9daf-3ecf4808b002" 
containerName="extract-content" Feb 20 11:17:53 crc kubenswrapper[4962]: E0220 11:17:53.883648 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f80e860a-4910-4de7-9daf-3ecf4808b002" containerName="registry-server" Feb 20 11:17:53 crc kubenswrapper[4962]: I0220 11:17:53.883717 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="f80e860a-4910-4de7-9daf-3ecf4808b002" containerName="registry-server" Feb 20 11:17:53 crc kubenswrapper[4962]: E0220 11:17:53.883780 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f80e860a-4910-4de7-9daf-3ecf4808b002" containerName="extract-utilities" Feb 20 11:17:53 crc kubenswrapper[4962]: I0220 11:17:53.883795 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="f80e860a-4910-4de7-9daf-3ecf4808b002" containerName="extract-utilities" Feb 20 11:17:53 crc kubenswrapper[4962]: I0220 11:17:53.884044 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="f80e860a-4910-4de7-9daf-3ecf4808b002" containerName="registry-server" Feb 20 11:17:53 crc kubenswrapper[4962]: I0220 11:17:53.885789 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8zmzk" Feb 20 11:17:53 crc kubenswrapper[4962]: I0220 11:17:53.897354 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8zmzk"] Feb 20 11:17:53 crc kubenswrapper[4962]: I0220 11:17:53.966223 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgvqg\" (UniqueName: \"kubernetes.io/projected/839b832b-5799-4bcf-b028-f6d138668d44-kube-api-access-rgvqg\") pod \"redhat-marketplace-8zmzk\" (UID: \"839b832b-5799-4bcf-b028-f6d138668d44\") " pod="openshift-marketplace/redhat-marketplace-8zmzk" Feb 20 11:17:53 crc kubenswrapper[4962]: I0220 11:17:53.966283 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/839b832b-5799-4bcf-b028-f6d138668d44-utilities\") pod \"redhat-marketplace-8zmzk\" (UID: \"839b832b-5799-4bcf-b028-f6d138668d44\") " pod="openshift-marketplace/redhat-marketplace-8zmzk" Feb 20 11:17:53 crc kubenswrapper[4962]: I0220 11:17:53.966655 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/839b832b-5799-4bcf-b028-f6d138668d44-catalog-content\") pod \"redhat-marketplace-8zmzk\" (UID: \"839b832b-5799-4bcf-b028-f6d138668d44\") " pod="openshift-marketplace/redhat-marketplace-8zmzk" Feb 20 11:17:54 crc kubenswrapper[4962]: I0220 11:17:54.067887 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/839b832b-5799-4bcf-b028-f6d138668d44-catalog-content\") pod \"redhat-marketplace-8zmzk\" (UID: \"839b832b-5799-4bcf-b028-f6d138668d44\") " pod="openshift-marketplace/redhat-marketplace-8zmzk" Feb 20 11:17:54 crc kubenswrapper[4962]: I0220 11:17:54.067942 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgvqg\" (UniqueName: \"kubernetes.io/projected/839b832b-5799-4bcf-b028-f6d138668d44-kube-api-access-rgvqg\") pod \"redhat-marketplace-8zmzk\" (UID: \"839b832b-5799-4bcf-b028-f6d138668d44\") " pod="openshift-marketplace/redhat-marketplace-8zmzk" Feb 20 11:17:54 crc kubenswrapper[4962]: I0220 11:17:54.067964 4962 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/839b832b-5799-4bcf-b028-f6d138668d44-utilities\") pod \"redhat-marketplace-8zmzk\" (UID: \"839b832b-5799-4bcf-b028-f6d138668d44\") " pod="openshift-marketplace/redhat-marketplace-8zmzk" Feb 20 11:17:54 crc kubenswrapper[4962]: I0220 11:17:54.068416 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/839b832b-5799-4bcf-b028-f6d138668d44-utilities\") pod \"redhat-marketplace-8zmzk\" (UID: \"839b832b-5799-4bcf-b028-f6d138668d44\") " pod="openshift-marketplace/redhat-marketplace-8zmzk" Feb 20 11:17:54 crc kubenswrapper[4962]: I0220 11:17:54.068896 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/839b832b-5799-4bcf-b028-f6d138668d44-catalog-content\") pod \"redhat-marketplace-8zmzk\" (UID: \"839b832b-5799-4bcf-b028-f6d138668d44\") " pod="openshift-marketplace/redhat-marketplace-8zmzk" Feb 20 11:17:54 crc kubenswrapper[4962]: I0220 11:17:54.101102 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgvqg\" (UniqueName: \"kubernetes.io/projected/839b832b-5799-4bcf-b028-f6d138668d44-kube-api-access-rgvqg\") pod \"redhat-marketplace-8zmzk\" (UID: \"839b832b-5799-4bcf-b028-f6d138668d44\") " pod="openshift-marketplace/redhat-marketplace-8zmzk" Feb 20 11:17:54 crc kubenswrapper[4962]: I0220 11:17:54.210862 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8zmzk" Feb 20 11:17:54 crc kubenswrapper[4962]: I0220 11:17:54.757052 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8zmzk"] Feb 20 11:17:55 crc kubenswrapper[4962]: W0220 11:17:55.149818 4962 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod839b832b_5799_4bcf_b028_f6d138668d44.slice/crio-45aebe52af9db615e922a96f885c6199b792863852c9fdc58224e9b5a8d28b61 WatchSource:0}: Error finding container 45aebe52af9db615e922a96f885c6199b792863852c9fdc58224e9b5a8d28b61: Status 404 returned error can't find the container with id 45aebe52af9db615e922a96f885c6199b792863852c9fdc58224e9b5a8d28b61 Feb 20 11:17:55 crc kubenswrapper[4962]: I0220 11:17:55.821466 4962 generic.go:334] "Generic (PLEG): container finished" podID="839b832b-5799-4bcf-b028-f6d138668d44" containerID="d02121a7865602e447d9672ca12527892a8b4df01784c7fe6472f665dc92d541" exitCode=0 Feb 20 11:17:55 crc kubenswrapper[4962]: I0220 11:17:55.821890 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8zmzk" event={"ID":"839b832b-5799-4bcf-b028-f6d138668d44","Type":"ContainerDied","Data":"d02121a7865602e447d9672ca12527892a8b4df01784c7fe6472f665dc92d541"} Feb 20 11:17:55 crc kubenswrapper[4962]: I0220 11:17:55.821928 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8zmzk" event={"ID":"839b832b-5799-4bcf-b028-f6d138668d44","Type":"ContainerStarted","Data":"45aebe52af9db615e922a96f885c6199b792863852c9fdc58224e9b5a8d28b61"} Feb 20 11:17:55 crc kubenswrapper[4962]: I0220 11:17:55.823867 4962 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 20 11:17:56 crc kubenswrapper[4962]: I0220 11:17:56.837804 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-8zmzk" event={"ID":"839b832b-5799-4bcf-b028-f6d138668d44","Type":"ContainerStarted","Data":"b6401eca9c5ccd1616136765494ee1804d00cfe7d4b24f779b2d66665195c8ff"} Feb 20 11:17:57 crc kubenswrapper[4962]: I0220 11:17:57.850667 4962 generic.go:334] "Generic (PLEG): container finished" podID="839b832b-5799-4bcf-b028-f6d138668d44" containerID="b6401eca9c5ccd1616136765494ee1804d00cfe7d4b24f779b2d66665195c8ff" exitCode=0 Feb 20 11:17:57 crc kubenswrapper[4962]: I0220 11:17:57.850740 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8zmzk" event={"ID":"839b832b-5799-4bcf-b028-f6d138668d44","Type":"ContainerDied","Data":"b6401eca9c5ccd1616136765494ee1804d00cfe7d4b24f779b2d66665195c8ff"} Feb 20 11:17:58 crc kubenswrapper[4962]: I0220 11:17:58.861454 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8zmzk" event={"ID":"839b832b-5799-4bcf-b028-f6d138668d44","Type":"ContainerStarted","Data":"ba978c4b17a448437f2e3d666913798190e79acd9116a9903a0378029be2493e"} Feb 20 11:17:58 crc kubenswrapper[4962]: I0220 11:17:58.900510 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8zmzk" podStartSLOduration=3.452509998 podStartE2EDuration="5.900480737s" podCreationTimestamp="2026-02-20 11:17:53 +0000 UTC" firstStartedPulling="2026-02-20 11:17:55.823480165 +0000 UTC m=+4967.405952041" lastFinishedPulling="2026-02-20 11:17:58.271450894 +0000 UTC m=+4969.853922780" observedRunningTime="2026-02-20 11:17:58.89399599 +0000 UTC m=+4970.476467886" watchObservedRunningTime="2026-02-20 11:17:58.900480737 +0000 UTC m=+4970.482952593" Feb 20 11:18:04 crc kubenswrapper[4962]: I0220 11:18:04.211236 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8zmzk" Feb 20 11:18:04 crc kubenswrapper[4962]: I0220 11:18:04.212466 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8zmzk" Feb 20 11:18:04 crc kubenswrapper[4962]: I0220 11:18:04.287013 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8zmzk" Feb 20 11:18:05 crc kubenswrapper[4962]: I0220 11:18:05.010910 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8zmzk" Feb 20 11:18:05 crc kubenswrapper[4962]: I0220 11:18:05.091166 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8zmzk"] Feb 20 11:18:05 crc kubenswrapper[4962]: I0220 11:18:05.139935 4962 scope.go:117] "RemoveContainer" containerID="11d1cd33fef2b28672ddeedcaa2789f4c1e5d261ff35b49d9ad217e63e8e6bae" Feb 20 11:18:05 crc kubenswrapper[4962]: E0220 11:18:05.140342 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 11:18:06 crc kubenswrapper[4962]: I0220 11:18:06.950303 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8zmzk" 
podUID="839b832b-5799-4bcf-b028-f6d138668d44" containerName="registry-server" containerID="cri-o://ba978c4b17a448437f2e3d666913798190e79acd9116a9903a0378029be2493e" gracePeriod=2 Feb 20 11:18:07 crc kubenswrapper[4962]: I0220 11:18:07.964026 4962 generic.go:334] "Generic (PLEG): container finished" podID="839b832b-5799-4bcf-b028-f6d138668d44" containerID="ba978c4b17a448437f2e3d666913798190e79acd9116a9903a0378029be2493e" exitCode=0 Feb 20 11:18:07 crc kubenswrapper[4962]: I0220 11:18:07.964094 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8zmzk" event={"ID":"839b832b-5799-4bcf-b028-f6d138668d44","Type":"ContainerDied","Data":"ba978c4b17a448437f2e3d666913798190e79acd9116a9903a0378029be2493e"} Feb 20 11:18:08 crc kubenswrapper[4962]: I0220 11:18:08.560021 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8zmzk" Feb 20 11:18:08 crc kubenswrapper[4962]: I0220 11:18:08.728077 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/839b832b-5799-4bcf-b028-f6d138668d44-utilities\") pod \"839b832b-5799-4bcf-b028-f6d138668d44\" (UID: \"839b832b-5799-4bcf-b028-f6d138668d44\") " Feb 20 11:18:08 crc kubenswrapper[4962]: I0220 11:18:08.728159 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/839b832b-5799-4bcf-b028-f6d138668d44-catalog-content\") pod \"839b832b-5799-4bcf-b028-f6d138668d44\" (UID: \"839b832b-5799-4bcf-b028-f6d138668d44\") " Feb 20 11:18:08 crc kubenswrapper[4962]: I0220 11:18:08.728260 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rgvqg\" (UniqueName: \"kubernetes.io/projected/839b832b-5799-4bcf-b028-f6d138668d44-kube-api-access-rgvqg\") pod \"839b832b-5799-4bcf-b028-f6d138668d44\" (UID: \"839b832b-5799-4bcf-b028-f6d138668d44\") " Feb 20 11:18:08 crc kubenswrapper[4962]: I0220 11:18:08.729738 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/839b832b-5799-4bcf-b028-f6d138668d44-utilities" (OuterVolumeSpecName: "utilities") pod "839b832b-5799-4bcf-b028-f6d138668d44" (UID: "839b832b-5799-4bcf-b028-f6d138668d44"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 11:18:08 crc kubenswrapper[4962]: I0220 11:18:08.737354 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/839b832b-5799-4bcf-b028-f6d138668d44-kube-api-access-rgvqg" (OuterVolumeSpecName: "kube-api-access-rgvqg") pod "839b832b-5799-4bcf-b028-f6d138668d44" (UID: "839b832b-5799-4bcf-b028-f6d138668d44"). InnerVolumeSpecName "kube-api-access-rgvqg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 11:18:08 crc kubenswrapper[4962]: I0220 11:18:08.774630 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/839b832b-5799-4bcf-b028-f6d138668d44-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "839b832b-5799-4bcf-b028-f6d138668d44" (UID: "839b832b-5799-4bcf-b028-f6d138668d44"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 11:18:08 crc kubenswrapper[4962]: I0220 11:18:08.829730 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/839b832b-5799-4bcf-b028-f6d138668d44-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 11:18:08 crc kubenswrapper[4962]: I0220 11:18:08.829785 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/839b832b-5799-4bcf-b028-f6d138668d44-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 11:18:08 crc kubenswrapper[4962]: I0220 11:18:08.829813 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rgvqg\" (UniqueName: \"kubernetes.io/projected/839b832b-5799-4bcf-b028-f6d138668d44-kube-api-access-rgvqg\") on node \"crc\" DevicePath \"\"" Feb 20 11:18:08 crc kubenswrapper[4962]: I0220 11:18:08.978479 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8zmzk" event={"ID":"839b832b-5799-4bcf-b028-f6d138668d44","Type":"ContainerDied","Data":"45aebe52af9db615e922a96f885c6199b792863852c9fdc58224e9b5a8d28b61"} Feb 20 11:18:08 crc kubenswrapper[4962]: I0220 11:18:08.978567 4962 scope.go:117] "RemoveContainer" containerID="ba978c4b17a448437f2e3d666913798190e79acd9116a9903a0378029be2493e" Feb 20 11:18:08 crc kubenswrapper[4962]: I0220 11:18:08.978568 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8zmzk" Feb 20 11:18:09 crc kubenswrapper[4962]: I0220 11:18:09.017183 4962 scope.go:117] "RemoveContainer" containerID="b6401eca9c5ccd1616136765494ee1804d00cfe7d4b24f779b2d66665195c8ff" Feb 20 11:18:09 crc kubenswrapper[4962]: I0220 11:18:09.043377 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8zmzk"] Feb 20 11:18:09 crc kubenswrapper[4962]: I0220 11:18:09.055412 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8zmzk"] Feb 20 11:18:09 crc kubenswrapper[4962]: I0220 11:18:09.056493 4962 scope.go:117] "RemoveContainer" containerID="d02121a7865602e447d9672ca12527892a8b4df01784c7fe6472f665dc92d541" Feb 20 11:18:09 crc kubenswrapper[4962]: I0220 11:18:09.156731 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="839b832b-5799-4bcf-b028-f6d138668d44" path="/var/lib/kubelet/pods/839b832b-5799-4bcf-b028-f6d138668d44/volumes" Feb 20 11:18:17 crc kubenswrapper[4962]: I0220 11:18:17.433326 4962 scope.go:117] "RemoveContainer" containerID="4ccbd6d45940c4b1ae7e0e1f68c265065827d64d2c523d3f5bea75e59a5d57b0" Feb 20 11:18:17 crc kubenswrapper[4962]: I0220 11:18:17.465468 4962 scope.go:117] "RemoveContainer" containerID="9ee74111c1ba86e5709005fcbe78e4bf5aa89be27bca01ef0b469ef1b5c60efd" Feb 20 11:18:17 crc kubenswrapper[4962]: I0220 11:18:17.493102 4962 scope.go:117] "RemoveContainer" containerID="a2c53e38bc22bb8389498260263ad69325ab45f3c976482fce0fcee721e543fa" Feb 20 11:18:18 crc kubenswrapper[4962]: I0220 11:18:18.139730 4962 scope.go:117] "RemoveContainer" containerID="11d1cd33fef2b28672ddeedcaa2789f4c1e5d261ff35b49d9ad217e63e8e6bae" Feb 20 11:18:18 crc kubenswrapper[4962]: E0220 11:18:18.140162 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 11:18:26 crc kubenswrapper[4962]: I0220 11:18:26.418064 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-4hjrx/must-gather-vfvhw"] Feb 20 11:18:26 crc kubenswrapper[4962]: E0220 11:18:26.418727 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="839b832b-5799-4bcf-b028-f6d138668d44" containerName="extract-utilities" Feb 20 11:18:26 crc kubenswrapper[4962]: I0220 11:18:26.418739 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="839b832b-5799-4bcf-b028-f6d138668d44" containerName="extract-utilities" Feb 20 11:18:26 crc kubenswrapper[4962]: E0220 11:18:26.418760 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="839b832b-5799-4bcf-b028-f6d138668d44" containerName="registry-server" Feb 20 11:18:26 crc kubenswrapper[4962]: I0220 11:18:26.418766 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="839b832b-5799-4bcf-b028-f6d138668d44" containerName="registry-server" Feb 20 11:18:26 crc kubenswrapper[4962]: E0220 11:18:26.418781 4962 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="839b832b-5799-4bcf-b028-f6d138668d44" containerName="extract-content" Feb 20 11:18:26 crc kubenswrapper[4962]: I0220 11:18:26.418787 4962 state_mem.go:107] "Deleted CPUSet assignment" podUID="839b832b-5799-4bcf-b028-f6d138668d44" containerName="extract-content" Feb 20 11:18:26 crc kubenswrapper[4962]: I0220 11:18:26.418907 4962 memory_manager.go:354] "RemoveStaleState removing state" podUID="839b832b-5799-4bcf-b028-f6d138668d44" containerName="registry-server" Feb 20 11:18:26 crc kubenswrapper[4962]: I0220 11:18:26.419587 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4hjrx/must-gather-vfvhw" Feb 20 11:18:26 crc kubenswrapper[4962]: I0220 11:18:26.420980 4962 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-4hjrx"/"default-dockercfg-xdbvn" Feb 20 11:18:26 crc kubenswrapper[4962]: I0220 11:18:26.421439 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-4hjrx"/"openshift-service-ca.crt" Feb 20 11:18:26 crc kubenswrapper[4962]: I0220 11:18:26.421519 4962 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-4hjrx"/"kube-root-ca.crt" Feb 20 11:18:26 crc kubenswrapper[4962]: I0220 11:18:26.469033 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-4hjrx/must-gather-vfvhw"] Feb 20 11:18:26 crc kubenswrapper[4962]: I0220 11:18:26.562909 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/fd1e5da5-b553-419b-b874-baa4d0b09f1d-must-gather-output\") pod \"must-gather-vfvhw\" (UID: \"fd1e5da5-b553-419b-b874-baa4d0b09f1d\") " pod="openshift-must-gather-4hjrx/must-gather-vfvhw" Feb 20 11:18:26 crc kubenswrapper[4962]: I0220 11:18:26.562987 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xc68\" (UniqueName: \"kubernetes.io/projected/fd1e5da5-b553-419b-b874-baa4d0b09f1d-kube-api-access-5xc68\") pod \"must-gather-vfvhw\" (UID: \"fd1e5da5-b553-419b-b874-baa4d0b09f1d\") " pod="openshift-must-gather-4hjrx/must-gather-vfvhw" Feb 20 11:18:26 crc kubenswrapper[4962]: I0220 11:18:26.664396 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xc68\" (UniqueName: \"kubernetes.io/projected/fd1e5da5-b553-419b-b874-baa4d0b09f1d-kube-api-access-5xc68\") pod \"must-gather-vfvhw\" (UID: \"fd1e5da5-b553-419b-b874-baa4d0b09f1d\") " pod="openshift-must-gather-4hjrx/must-gather-vfvhw" Feb 20 11:18:26 crc kubenswrapper[4962]: I0220 11:18:26.664501 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/fd1e5da5-b553-419b-b874-baa4d0b09f1d-must-gather-output\") pod \"must-gather-vfvhw\" (UID: \"fd1e5da5-b553-419b-b874-baa4d0b09f1d\") " pod="openshift-must-gather-4hjrx/must-gather-vfvhw" Feb 20 11:18:26 crc kubenswrapper[4962]: I0220 11:18:26.664877 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/fd1e5da5-b553-419b-b874-baa4d0b09f1d-must-gather-output\") pod \"must-gather-vfvhw\" (UID: \"fd1e5da5-b553-419b-b874-baa4d0b09f1d\") " pod="openshift-must-gather-4hjrx/must-gather-vfvhw" Feb 20 11:18:26 crc kubenswrapper[4962]: I0220 11:18:26.684209 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xc68\" (UniqueName: \"kubernetes.io/projected/fd1e5da5-b553-419b-b874-baa4d0b09f1d-kube-api-access-5xc68\") pod \"must-gather-vfvhw\" (UID: \"fd1e5da5-b553-419b-b874-baa4d0b09f1d\") " pod="openshift-must-gather-4hjrx/must-gather-vfvhw" Feb 20 11:18:26 crc kubenswrapper[4962]: I0220 11:18:26.733007 4962 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4hjrx/must-gather-vfvhw" Feb 20 11:18:27 crc kubenswrapper[4962]: I0220 11:18:27.011776 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-4hjrx/must-gather-vfvhw"] Feb 20 11:18:27 crc kubenswrapper[4962]: I0220 11:18:27.158449 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4hjrx/must-gather-vfvhw" event={"ID":"fd1e5da5-b553-419b-b874-baa4d0b09f1d","Type":"ContainerStarted","Data":"c6cd07a28d727227c91863658be837f8ea3bd5357196bf48a15bd86c017219d9"} Feb 20 11:18:30 crc kubenswrapper[4962]: I0220 11:18:30.139506 4962 scope.go:117] "RemoveContainer" containerID="11d1cd33fef2b28672ddeedcaa2789f4c1e5d261ff35b49d9ad217e63e8e6bae" Feb 20 11:18:30 crc kubenswrapper[4962]: E0220 11:18:30.140493 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 11:18:34 crc kubenswrapper[4962]: I0220 11:18:34.239555 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4hjrx/must-gather-vfvhw" event={"ID":"fd1e5da5-b553-419b-b874-baa4d0b09f1d","Type":"ContainerStarted","Data":"51533ca5277128daf61df4bcb8daa9914ee61cf20e1bd6102ffbad1536d290be"} Feb 20 11:18:34 crc kubenswrapper[4962]: I0220 11:18:34.241075 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4hjrx/must-gather-vfvhw" event={"ID":"fd1e5da5-b553-419b-b874-baa4d0b09f1d","Type":"ContainerStarted","Data":"68f8a74369dcc7ed292d418118d6a2654645c30c61a6d6f89b9a33287e7f6592"} Feb 20 11:18:34 crc kubenswrapper[4962]: I0220 11:18:34.260189 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-4hjrx/must-gather-vfvhw" podStartSLOduration=1.741347285 podStartE2EDuration="8.260166537s" podCreationTimestamp="2026-02-20 11:18:26 +0000 UTC" firstStartedPulling="2026-02-20 11:18:27.013141892 +0000 UTC m=+4998.595613738" lastFinishedPulling="2026-02-20 11:18:33.531961144 +0000 UTC m=+5005.114432990" observedRunningTime="2026-02-20 11:18:34.254900477 +0000 UTC m=+5005.837372343" watchObservedRunningTime="2026-02-20 11:18:34.260166537 +0000 UTC m=+5005.842638423" Feb 20 11:18:44 crc kubenswrapper[4962]: I0220 11:18:44.138888 4962 scope.go:117] "RemoveContainer" containerID="11d1cd33fef2b28672ddeedcaa2789f4c1e5d261ff35b49d9ad217e63e8e6bae" Feb 20 11:18:44 crc kubenswrapper[4962]: E0220 11:18:44.140948 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 11:18:55 crc kubenswrapper[4962]: I0220 11:18:55.138743 4962 scope.go:117] "RemoveContainer" containerID="11d1cd33fef2b28672ddeedcaa2789f4c1e5d261ff35b49d9ad217e63e8e6bae" Feb 20 11:18:55 crc kubenswrapper[4962]: E0220 11:18:55.141510 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 11:19:09 crc kubenswrapper[4962]: I0220 11:19:09.149364 4962 scope.go:117] "RemoveContainer" containerID="11d1cd33fef2b28672ddeedcaa2789f4c1e5d261ff35b49d9ad217e63e8e6bae" Feb 20 11:19:09 crc kubenswrapper[4962]: E0220 11:19:09.150313 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 11:19:17 crc kubenswrapper[4962]: I0220 11:19:17.620202 4962 scope.go:117] "RemoveContainer" containerID="f56f77a29f18b053790ac8a764373585312883ee763879c9fe012ee4ec5c65e1" Feb 20 11:19:20 crc kubenswrapper[4962]: I0220 11:19:20.139918 4962 scope.go:117] "RemoveContainer" containerID="11d1cd33fef2b28672ddeedcaa2789f4c1e5d261ff35b49d9ad217e63e8e6bae" Feb 20 11:19:20 crc kubenswrapper[4962]: E0220 11:19:20.140903 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 11:19:28 crc kubenswrapper[4962]: I0220 11:19:28.335845 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-589cf688cc-62bkq_10021bed-f80b-491c-8326-88df1a07c1f7/init/0.log" Feb 20 11:19:28 crc kubenswrapper[4962]: I0220 11:19:28.486648 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-589cf688cc-62bkq_10021bed-f80b-491c-8326-88df1a07c1f7/init/0.log" Feb 20 11:19:28 crc kubenswrapper[4962]: I0220 11:19:28.507109 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-589cf688cc-62bkq_10021bed-f80b-491c-8326-88df1a07c1f7/dnsmasq-dns/0.log" Feb 20 11:19:28 crc kubenswrapper[4962]: I0220 11:19:28.664290 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_8374d0f9-f4be-4f6b-88eb-4849a2be49e9/memcached/0.log" Feb 20 11:19:28 crc kubenswrapper[4962]: I0220 11:19:28.707852 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_97ae547e-e977-4b15-a979-38415ee77885/mysql-bootstrap/0.log" Feb 20 11:19:28 crc kubenswrapper[4962]: I0220 11:19:28.881534 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_97ae547e-e977-4b15-a979-38415ee77885/galera/0.log" Feb 20 11:19:28 crc kubenswrapper[4962]: I0220 11:19:28.914201 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_97ae547e-e977-4b15-a979-38415ee77885/mysql-bootstrap/0.log" Feb 20 11:19:28 crc kubenswrapper[4962]: I0220 11:19:28.946209 4962 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-galera-0_f2ffa3bc-ffbe-4a42-b14f-48aa20546210/mysql-bootstrap/0.log" Feb 20 11:19:29 crc kubenswrapper[4962]: I0220 11:19:29.117801 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_4f1374d6-d1c8-4b28-a524-485ced8ec7b9/setup-container/0.log" Feb 20 11:19:29 crc kubenswrapper[4962]: I0220 11:19:29.121556 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_f2ffa3bc-ffbe-4a42-b14f-48aa20546210/galera/0.log" Feb 20 11:19:29 crc kubenswrapper[4962]: I0220 11:19:29.139056 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_f2ffa3bc-ffbe-4a42-b14f-48aa20546210/mysql-bootstrap/0.log" Feb 20 11:19:29 crc kubenswrapper[4962]: I0220 11:19:29.359992 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_4f1374d6-d1c8-4b28-a524-485ced8ec7b9/setup-container/0.log" Feb 20 11:19:29 crc kubenswrapper[4962]: I0220 11:19:29.379575 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_4f1374d6-d1c8-4b28-a524-485ced8ec7b9/rabbitmq/0.log" Feb 20 11:19:29 crc kubenswrapper[4962]: I0220 11:19:29.433708 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_root-account-create-update-2mzpw_7ae4e019-31b7-4826-a5ef-042faba6034d/mariadb-account-create-update/0.log" Feb 20 11:19:31 crc kubenswrapper[4962]: I0220 11:19:31.138827 4962 scope.go:117] "RemoveContainer" containerID="11d1cd33fef2b28672ddeedcaa2789f4c1e5d261ff35b49d9ad217e63e8e6bae" Feb 20 11:19:31 crc kubenswrapper[4962]: E0220 11:19:31.139426 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 11:19:44 crc kubenswrapper[4962]: I0220 11:19:44.139155 4962 scope.go:117] "RemoveContainer" containerID="11d1cd33fef2b28672ddeedcaa2789f4c1e5d261ff35b49d9ad217e63e8e6bae" Feb 20 11:19:44 crc kubenswrapper[4962]: E0220 11:19:44.140036 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 11:19:48 crc kubenswrapper[4962]: I0220 11:19:48.519319 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79678nmbg_3db1f907-b4ac-45b1-9f38-93727dfde270/util/0.log" Feb 20 11:19:48 crc kubenswrapper[4962]: I0220 11:19:48.685363 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79678nmbg_3db1f907-b4ac-45b1-9f38-93727dfde270/util/0.log" Feb 20 11:19:48 crc kubenswrapper[4962]: I0220 11:19:48.730504 4962 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79678nmbg_3db1f907-b4ac-45b1-9f38-93727dfde270/pull/0.log" Feb 20 11:19:48 crc kubenswrapper[4962]: I0220 11:19:48.750327 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79678nmbg_3db1f907-b4ac-45b1-9f38-93727dfde270/pull/0.log" Feb 20 11:19:48 crc kubenswrapper[4962]: I0220 11:19:48.928518 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79678nmbg_3db1f907-b4ac-45b1-9f38-93727dfde270/extract/0.log" Feb 20 11:19:48 crc kubenswrapper[4962]: I0220 11:19:48.939635 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79678nmbg_3db1f907-b4ac-45b1-9f38-93727dfde270/util/0.log" Feb 20 11:19:48 crc kubenswrapper[4962]: I0220 11:19:48.980765 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79678nmbg_3db1f907-b4ac-45b1-9f38-93727dfde270/pull/0.log" Feb 20 11:19:49 crc kubenswrapper[4962]: I0220 11:19:49.340514 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-6d8bf5c495-r2t72_cf0e10ba-c175-44c3-9011-6646f21ba334/manager/0.log" Feb 20 11:19:49 crc kubenswrapper[4962]: I0220 11:19:49.638340 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987464f4-wcqzf_ea986843-26e4-4410-a65e-ae51c02dc04c/manager/0.log" Feb 20 11:19:49 crc kubenswrapper[4962]: I0220 11:19:49.809437 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-69f49c598c-75vx4_fee6970c-0ad7-46ea-ab75-dcb7d552ffbb/manager/0.log" Feb 20 11:19:49 crc kubenswrapper[4962]: I0220 11:19:49.995812 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5b9b8895d5-rhhc7_12f33757-f329-47a6-9273-bdeb1558a4d7/manager/0.log" Feb 20 11:19:50 crc kubenswrapper[4962]: I0220 11:19:50.499936 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-554564d7fc-2hg4n_5fec06f1-8ccf-403c-88de-2b581f056802/manager/0.log" Feb 20 11:19:50 crc kubenswrapper[4962]: I0220 11:19:50.580542 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-79d975b745-4rnhn_0c8c62e9-0201-43a4-b823-82af87a0977e/manager/0.log" Feb 20 11:19:51 crc kubenswrapper[4962]: I0220 11:19:51.013799 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-b4d948c87-jjbwt_7afb870a-75a4-42d5-9704-5cef14dd3ce9/manager/0.log" Feb 20 11:19:51 crc kubenswrapper[4962]: I0220 11:19:51.231580 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-54f6768c69-6lvhz_a9979be5-6650-425b-a748-51e2cb552413/manager/0.log" Feb 20 11:19:51 crc kubenswrapper[4962]: I0220 11:19:51.473001 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6994f66f48-wn92v_f8f1dca9-8b83-469d-b834-3f11376576c9/manager/0.log" Feb 20 11:19:51 crc kubenswrapper[4962]: I0220 11:19:51.482994 4962 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-5d946d989d-bsq9n_ac33f7ed-c3f8-487d-89dc-4a614d357b86/manager/0.log" Feb 20 11:19:51 crc kubenswrapper[4962]: I0220 11:19:51.695457 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-64ddbf8bb-knwp9_6fdeab3e-de35-4d69-9e67-e5d8257bc25d/manager/0.log" Feb 20 11:19:52 crc kubenswrapper[4962]: I0220 11:19:52.228025 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-567668f5cf-d2clq_4e2614ed-ea7a-430e-af7b-4d66f05f7b96/manager/0.log" Feb 20 11:19:52 crc kubenswrapper[4962]: I0220 11:19:52.408844 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-fb5fcc5b8-68trf_f8de466d-f069-4a8e-8598-72a163525c24/manager/0.log" Feb 20 11:19:52 crc kubenswrapper[4962]: I0220 11:19:52.776326 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-6679bf9b57-n5hm2_ad363690-9ad6-4f45-ac02-d51ec41d213b/operator/0.log" Feb 20 11:19:53 crc kubenswrapper[4962]: I0220 11:19:53.047214 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-t9zxk_46f437ac-c97a-4af9-92e7-6bec63b7d8d8/registry-server/0.log" Feb 20 11:19:53 crc kubenswrapper[4962]: I0220 11:19:53.518858 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-d44cf6b75-nlq5k_72728d52-a8e9-4689-8da0-871f250f7664/manager/0.log" Feb 20 11:19:53 crc kubenswrapper[4962]: I0220 11:19:53.654498 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-8497b45c89-x4gh4_34cb38e0-7c0a-4f00-89e9-9be7b394585d/manager/0.log" Feb 20 11:19:54 crc kubenswrapper[4962]: I0220 11:19:54.005711 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-69ff7bc449-rqmzz_98bbcdbd-382d-48ca-aa14-3e9ba4b63c98/manager/0.log" Feb 20 11:19:54 crc kubenswrapper[4962]: I0220 11:19:54.014108 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-5mrjv_5691d6ef-dedb-4a46-a1b6-0435e9f6db0a/operator/0.log" Feb 20 11:19:54 crc kubenswrapper[4962]: I0220 11:19:54.190028 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-68f46476f-9pxbg_4a325f02-ddda-49e9-9ef0-40fd4726b09f/manager/0.log" Feb 20 11:19:54 crc kubenswrapper[4962]: I0220 11:19:54.345566 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-7f45b4ff68-mfpm9_7d077bc6-8a1e-426a-9b2d-8e6b2a5eb084/manager/0.log" Feb 20 11:19:54 crc kubenswrapper[4962]: I0220 11:19:54.421096 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-7866795846-lxl4x_32d42cbd-4ea1-49cc-b9d4-33fe5f655a16/manager/0.log" Feb 20 11:19:54 crc kubenswrapper[4962]: I0220 11:19:54.615032 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-5db88f68c-kthxs_4c8bff11-1a85-4f9b-8fb2-defd04ac22d1/manager/0.log" Feb 20 11:19:54 crc kubenswrapper[4962]: I0220 11:19:54.836986 4962 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-69f8888797-ln4sp_14efe385-5147-49ed-a42f-804b91438a55/manager/0.log" Feb 20 11:19:57 crc kubenswrapper[4962]: I0220 11:19:57.138138 4962 scope.go:117] "RemoveContainer" containerID="11d1cd33fef2b28672ddeedcaa2789f4c1e5d261ff35b49d9ad217e63e8e6bae" Feb 20 11:19:57 crc kubenswrapper[4962]: E0220 11:19:57.138342 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 11:20:00 crc kubenswrapper[4962]: I0220 11:20:00.604235 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-868647ff47-nhpg5_e0560856-ed00-4ea8-8ce7-a801f1d46489/manager/0.log" Feb 20 11:20:10 crc kubenswrapper[4962]: I0220 11:20:10.139058 4962 scope.go:117] "RemoveContainer" containerID="11d1cd33fef2b28672ddeedcaa2789f4c1e5d261ff35b49d9ad217e63e8e6bae" Feb 20 11:20:10 crc kubenswrapper[4962]: E0220 11:20:10.140142 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 11:20:17 crc kubenswrapper[4962]: I0220 11:20:17.252345 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-hc9h5_75c3ba8d-4548-4407-9188-a785ef05da2c/control-plane-machine-set-operator/0.log" Feb 20 11:20:17 crc kubenswrapper[4962]: I0220 11:20:17.355986 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-ckmh2_a7a9fa76-da75-4847-a539-d1e6bb57da98/kube-rbac-proxy/0.log" Feb 20 11:20:17 crc kubenswrapper[4962]: I0220 11:20:17.415774 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-ckmh2_a7a9fa76-da75-4847-a539-d1e6bb57da98/machine-api-operator/0.log" Feb 20 11:20:24 crc kubenswrapper[4962]: I0220 11:20:24.139802 4962 scope.go:117] "RemoveContainer" containerID="11d1cd33fef2b28672ddeedcaa2789f4c1e5d261ff35b49d9ad217e63e8e6bae" Feb 20 11:20:24 crc kubenswrapper[4962]: E0220 11:20:24.141226 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 11:20:31 crc kubenswrapper[4962]: I0220 11:20:31.741102 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-545d4d4674-ctc7p_41c6ef1c-4069-44b1-a0ba-de5e820a630c/cert-manager-controller/0.log" Feb 20 11:20:31 crc kubenswrapper[4962]: I0220 11:20:31.900931 4962 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-cainjector-5545bd876-lvkdh_4e81b5fc-4c0d-4065-a88b-9fa40ea1d1b3/cert-manager-cainjector/0.log" Feb 20 11:20:31 crc kubenswrapper[4962]: I0220 11:20:31.945856 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-6888856db4-t5nv4_0d86f751-d081-47b7-a623-a9cc14ab43f7/cert-manager-webhook/0.log" Feb 20 11:20:37 crc kubenswrapper[4962]: I0220 11:20:37.139103 4962 scope.go:117] "RemoveContainer" containerID="11d1cd33fef2b28672ddeedcaa2789f4c1e5d261ff35b49d9ad217e63e8e6bae" Feb 20 11:20:37 crc kubenswrapper[4962]: E0220 11:20:37.140133 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 11:20:46 crc kubenswrapper[4962]: I0220 11:20:46.403981 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5c78fc5d65-2hqd7_e17e90c9-fe19-4544-9a79-bffc8072a763/nmstate-console-plugin/0.log" Feb 20 11:20:46 crc kubenswrapper[4962]: I0220 11:20:46.602101 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-frtsf_5056ae4f-c2f7-41f5-8e12-b7b5d8996852/nmstate-handler/0.log" Feb 20 11:20:46 crc kubenswrapper[4962]: I0220 11:20:46.747111 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-6x8wh_edcc687e-09ef-4048-8db7-d67e6fe23212/nmstate-metrics/0.log" Feb 20 11:20:46 crc kubenswrapper[4962]: I0220 11:20:46.774517 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-6x8wh_edcc687e-09ef-4048-8db7-d67e6fe23212/kube-rbac-proxy/0.log" Feb 20 11:20:46 crc kubenswrapper[4962]: I0220 11:20:46.913011 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-694c9596b7-nkzm2_cffc71cf-18b7-4733-b863-19b8664b5cf4/nmstate-operator/0.log" Feb 20 11:20:46 crc kubenswrapper[4962]: I0220 11:20:46.983980 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-866bcb46dc-l2lqb_a453e12b-e95c-4c04-b67b-b5bc6527a3ab/nmstate-webhook/0.log" Feb 20 11:20:52 crc kubenswrapper[4962]: I0220 11:20:52.139211 4962 scope.go:117] "RemoveContainer" containerID="11d1cd33fef2b28672ddeedcaa2789f4c1e5d261ff35b49d9ad217e63e8e6bae" Feb 20 11:20:52 crc kubenswrapper[4962]: E0220 11:20:52.140050 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 11:21:06 crc kubenswrapper[4962]: I0220 11:21:06.140110 4962 scope.go:117] "RemoveContainer" containerID="11d1cd33fef2b28672ddeedcaa2789f4c1e5d261ff35b49d9ad217e63e8e6bae" Feb 20 11:21:06 crc kubenswrapper[4962]: E0220 11:21:06.141077 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 11:21:18 crc kubenswrapper[4962]: I0220 11:21:18.867139 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-29wdn_7af7ee52-8865-48ce-85e5-7b62fb0d67d3/kube-rbac-proxy/0.log" Feb 20 11:21:19 crc kubenswrapper[4962]: I0220 11:21:19.176373 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-78b44bf5bb-hb87m_7135845d-f595-42df-9773-7701c9a0b2e2/frr-k8s-webhook-server/0.log" Feb 20 11:21:19 crc kubenswrapper[4962]: I0220 11:21:19.214649 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-29wdn_7af7ee52-8865-48ce-85e5-7b62fb0d67d3/controller/0.log" Feb 20 11:21:19 crc kubenswrapper[4962]: I0220 11:21:19.313191 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zf82t_3eb8e16a-ffc3-4756-a3ee-96473eecf85d/cp-frr-files/0.log" Feb 20 11:21:19 crc kubenswrapper[4962]: I0220 11:21:19.486062 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zf82t_3eb8e16a-ffc3-4756-a3ee-96473eecf85d/cp-reloader/0.log" Feb 20 11:21:19 crc kubenswrapper[4962]: I0220 11:21:19.487782 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zf82t_3eb8e16a-ffc3-4756-a3ee-96473eecf85d/cp-metrics/0.log" Feb 20 11:21:19 crc kubenswrapper[4962]: I0220 11:21:19.498280 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zf82t_3eb8e16a-ffc3-4756-a3ee-96473eecf85d/cp-frr-files/0.log" Feb 20 11:21:19 crc kubenswrapper[4962]: I0220 11:21:19.536799 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zf82t_3eb8e16a-ffc3-4756-a3ee-96473eecf85d/cp-reloader/0.log" Feb 20 11:21:19 crc kubenswrapper[4962]: I0220 11:21:19.675153 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zf82t_3eb8e16a-ffc3-4756-a3ee-96473eecf85d/cp-frr-files/0.log" Feb 20 11:21:19 crc kubenswrapper[4962]: I0220 11:21:19.707984 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zf82t_3eb8e16a-ffc3-4756-a3ee-96473eecf85d/cp-metrics/0.log" Feb 20 11:21:19 crc kubenswrapper[4962]: I0220 11:21:19.708679 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zf82t_3eb8e16a-ffc3-4756-a3ee-96473eecf85d/cp-metrics/0.log" Feb 20 11:21:19 crc kubenswrapper[4962]: I0220 11:21:19.742311 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zf82t_3eb8e16a-ffc3-4756-a3ee-96473eecf85d/cp-reloader/0.log" Feb 20 11:21:19 crc kubenswrapper[4962]: I0220 11:21:19.886335 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zf82t_3eb8e16a-ffc3-4756-a3ee-96473eecf85d/cp-frr-files/0.log" Feb 20 11:21:19 crc kubenswrapper[4962]: I0220 11:21:19.897317 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zf82t_3eb8e16a-ffc3-4756-a3ee-96473eecf85d/controller/0.log" Feb 20 11:21:19 crc kubenswrapper[4962]: I0220 11:21:19.909906 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zf82t_3eb8e16a-ffc3-4756-a3ee-96473eecf85d/cp-reloader/0.log" Feb 20 11:21:19 crc 
kubenswrapper[4962]: I0220 11:21:19.910796 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zf82t_3eb8e16a-ffc3-4756-a3ee-96473eecf85d/cp-metrics/0.log" Feb 20 11:21:20 crc kubenswrapper[4962]: I0220 11:21:20.130873 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zf82t_3eb8e16a-ffc3-4756-a3ee-96473eecf85d/frr-metrics/0.log" Feb 20 11:21:20 crc kubenswrapper[4962]: I0220 11:21:20.137390 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zf82t_3eb8e16a-ffc3-4756-a3ee-96473eecf85d/kube-rbac-proxy-frr/0.log" Feb 20 11:21:20 crc kubenswrapper[4962]: I0220 11:21:20.137824 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zf82t_3eb8e16a-ffc3-4756-a3ee-96473eecf85d/kube-rbac-proxy/0.log" Feb 20 11:21:20 crc kubenswrapper[4962]: I0220 11:21:20.138178 4962 scope.go:117] "RemoveContainer" containerID="11d1cd33fef2b28672ddeedcaa2789f4c1e5d261ff35b49d9ad217e63e8e6bae" Feb 20 11:21:20 crc kubenswrapper[4962]: E0220 11:21:20.138366 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 11:21:20 crc kubenswrapper[4962]: I0220 11:21:20.301997 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zf82t_3eb8e16a-ffc3-4756-a3ee-96473eecf85d/reloader/0.log" Feb 20 11:21:20 crc kubenswrapper[4962]: I0220 11:21:20.351448 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7964458f8b-6fxbj_403ba47d-bbe1-48f6-9382-47f12bbb75ae/manager/0.log" Feb 20 11:21:20 crc kubenswrapper[4962]: I0220 11:21:20.526254 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-79fb478cb4-wmzpd_2ae49f4e-271b-40e8-9cfc-9857fc2de6f3/webhook-server/0.log" Feb 20 11:21:20 crc kubenswrapper[4962]: I0220 11:21:20.732655 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-rx2lw_c8b5efc7-c8c4-4492-a8a9-31eaecfa8374/kube-rbac-proxy/0.log" Feb 20 11:21:21 crc kubenswrapper[4962]: I0220 11:21:21.099087 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-rx2lw_c8b5efc7-c8c4-4492-a8a9-31eaecfa8374/speaker/0.log" Feb 20 11:21:21 crc kubenswrapper[4962]: I0220 11:21:21.344636 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zf82t_3eb8e16a-ffc3-4756-a3ee-96473eecf85d/frr/0.log" Feb 20 11:21:34 crc kubenswrapper[4962]: I0220 11:21:34.138896 4962 scope.go:117] "RemoveContainer" containerID="11d1cd33fef2b28672ddeedcaa2789f4c1e5d261ff35b49d9ad217e63e8e6bae" Feb 20 11:21:34 crc kubenswrapper[4962]: E0220 11:21:34.139909 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 11:21:35 crc 
kubenswrapper[4962]: I0220 11:21:35.066450 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qft8z_9536c987-ff07-45d5-b8c8-12cfe3019427/util/0.log" Feb 20 11:21:35 crc kubenswrapper[4962]: I0220 11:21:35.214077 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qft8z_9536c987-ff07-45d5-b8c8-12cfe3019427/util/0.log" Feb 20 11:21:35 crc kubenswrapper[4962]: I0220 11:21:35.250376 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qft8z_9536c987-ff07-45d5-b8c8-12cfe3019427/pull/0.log" Feb 20 11:21:35 crc kubenswrapper[4962]: I0220 11:21:35.286526 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qft8z_9536c987-ff07-45d5-b8c8-12cfe3019427/pull/0.log" Feb 20 11:21:35 crc kubenswrapper[4962]: I0220 11:21:35.439114 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qft8z_9536c987-ff07-45d5-b8c8-12cfe3019427/util/0.log" Feb 20 11:21:35 crc kubenswrapper[4962]: I0220 11:21:35.453750 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qft8z_9536c987-ff07-45d5-b8c8-12cfe3019427/extract/0.log" Feb 20 11:21:35 crc kubenswrapper[4962]: I0220 11:21:35.529322 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5qft8z_9536c987-ff07-45d5-b8c8-12cfe3019427/pull/0.log" Feb 20 11:21:35 crc kubenswrapper[4962]: I0220 11:21:35.599079 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21389xtp_15223064-e16f-4407-a15a-2105151aa73f/util/0.log" Feb 20 11:21:35 crc kubenswrapper[4962]: I0220 11:21:35.784996 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21389xtp_15223064-e16f-4407-a15a-2105151aa73f/pull/0.log" Feb 20 11:21:35 crc kubenswrapper[4962]: I0220 11:21:35.807507 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21389xtp_15223064-e16f-4407-a15a-2105151aa73f/util/0.log" Feb 20 11:21:35 crc kubenswrapper[4962]: I0220 11:21:35.840190 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21389xtp_15223064-e16f-4407-a15a-2105151aa73f/pull/0.log" Feb 20 11:21:35 crc kubenswrapper[4962]: I0220 11:21:35.978214 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21389xtp_15223064-e16f-4407-a15a-2105151aa73f/extract/0.log" Feb 20 11:21:36 crc kubenswrapper[4962]: I0220 11:21:36.002529 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21389xtp_15223064-e16f-4407-a15a-2105151aa73f/util/0.log" Feb 20 11:21:36 crc kubenswrapper[4962]: I0220 11:21:36.017363 4962 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21389xtp_15223064-e16f-4407-a15a-2105151aa73f/pull/0.log" Feb 20 11:21:36 crc kubenswrapper[4962]: I0220 11:21:36.151999 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-tzxjg_f21d4aaf-2f5d-4576-a1e1-b8c233e285f1/extract-utilities/0.log" Feb 20 11:21:36 crc kubenswrapper[4962]: I0220 11:21:36.308866 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-tzxjg_f21d4aaf-2f5d-4576-a1e1-b8c233e285f1/extract-utilities/0.log" Feb 20 11:21:36 crc kubenswrapper[4962]: I0220 11:21:36.355329 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-tzxjg_f21d4aaf-2f5d-4576-a1e1-b8c233e285f1/extract-content/0.log" Feb 20 11:21:36 crc kubenswrapper[4962]: I0220 11:21:36.379290 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-tzxjg_f21d4aaf-2f5d-4576-a1e1-b8c233e285f1/extract-content/0.log" Feb 20 11:21:36 crc kubenswrapper[4962]: I0220 11:21:36.536783 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-tzxjg_f21d4aaf-2f5d-4576-a1e1-b8c233e285f1/extract-content/0.log" Feb 20 11:21:36 crc kubenswrapper[4962]: I0220 11:21:36.542496 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-tzxjg_f21d4aaf-2f5d-4576-a1e1-b8c233e285f1/extract-utilities/0.log" Feb 20 11:21:36 crc kubenswrapper[4962]: I0220 11:21:36.740058 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lsx57_16b4ee5b-87f1-4b91-abd0-d2a7eb56e7bd/extract-utilities/0.log" Feb 20 11:21:36 crc kubenswrapper[4962]: I0220 11:21:36.748365 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-tzxjg_f21d4aaf-2f5d-4576-a1e1-b8c233e285f1/registry-server/0.log" Feb 20 11:21:36 crc kubenswrapper[4962]: I0220 11:21:36.876684 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lsx57_16b4ee5b-87f1-4b91-abd0-d2a7eb56e7bd/extract-content/0.log" Feb 20 11:21:36 crc kubenswrapper[4962]: I0220 11:21:36.913986 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lsx57_16b4ee5b-87f1-4b91-abd0-d2a7eb56e7bd/extract-content/0.log" Feb 20 11:21:36 crc kubenswrapper[4962]: I0220 11:21:36.922497 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lsx57_16b4ee5b-87f1-4b91-abd0-d2a7eb56e7bd/extract-utilities/0.log" Feb 20 11:21:37 crc kubenswrapper[4962]: I0220 11:21:37.098614 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lsx57_16b4ee5b-87f1-4b91-abd0-d2a7eb56e7bd/extract-utilities/0.log" Feb 20 11:21:37 crc kubenswrapper[4962]: I0220 11:21:37.108311 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lsx57_16b4ee5b-87f1-4b91-abd0-d2a7eb56e7bd/extract-content/0.log" Feb 20 11:21:37 crc kubenswrapper[4962]: I0220 11:21:37.287367 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca4n626_650d9c53-94de-499d-8498-53afa3428c06/util/0.log" Feb 20 11:21:37 crc kubenswrapper[4962]: I0220 11:21:37.525925 4962 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca4n626_650d9c53-94de-499d-8498-53afa3428c06/pull/0.log" Feb 20 11:21:37 crc kubenswrapper[4962]: I0220 11:21:37.572358 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca4n626_650d9c53-94de-499d-8498-53afa3428c06/util/0.log" Feb 20 11:21:37 crc kubenswrapper[4962]: I0220 11:21:37.645555 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca4n626_650d9c53-94de-499d-8498-53afa3428c06/pull/0.log" Feb 20 11:21:37 crc kubenswrapper[4962]: I0220 11:21:37.751502 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca4n626_650d9c53-94de-499d-8498-53afa3428c06/util/0.log" Feb 20 11:21:37 crc kubenswrapper[4962]: I0220 11:21:37.775465 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca4n626_650d9c53-94de-499d-8498-53afa3428c06/pull/0.log" Feb 20 11:21:37 crc kubenswrapper[4962]: I0220 11:21:37.806691 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lsx57_16b4ee5b-87f1-4b91-abd0-d2a7eb56e7bd/registry-server/0.log" Feb 20 11:21:37 crc kubenswrapper[4962]: I0220 11:21:37.835036 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca4n626_650d9c53-94de-499d-8498-53afa3428c06/extract/0.log" Feb 20 11:21:37 crc kubenswrapper[4962]: I0220 11:21:37.925933 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-mzhb4_34e2f7a3-366d-4817-a502-720b5f9a782e/marketplace-operator/0.log" Feb 20 11:21:38 crc kubenswrapper[4962]: I0220 11:21:38.002610 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-sl4km_0e92c119-6503-4fc1-b607-0d41d821e8fe/extract-utilities/0.log" Feb 20 11:21:38 crc kubenswrapper[4962]: I0220 11:21:38.151738 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-sl4km_0e92c119-6503-4fc1-b607-0d41d821e8fe/extract-utilities/0.log" Feb 20 11:21:38 crc kubenswrapper[4962]: I0220 11:21:38.166716 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-sl4km_0e92c119-6503-4fc1-b607-0d41d821e8fe/extract-content/0.log" Feb 20 11:21:38 crc kubenswrapper[4962]: I0220 11:21:38.204011 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-sl4km_0e92c119-6503-4fc1-b607-0d41d821e8fe/extract-content/0.log" Feb 20 11:21:38 crc kubenswrapper[4962]: I0220 11:21:38.358145 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-sl4km_0e92c119-6503-4fc1-b607-0d41d821e8fe/extract-content/0.log" Feb 20 11:21:38 crc kubenswrapper[4962]: I0220 11:21:38.372929 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-sl4km_0e92c119-6503-4fc1-b607-0d41d821e8fe/extract-utilities/0.log" Feb 20 11:21:38 crc kubenswrapper[4962]: I0220 11:21:38.545332 4962 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-j9hxw_82f8db6b-4715-42f3-a705-821af9e03156/extract-utilities/0.log" Feb 20 11:21:38 crc kubenswrapper[4962]: I0220 11:21:38.546589 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-sl4km_0e92c119-6503-4fc1-b607-0d41d821e8fe/registry-server/0.log" Feb 20 11:21:38 crc kubenswrapper[4962]: I0220 11:21:38.560974 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ftsbz"] Feb 20 11:21:38 crc kubenswrapper[4962]: I0220 11:21:38.562339 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ftsbz" Feb 20 11:21:38 crc kubenswrapper[4962]: I0220 11:21:38.572498 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ftsbz"] Feb 20 11:21:38 crc kubenswrapper[4962]: I0220 11:21:38.687950 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2sb4j\" (UniqueName: \"kubernetes.io/projected/fc360947-b1e2-4ac0-8447-c7e886e036a0-kube-api-access-2sb4j\") pod \"redhat-operators-ftsbz\" (UID: \"fc360947-b1e2-4ac0-8447-c7e886e036a0\") " pod="openshift-marketplace/redhat-operators-ftsbz" Feb 20 11:21:38 crc kubenswrapper[4962]: I0220 11:21:38.687996 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc360947-b1e2-4ac0-8447-c7e886e036a0-utilities\") pod \"redhat-operators-ftsbz\" (UID: \"fc360947-b1e2-4ac0-8447-c7e886e036a0\") " pod="openshift-marketplace/redhat-operators-ftsbz" Feb 20 11:21:38 crc kubenswrapper[4962]: I0220 11:21:38.688221 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc360947-b1e2-4ac0-8447-c7e886e036a0-catalog-content\") pod \"redhat-operators-ftsbz\" (UID: \"fc360947-b1e2-4ac0-8447-c7e886e036a0\") " pod="openshift-marketplace/redhat-operators-ftsbz" Feb 20 11:21:38 crc kubenswrapper[4962]: I0220 11:21:38.789866 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc360947-b1e2-4ac0-8447-c7e886e036a0-catalog-content\") pod \"redhat-operators-ftsbz\" (UID: \"fc360947-b1e2-4ac0-8447-c7e886e036a0\") " pod="openshift-marketplace/redhat-operators-ftsbz" Feb 20 11:21:38 crc kubenswrapper[4962]: I0220 11:21:38.789936 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2sb4j\" (UniqueName: \"kubernetes.io/projected/fc360947-b1e2-4ac0-8447-c7e886e036a0-kube-api-access-2sb4j\") pod \"redhat-operators-ftsbz\" (UID: \"fc360947-b1e2-4ac0-8447-c7e886e036a0\") " pod="openshift-marketplace/redhat-operators-ftsbz" Feb 20 11:21:38 crc kubenswrapper[4962]: I0220 11:21:38.789962 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc360947-b1e2-4ac0-8447-c7e886e036a0-utilities\") pod \"redhat-operators-ftsbz\" (UID: \"fc360947-b1e2-4ac0-8447-c7e886e036a0\") " pod="openshift-marketplace/redhat-operators-ftsbz" Feb 20 11:21:38 crc kubenswrapper[4962]: I0220 11:21:38.790401 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc360947-b1e2-4ac0-8447-c7e886e036a0-utilities\") pod 
\"redhat-operators-ftsbz\" (UID: \"fc360947-b1e2-4ac0-8447-c7e886e036a0\") " pod="openshift-marketplace/redhat-operators-ftsbz" Feb 20 11:21:38 crc kubenswrapper[4962]: I0220 11:21:38.790443 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc360947-b1e2-4ac0-8447-c7e886e036a0-catalog-content\") pod \"redhat-operators-ftsbz\" (UID: \"fc360947-b1e2-4ac0-8447-c7e886e036a0\") " pod="openshift-marketplace/redhat-operators-ftsbz" Feb 20 11:21:38 crc kubenswrapper[4962]: I0220 11:21:38.803855 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-j9hxw_82f8db6b-4715-42f3-a705-821af9e03156/extract-content/0.log" Feb 20 11:21:38 crc kubenswrapper[4962]: I0220 11:21:38.805514 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-j9hxw_82f8db6b-4715-42f3-a705-821af9e03156/extract-content/0.log" Feb 20 11:21:38 crc kubenswrapper[4962]: I0220 11:21:38.809655 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2sb4j\" (UniqueName: \"kubernetes.io/projected/fc360947-b1e2-4ac0-8447-c7e886e036a0-kube-api-access-2sb4j\") pod \"redhat-operators-ftsbz\" (UID: \"fc360947-b1e2-4ac0-8447-c7e886e036a0\") " pod="openshift-marketplace/redhat-operators-ftsbz" Feb 20 11:21:38 crc kubenswrapper[4962]: I0220 11:21:38.812203 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-j9hxw_82f8db6b-4715-42f3-a705-821af9e03156/extract-utilities/0.log" Feb 20 11:21:38 crc kubenswrapper[4962]: I0220 11:21:38.914276 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ftsbz" Feb 20 11:21:39 crc kubenswrapper[4962]: I0220 11:21:39.003258 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-j9hxw_82f8db6b-4715-42f3-a705-821af9e03156/extract-utilities/0.log" Feb 20 11:21:39 crc kubenswrapper[4962]: I0220 11:21:39.028151 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-j9hxw_82f8db6b-4715-42f3-a705-821af9e03156/extract-content/0.log" Feb 20 11:21:39 crc kubenswrapper[4962]: I0220 11:21:39.392827 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ftsbz"] Feb 20 11:21:39 crc kubenswrapper[4962]: I0220 11:21:39.677156 4962 generic.go:334] "Generic (PLEG): container finished" podID="fc360947-b1e2-4ac0-8447-c7e886e036a0" containerID="e73d42be6f5a37c535756c7cf72bac403fa58ed816b1473e558c276bf2d70225" exitCode=0 Feb 20 11:21:39 crc kubenswrapper[4962]: I0220 11:21:39.677197 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ftsbz" event={"ID":"fc360947-b1e2-4ac0-8447-c7e886e036a0","Type":"ContainerDied","Data":"e73d42be6f5a37c535756c7cf72bac403fa58ed816b1473e558c276bf2d70225"} Feb 20 11:21:39 crc kubenswrapper[4962]: I0220 11:21:39.677221 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ftsbz" event={"ID":"fc360947-b1e2-4ac0-8447-c7e886e036a0","Type":"ContainerStarted","Data":"b1094c83d369f3971efc0ed7014799eaab5e95ee8bff0716f242a67ae96948ae"} Feb 20 11:21:39 crc kubenswrapper[4962]: I0220 11:21:39.714188 4962 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-j9hxw_82f8db6b-4715-42f3-a705-821af9e03156/registry-server/0.log" Feb 20 11:21:40 crc kubenswrapper[4962]: I0220 11:21:40.767582 4962 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-94hk8"] Feb 20 11:21:40 crc kubenswrapper[4962]: I0220 11:21:40.771808 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-94hk8" Feb 20 11:21:40 crc kubenswrapper[4962]: I0220 11:21:40.783225 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-94hk8"] Feb 20 11:21:40 crc kubenswrapper[4962]: I0220 11:21:40.922135 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82xw6\" (UniqueName: \"kubernetes.io/projected/49deb74e-5930-4583-9544-bbc0c34723d6-kube-api-access-82xw6\") pod \"community-operators-94hk8\" (UID: \"49deb74e-5930-4583-9544-bbc0c34723d6\") " pod="openshift-marketplace/community-operators-94hk8" Feb 20 11:21:40 crc kubenswrapper[4962]: I0220 11:21:40.922176 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49deb74e-5930-4583-9544-bbc0c34723d6-catalog-content\") pod \"community-operators-94hk8\" (UID: \"49deb74e-5930-4583-9544-bbc0c34723d6\") " pod="openshift-marketplace/community-operators-94hk8" Feb 20 11:21:40 crc kubenswrapper[4962]: I0220 11:21:40.922195 4962 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49deb74e-5930-4583-9544-bbc0c34723d6-utilities\") pod \"community-operators-94hk8\" (UID: \"49deb74e-5930-4583-9544-bbc0c34723d6\") " pod="openshift-marketplace/community-operators-94hk8" Feb 20 11:21:41 crc kubenswrapper[4962]: I0220 11:21:41.023711 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82xw6\" (UniqueName: \"kubernetes.io/projected/49deb74e-5930-4583-9544-bbc0c34723d6-kube-api-access-82xw6\") pod \"community-operators-94hk8\" (UID: \"49deb74e-5930-4583-9544-bbc0c34723d6\") " pod="openshift-marketplace/community-operators-94hk8" Feb 20 11:21:41 crc kubenswrapper[4962]: I0220 11:21:41.023752 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49deb74e-5930-4583-9544-bbc0c34723d6-catalog-content\") pod \"community-operators-94hk8\" (UID: \"49deb74e-5930-4583-9544-bbc0c34723d6\") " pod="openshift-marketplace/community-operators-94hk8" Feb 20 11:21:41 crc kubenswrapper[4962]: I0220 11:21:41.023766 4962 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49deb74e-5930-4583-9544-bbc0c34723d6-utilities\") pod \"community-operators-94hk8\" (UID: \"49deb74e-5930-4583-9544-bbc0c34723d6\") " pod="openshift-marketplace/community-operators-94hk8" Feb 20 11:21:41 crc kubenswrapper[4962]: I0220 11:21:41.024229 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49deb74e-5930-4583-9544-bbc0c34723d6-utilities\") pod \"community-operators-94hk8\" (UID: \"49deb74e-5930-4583-9544-bbc0c34723d6\") " pod="openshift-marketplace/community-operators-94hk8" Feb 20 11:21:41 crc kubenswrapper[4962]: I0220 11:21:41.024233 4962 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49deb74e-5930-4583-9544-bbc0c34723d6-catalog-content\") pod \"community-operators-94hk8\" (UID: \"49deb74e-5930-4583-9544-bbc0c34723d6\") " pod="openshift-marketplace/community-operators-94hk8" Feb 20 11:21:41 crc kubenswrapper[4962]: I0220 11:21:41.044997 4962 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82xw6\" (UniqueName: \"kubernetes.io/projected/49deb74e-5930-4583-9544-bbc0c34723d6-kube-api-access-82xw6\") pod \"community-operators-94hk8\" (UID: \"49deb74e-5930-4583-9544-bbc0c34723d6\") " pod="openshift-marketplace/community-operators-94hk8" Feb 20 11:21:41 crc kubenswrapper[4962]: I0220 11:21:41.150774 4962 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-94hk8" Feb 20 11:21:41 crc kubenswrapper[4962]: I0220 11:21:41.608526 4962 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-94hk8"] Feb 20 11:21:41 crc kubenswrapper[4962]: I0220 11:21:41.699162 4962 generic.go:334] "Generic (PLEG): container finished" podID="fc360947-b1e2-4ac0-8447-c7e886e036a0" containerID="221fd23a4e709df8b60c21c80fbd11e2148c3d37647b1d50dbb4bd980c9c4a38" exitCode=0 Feb 20 11:21:41 crc kubenswrapper[4962]: I0220 11:21:41.699242 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ftsbz" event={"ID":"fc360947-b1e2-4ac0-8447-c7e886e036a0","Type":"ContainerDied","Data":"221fd23a4e709df8b60c21c80fbd11e2148c3d37647b1d50dbb4bd980c9c4a38"} Feb 20 11:21:41 crc kubenswrapper[4962]: I0220 11:21:41.700574 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-94hk8" event={"ID":"49deb74e-5930-4583-9544-bbc0c34723d6","Type":"ContainerStarted","Data":"a4efcf7fb75161d9e5487760d1e0134b75e87ef06f4fca980c54ba2517209850"} Feb 20 11:21:42 crc kubenswrapper[4962]: I0220 11:21:42.711617 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ftsbz" event={"ID":"fc360947-b1e2-4ac0-8447-c7e886e036a0","Type":"ContainerStarted","Data":"e5ee0c7bad043042ae742961cb97d63a254a1fcbbe130d4913cf8cc3e92fe43c"} Feb 20 11:21:42 crc kubenswrapper[4962]: I0220 11:21:42.713176 4962 generic.go:334] "Generic (PLEG): container finished" podID="49deb74e-5930-4583-9544-bbc0c34723d6" containerID="7368b543db4cb01c205252d5b229bd817a1d145e9ab79b56de1ba207087c5ee2" exitCode=0 Feb 20 11:21:42 crc kubenswrapper[4962]: I0220 11:21:42.713213 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-94hk8" event={"ID":"49deb74e-5930-4583-9544-bbc0c34723d6","Type":"ContainerDied","Data":"7368b543db4cb01c205252d5b229bd817a1d145e9ab79b56de1ba207087c5ee2"} Feb 20 11:21:42 crc kubenswrapper[4962]: I0220 11:21:42.746361 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ftsbz" podStartSLOduration=2.279025758 podStartE2EDuration="4.746344955s" podCreationTimestamp="2026-02-20 11:21:38 +0000 UTC" firstStartedPulling="2026-02-20 11:21:39.679223504 +0000 UTC m=+5191.261695350" lastFinishedPulling="2026-02-20 11:21:42.146542701 +0000 UTC m=+5193.729014547" observedRunningTime="2026-02-20 11:21:42.743479268 +0000 UTC m=+5194.325951134" watchObservedRunningTime="2026-02-20 11:21:42.746344955 +0000 UTC m=+5194.328816801" Feb 20 11:21:43 crc 
kubenswrapper[4962]: I0220 11:21:43.723695 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-94hk8" event={"ID":"49deb74e-5930-4583-9544-bbc0c34723d6","Type":"ContainerStarted","Data":"f62ef2c2130f32e20759fd62379ebf9ebc6406abf8c06f796cfa51cb7ee06cd7"} Feb 20 11:21:44 crc kubenswrapper[4962]: I0220 11:21:44.736270 4962 generic.go:334] "Generic (PLEG): container finished" podID="49deb74e-5930-4583-9544-bbc0c34723d6" containerID="f62ef2c2130f32e20759fd62379ebf9ebc6406abf8c06f796cfa51cb7ee06cd7" exitCode=0 Feb 20 11:21:44 crc kubenswrapper[4962]: I0220 11:21:44.736395 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-94hk8" event={"ID":"49deb74e-5930-4583-9544-bbc0c34723d6","Type":"ContainerDied","Data":"f62ef2c2130f32e20759fd62379ebf9ebc6406abf8c06f796cfa51cb7ee06cd7"} Feb 20 11:21:45 crc kubenswrapper[4962]: I0220 11:21:45.754262 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-94hk8" event={"ID":"49deb74e-5930-4583-9544-bbc0c34723d6","Type":"ContainerStarted","Data":"4adb0d1b4f7344acd16758ce9ac2ddc289f4e2feb51fade6dd6dffd5cf3661e4"} Feb 20 11:21:45 crc kubenswrapper[4962]: I0220 11:21:45.778086 4962 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-94hk8" podStartSLOduration=3.251112895 podStartE2EDuration="5.778069938s" podCreationTimestamp="2026-02-20 11:21:40 +0000 UTC" firstStartedPulling="2026-02-20 11:21:42.715468405 +0000 UTC m=+5194.297940251" lastFinishedPulling="2026-02-20 11:21:45.242425418 +0000 UTC m=+5196.824897294" observedRunningTime="2026-02-20 11:21:45.776438219 +0000 UTC m=+5197.358910075" watchObservedRunningTime="2026-02-20 11:21:45.778069938 +0000 UTC m=+5197.360541784" Feb 20 11:21:48 crc kubenswrapper[4962]: I0220 11:21:48.138915 4962 scope.go:117] "RemoveContainer" containerID="11d1cd33fef2b28672ddeedcaa2789f4c1e5d261ff35b49d9ad217e63e8e6bae" Feb 20 11:21:48 crc kubenswrapper[4962]: E0220 11:21:48.139229 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 11:21:48 crc kubenswrapper[4962]: I0220 11:21:48.915411 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ftsbz" Feb 20 11:21:48 crc kubenswrapper[4962]: I0220 11:21:48.917136 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ftsbz" Feb 20 11:21:49 crc kubenswrapper[4962]: I0220 11:21:49.980169 4962 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-ftsbz" podUID="fc360947-b1e2-4ac0-8447-c7e886e036a0" containerName="registry-server" probeResult="failure" output=< Feb 20 11:21:49 crc kubenswrapper[4962]: timeout: failed to connect service ":50051" within 1s Feb 20 11:21:49 crc kubenswrapper[4962]: > Feb 20 11:21:51 crc kubenswrapper[4962]: I0220 11:21:51.156213 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-94hk8" Feb 20 11:21:51 crc kubenswrapper[4962]: I0220 11:21:51.156714 4962 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-94hk8" Feb 20 11:21:51 crc kubenswrapper[4962]: I0220 11:21:51.228238 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-94hk8" Feb 20 11:21:51 crc kubenswrapper[4962]: I0220 11:21:51.883529 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-94hk8" Feb 20 11:21:51 crc kubenswrapper[4962]: I0220 11:21:51.944220 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-94hk8"] Feb 20 11:21:53 crc kubenswrapper[4962]: I0220 11:21:53.836777 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-94hk8" podUID="49deb74e-5930-4583-9544-bbc0c34723d6" containerName="registry-server" containerID="cri-o://4adb0d1b4f7344acd16758ce9ac2ddc289f4e2feb51fade6dd6dffd5cf3661e4" gracePeriod=2 Feb 20 11:21:54 crc kubenswrapper[4962]: I0220 11:21:54.801314 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-94hk8" Feb 20 11:21:54 crc kubenswrapper[4962]: I0220 11:21:54.844255 4962 generic.go:334] "Generic (PLEG): container finished" podID="49deb74e-5930-4583-9544-bbc0c34723d6" containerID="4adb0d1b4f7344acd16758ce9ac2ddc289f4e2feb51fade6dd6dffd5cf3661e4" exitCode=0 Feb 20 11:21:54 crc kubenswrapper[4962]: I0220 11:21:54.844302 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-94hk8" event={"ID":"49deb74e-5930-4583-9544-bbc0c34723d6","Type":"ContainerDied","Data":"4adb0d1b4f7344acd16758ce9ac2ddc289f4e2feb51fade6dd6dffd5cf3661e4"} Feb 20 11:21:54 crc kubenswrapper[4962]: I0220 11:21:54.844331 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-94hk8" event={"ID":"49deb74e-5930-4583-9544-bbc0c34723d6","Type":"ContainerDied","Data":"a4efcf7fb75161d9e5487760d1e0134b75e87ef06f4fca980c54ba2517209850"} Feb 20 11:21:54 crc kubenswrapper[4962]: I0220 11:21:54.844368 4962 scope.go:117] "RemoveContainer" containerID="4adb0d1b4f7344acd16758ce9ac2ddc289f4e2feb51fade6dd6dffd5cf3661e4" Feb 20 11:21:54 crc kubenswrapper[4962]: I0220 11:21:54.844497 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-94hk8" Feb 20 11:21:54 crc kubenswrapper[4962]: I0220 11:21:54.860016 4962 scope.go:117] "RemoveContainer" containerID="f62ef2c2130f32e20759fd62379ebf9ebc6406abf8c06f796cfa51cb7ee06cd7" Feb 20 11:21:54 crc kubenswrapper[4962]: I0220 11:21:54.877030 4962 scope.go:117] "RemoveContainer" containerID="7368b543db4cb01c205252d5b229bd817a1d145e9ab79b56de1ba207087c5ee2" Feb 20 11:21:54 crc kubenswrapper[4962]: I0220 11:21:54.899565 4962 scope.go:117] "RemoveContainer" containerID="4adb0d1b4f7344acd16758ce9ac2ddc289f4e2feb51fade6dd6dffd5cf3661e4" Feb 20 11:21:54 crc kubenswrapper[4962]: E0220 11:21:54.900036 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4adb0d1b4f7344acd16758ce9ac2ddc289f4e2feb51fade6dd6dffd5cf3661e4\": container with ID starting with 4adb0d1b4f7344acd16758ce9ac2ddc289f4e2feb51fade6dd6dffd5cf3661e4 not found: ID does not exist" containerID="4adb0d1b4f7344acd16758ce9ac2ddc289f4e2feb51fade6dd6dffd5cf3661e4" Feb 20 11:21:54 crc kubenswrapper[4962]: I0220 11:21:54.900099 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4adb0d1b4f7344acd16758ce9ac2ddc289f4e2feb51fade6dd6dffd5cf3661e4"} err="failed to get container status \"4adb0d1b4f7344acd16758ce9ac2ddc289f4e2feb51fade6dd6dffd5cf3661e4\": rpc error: code = NotFound desc = could not find container \"4adb0d1b4f7344acd16758ce9ac2ddc289f4e2feb51fade6dd6dffd5cf3661e4\": container with ID starting with 4adb0d1b4f7344acd16758ce9ac2ddc289f4e2feb51fade6dd6dffd5cf3661e4 not found: ID does not exist" Feb 20 11:21:54 crc kubenswrapper[4962]: I0220 11:21:54.900139 4962 scope.go:117] "RemoveContainer" containerID="f62ef2c2130f32e20759fd62379ebf9ebc6406abf8c06f796cfa51cb7ee06cd7" Feb 20 11:21:54 crc kubenswrapper[4962]: E0220 11:21:54.900418 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f62ef2c2130f32e20759fd62379ebf9ebc6406abf8c06f796cfa51cb7ee06cd7\": container with ID starting with f62ef2c2130f32e20759fd62379ebf9ebc6406abf8c06f796cfa51cb7ee06cd7 not found: ID does not exist" containerID="f62ef2c2130f32e20759fd62379ebf9ebc6406abf8c06f796cfa51cb7ee06cd7" Feb 20 11:21:54 crc kubenswrapper[4962]: I0220 11:21:54.900445 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f62ef2c2130f32e20759fd62379ebf9ebc6406abf8c06f796cfa51cb7ee06cd7"} err="failed to get container status \"f62ef2c2130f32e20759fd62379ebf9ebc6406abf8c06f796cfa51cb7ee06cd7\": rpc error: code = NotFound desc = could not find container \"f62ef2c2130f32e20759fd62379ebf9ebc6406abf8c06f796cfa51cb7ee06cd7\": container with ID starting with f62ef2c2130f32e20759fd62379ebf9ebc6406abf8c06f796cfa51cb7ee06cd7 not found: ID does not exist" Feb 20 11:21:54 crc kubenswrapper[4962]: I0220 11:21:54.900465 4962 scope.go:117] "RemoveContainer" containerID="7368b543db4cb01c205252d5b229bd817a1d145e9ab79b56de1ba207087c5ee2" Feb 20 11:21:54 crc kubenswrapper[4962]: E0220 11:21:54.902248 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7368b543db4cb01c205252d5b229bd817a1d145e9ab79b56de1ba207087c5ee2\": container with ID starting with 7368b543db4cb01c205252d5b229bd817a1d145e9ab79b56de1ba207087c5ee2 not found: ID does not exist" containerID="7368b543db4cb01c205252d5b229bd817a1d145e9ab79b56de1ba207087c5ee2" 
Feb 20 11:21:54 crc kubenswrapper[4962]: I0220 11:21:54.902287 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7368b543db4cb01c205252d5b229bd817a1d145e9ab79b56de1ba207087c5ee2"} err="failed to get container status \"7368b543db4cb01c205252d5b229bd817a1d145e9ab79b56de1ba207087c5ee2\": rpc error: code = NotFound desc = could not find container \"7368b543db4cb01c205252d5b229bd817a1d145e9ab79b56de1ba207087c5ee2\": container with ID starting with 7368b543db4cb01c205252d5b229bd817a1d145e9ab79b56de1ba207087c5ee2 not found: ID does not exist" Feb 20 11:21:54 crc kubenswrapper[4962]: I0220 11:21:54.983453 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82xw6\" (UniqueName: \"kubernetes.io/projected/49deb74e-5930-4583-9544-bbc0c34723d6-kube-api-access-82xw6\") pod \"49deb74e-5930-4583-9544-bbc0c34723d6\" (UID: \"49deb74e-5930-4583-9544-bbc0c34723d6\") " Feb 20 11:21:54 crc kubenswrapper[4962]: I0220 11:21:54.983521 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49deb74e-5930-4583-9544-bbc0c34723d6-catalog-content\") pod \"49deb74e-5930-4583-9544-bbc0c34723d6\" (UID: \"49deb74e-5930-4583-9544-bbc0c34723d6\") " Feb 20 11:21:54 crc kubenswrapper[4962]: I0220 11:21:54.983634 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49deb74e-5930-4583-9544-bbc0c34723d6-utilities\") pod \"49deb74e-5930-4583-9544-bbc0c34723d6\" (UID: \"49deb74e-5930-4583-9544-bbc0c34723d6\") " Feb 20 11:21:54 crc kubenswrapper[4962]: I0220 11:21:54.985015 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49deb74e-5930-4583-9544-bbc0c34723d6-utilities" (OuterVolumeSpecName: "utilities") pod "49deb74e-5930-4583-9544-bbc0c34723d6" (UID: "49deb74e-5930-4583-9544-bbc0c34723d6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 11:21:54 crc kubenswrapper[4962]: I0220 11:21:54.997773 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49deb74e-5930-4583-9544-bbc0c34723d6-kube-api-access-82xw6" (OuterVolumeSpecName: "kube-api-access-82xw6") pod "49deb74e-5930-4583-9544-bbc0c34723d6" (UID: "49deb74e-5930-4583-9544-bbc0c34723d6"). InnerVolumeSpecName "kube-api-access-82xw6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 11:21:55 crc kubenswrapper[4962]: I0220 11:21:55.045105 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49deb74e-5930-4583-9544-bbc0c34723d6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "49deb74e-5930-4583-9544-bbc0c34723d6" (UID: "49deb74e-5930-4583-9544-bbc0c34723d6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 11:21:55 crc kubenswrapper[4962]: I0220 11:21:55.086079 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49deb74e-5930-4583-9544-bbc0c34723d6-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 11:21:55 crc kubenswrapper[4962]: I0220 11:21:55.086129 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49deb74e-5930-4583-9544-bbc0c34723d6-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 11:21:55 crc kubenswrapper[4962]: I0220 11:21:55.086151 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-82xw6\" (UniqueName: \"kubernetes.io/projected/49deb74e-5930-4583-9544-bbc0c34723d6-kube-api-access-82xw6\") on node \"crc\" DevicePath \"\"" Feb 20 11:21:55 crc kubenswrapper[4962]: I0220 11:21:55.196186 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-94hk8"] Feb 20 11:21:55 crc kubenswrapper[4962]: I0220 11:21:55.203304 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-94hk8"] Feb 20 11:21:57 crc kubenswrapper[4962]: I0220 11:21:57.152282 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49deb74e-5930-4583-9544-bbc0c34723d6" path="/var/lib/kubelet/pods/49deb74e-5930-4583-9544-bbc0c34723d6/volumes" Feb 20 11:21:58 crc kubenswrapper[4962]: I0220 11:21:58.992897 4962 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ftsbz" Feb 20 11:21:59 crc kubenswrapper[4962]: I0220 11:21:59.067946 4962 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ftsbz" Feb 20 11:21:59 crc kubenswrapper[4962]: I0220 11:21:59.250566 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ftsbz"] Feb 20 11:22:00 crc kubenswrapper[4962]: I0220 11:22:00.893816 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ftsbz" podUID="fc360947-b1e2-4ac0-8447-c7e886e036a0" containerName="registry-server" containerID="cri-o://e5ee0c7bad043042ae742961cb97d63a254a1fcbbe130d4913cf8cc3e92fe43c" gracePeriod=2 Feb 20 11:22:01 crc kubenswrapper[4962]: I0220 11:22:01.139410 4962 scope.go:117] "RemoveContainer" containerID="11d1cd33fef2b28672ddeedcaa2789f4c1e5d261ff35b49d9ad217e63e8e6bae" Feb 20 11:22:01 crc kubenswrapper[4962]: E0220 11:22:01.139788 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 11:22:01 crc kubenswrapper[4962]: I0220 11:22:01.421143 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ftsbz" Feb 20 11:22:01 crc kubenswrapper[4962]: I0220 11:22:01.594125 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2sb4j\" (UniqueName: \"kubernetes.io/projected/fc360947-b1e2-4ac0-8447-c7e886e036a0-kube-api-access-2sb4j\") pod \"fc360947-b1e2-4ac0-8447-c7e886e036a0\" (UID: \"fc360947-b1e2-4ac0-8447-c7e886e036a0\") " Feb 20 11:22:01 crc kubenswrapper[4962]: I0220 11:22:01.594175 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc360947-b1e2-4ac0-8447-c7e886e036a0-catalog-content\") pod \"fc360947-b1e2-4ac0-8447-c7e886e036a0\" (UID: \"fc360947-b1e2-4ac0-8447-c7e886e036a0\") " Feb 20 11:22:01 crc kubenswrapper[4962]: I0220 11:22:01.594229 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc360947-b1e2-4ac0-8447-c7e886e036a0-utilities\") pod \"fc360947-b1e2-4ac0-8447-c7e886e036a0\" (UID: \"fc360947-b1e2-4ac0-8447-c7e886e036a0\") " Feb 20 11:22:01 crc kubenswrapper[4962]: I0220 11:22:01.595139 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc360947-b1e2-4ac0-8447-c7e886e036a0-utilities" (OuterVolumeSpecName: "utilities") pod "fc360947-b1e2-4ac0-8447-c7e886e036a0" (UID: "fc360947-b1e2-4ac0-8447-c7e886e036a0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 11:22:01 crc kubenswrapper[4962]: I0220 11:22:01.607788 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc360947-b1e2-4ac0-8447-c7e886e036a0-kube-api-access-2sb4j" (OuterVolumeSpecName: "kube-api-access-2sb4j") pod "fc360947-b1e2-4ac0-8447-c7e886e036a0" (UID: "fc360947-b1e2-4ac0-8447-c7e886e036a0"). InnerVolumeSpecName "kube-api-access-2sb4j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 11:22:01 crc kubenswrapper[4962]: I0220 11:22:01.695735 4962 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc360947-b1e2-4ac0-8447-c7e886e036a0-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 11:22:01 crc kubenswrapper[4962]: I0220 11:22:01.695766 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2sb4j\" (UniqueName: \"kubernetes.io/projected/fc360947-b1e2-4ac0-8447-c7e886e036a0-kube-api-access-2sb4j\") on node \"crc\" DevicePath \"\"" Feb 20 11:22:01 crc kubenswrapper[4962]: I0220 11:22:01.714582 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc360947-b1e2-4ac0-8447-c7e886e036a0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fc360947-b1e2-4ac0-8447-c7e886e036a0" (UID: "fc360947-b1e2-4ac0-8447-c7e886e036a0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 11:22:01 crc kubenswrapper[4962]: I0220 11:22:01.796913 4962 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc360947-b1e2-4ac0-8447-c7e886e036a0-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 11:22:01 crc kubenswrapper[4962]: I0220 11:22:01.903936 4962 generic.go:334] "Generic (PLEG): container finished" podID="fc360947-b1e2-4ac0-8447-c7e886e036a0" containerID="e5ee0c7bad043042ae742961cb97d63a254a1fcbbe130d4913cf8cc3e92fe43c" exitCode=0 Feb 20 11:22:01 crc kubenswrapper[4962]: I0220 11:22:01.903977 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ftsbz" event={"ID":"fc360947-b1e2-4ac0-8447-c7e886e036a0","Type":"ContainerDied","Data":"e5ee0c7bad043042ae742961cb97d63a254a1fcbbe130d4913cf8cc3e92fe43c"} Feb 20 11:22:01 crc kubenswrapper[4962]: I0220 11:22:01.904010 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ftsbz" event={"ID":"fc360947-b1e2-4ac0-8447-c7e886e036a0","Type":"ContainerDied","Data":"b1094c83d369f3971efc0ed7014799eaab5e95ee8bff0716f242a67ae96948ae"} Feb 20 11:22:01 crc kubenswrapper[4962]: I0220 11:22:01.904010 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ftsbz" Feb 20 11:22:01 crc kubenswrapper[4962]: I0220 11:22:01.904027 4962 scope.go:117] "RemoveContainer" containerID="e5ee0c7bad043042ae742961cb97d63a254a1fcbbe130d4913cf8cc3e92fe43c" Feb 20 11:22:01 crc kubenswrapper[4962]: I0220 11:22:01.927272 4962 scope.go:117] "RemoveContainer" containerID="221fd23a4e709df8b60c21c80fbd11e2148c3d37647b1d50dbb4bd980c9c4a38" Feb 20 11:22:01 crc kubenswrapper[4962]: I0220 11:22:01.939628 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ftsbz"] Feb 20 11:22:01 crc kubenswrapper[4962]: I0220 11:22:01.948928 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ftsbz"] Feb 20 11:22:01 crc kubenswrapper[4962]: I0220 11:22:01.963879 4962 scope.go:117] "RemoveContainer" containerID="e73d42be6f5a37c535756c7cf72bac403fa58ed816b1473e558c276bf2d70225" Feb 20 11:22:01 crc kubenswrapper[4962]: I0220 11:22:01.993208 4962 scope.go:117] "RemoveContainer" containerID="e5ee0c7bad043042ae742961cb97d63a254a1fcbbe130d4913cf8cc3e92fe43c" Feb 20 11:22:01 crc kubenswrapper[4962]: E0220 11:22:01.993643 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5ee0c7bad043042ae742961cb97d63a254a1fcbbe130d4913cf8cc3e92fe43c\": container with ID starting with e5ee0c7bad043042ae742961cb97d63a254a1fcbbe130d4913cf8cc3e92fe43c not found: ID does not exist" containerID="e5ee0c7bad043042ae742961cb97d63a254a1fcbbe130d4913cf8cc3e92fe43c" Feb 20 11:22:01 crc kubenswrapper[4962]: I0220 11:22:01.993711 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5ee0c7bad043042ae742961cb97d63a254a1fcbbe130d4913cf8cc3e92fe43c"} err="failed to get container status \"e5ee0c7bad043042ae742961cb97d63a254a1fcbbe130d4913cf8cc3e92fe43c\": rpc error: code = NotFound desc = could not find container \"e5ee0c7bad043042ae742961cb97d63a254a1fcbbe130d4913cf8cc3e92fe43c\": container with ID starting with e5ee0c7bad043042ae742961cb97d63a254a1fcbbe130d4913cf8cc3e92fe43c not found: ID does not exist" Feb 20 11:22:01 crc 
kubenswrapper[4962]: I0220 11:22:01.993735 4962 scope.go:117] "RemoveContainer" containerID="221fd23a4e709df8b60c21c80fbd11e2148c3d37647b1d50dbb4bd980c9c4a38" Feb 20 11:22:01 crc kubenswrapper[4962]: E0220 11:22:01.994476 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"221fd23a4e709df8b60c21c80fbd11e2148c3d37647b1d50dbb4bd980c9c4a38\": container with ID starting with 221fd23a4e709df8b60c21c80fbd11e2148c3d37647b1d50dbb4bd980c9c4a38 not found: ID does not exist" containerID="221fd23a4e709df8b60c21c80fbd11e2148c3d37647b1d50dbb4bd980c9c4a38" Feb 20 11:22:01 crc kubenswrapper[4962]: I0220 11:22:01.994507 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"221fd23a4e709df8b60c21c80fbd11e2148c3d37647b1d50dbb4bd980c9c4a38"} err="failed to get container status \"221fd23a4e709df8b60c21c80fbd11e2148c3d37647b1d50dbb4bd980c9c4a38\": rpc error: code = NotFound desc = could not find container \"221fd23a4e709df8b60c21c80fbd11e2148c3d37647b1d50dbb4bd980c9c4a38\": container with ID starting with 221fd23a4e709df8b60c21c80fbd11e2148c3d37647b1d50dbb4bd980c9c4a38 not found: ID does not exist" Feb 20 11:22:01 crc kubenswrapper[4962]: I0220 11:22:01.994534 4962 scope.go:117] "RemoveContainer" containerID="e73d42be6f5a37c535756c7cf72bac403fa58ed816b1473e558c276bf2d70225" Feb 20 11:22:01 crc kubenswrapper[4962]: E0220 11:22:01.994862 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e73d42be6f5a37c535756c7cf72bac403fa58ed816b1473e558c276bf2d70225\": container with ID starting with e73d42be6f5a37c535756c7cf72bac403fa58ed816b1473e558c276bf2d70225 not found: ID does not exist" containerID="e73d42be6f5a37c535756c7cf72bac403fa58ed816b1473e558c276bf2d70225" Feb 20 11:22:01 crc kubenswrapper[4962]: I0220 11:22:01.994907 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e73d42be6f5a37c535756c7cf72bac403fa58ed816b1473e558c276bf2d70225"} err="failed to get container status \"e73d42be6f5a37c535756c7cf72bac403fa58ed816b1473e558c276bf2d70225\": rpc error: code = NotFound desc = could not find container \"e73d42be6f5a37c535756c7cf72bac403fa58ed816b1473e558c276bf2d70225\": container with ID starting with e73d42be6f5a37c535756c7cf72bac403fa58ed816b1473e558c276bf2d70225 not found: ID does not exist" Feb 20 11:22:03 crc kubenswrapper[4962]: I0220 11:22:03.154277 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc360947-b1e2-4ac0-8447-c7e886e036a0" path="/var/lib/kubelet/pods/fc360947-b1e2-4ac0-8447-c7e886e036a0/volumes" Feb 20 11:22:16 crc kubenswrapper[4962]: I0220 11:22:16.139876 4962 scope.go:117] "RemoveContainer" containerID="11d1cd33fef2b28672ddeedcaa2789f4c1e5d261ff35b49d9ad217e63e8e6bae" Feb 20 11:22:16 crc kubenswrapper[4962]: E0220 11:22:16.140942 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 11:22:17 crc kubenswrapper[4962]: I0220 11:22:17.709336 4962 scope.go:117] "RemoveContainer" containerID="b4d681ba38ab243d90aebb0cd3f5e0d964a3b05eff1c6a9189b417f8bc499f51" 
Feb 20 11:22:17 crc kubenswrapper[4962]: I0220 11:22:17.766705 4962 scope.go:117] "RemoveContainer" containerID="cc4df7336e0c93c42160fd50ab2c566dcfda96d76ab5ecee6e26256c4e0e35c7" Feb 20 11:22:17 crc kubenswrapper[4962]: I0220 11:22:17.809134 4962 scope.go:117] "RemoveContainer" containerID="1c894dfd10e3ea0973c4a9f38552b1c9dae05591935995fa639e6204d2604dcb" Feb 20 11:22:29 crc kubenswrapper[4962]: I0220 11:22:29.150147 4962 scope.go:117] "RemoveContainer" containerID="11d1cd33fef2b28672ddeedcaa2789f4c1e5d261ff35b49d9ad217e63e8e6bae" Feb 20 11:22:29 crc kubenswrapper[4962]: E0220 11:22:29.150937 4962 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-m9d46_openshift-machine-config-operator(751d5e0b-919c-4777-8475-ed7214f7647f)\"" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" Feb 20 11:22:42 crc kubenswrapper[4962]: I0220 11:22:42.139848 4962 scope.go:117] "RemoveContainer" containerID="11d1cd33fef2b28672ddeedcaa2789f4c1e5d261ff35b49d9ad217e63e8e6bae" Feb 20 11:22:43 crc kubenswrapper[4962]: I0220 11:22:43.318062 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" event={"ID":"751d5e0b-919c-4777-8475-ed7214f7647f","Type":"ContainerStarted","Data":"d003c7c35366337d0b715af71a425c9e815e66375dbb460fb9c9b8c7941f2e26"} Feb 20 11:22:44 crc kubenswrapper[4962]: I0220 11:22:44.074013 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-2mzpw"] Feb 20 11:22:44 crc kubenswrapper[4962]: I0220 11:22:44.081990 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-2mzpw"] Feb 20 11:22:45 crc kubenswrapper[4962]: I0220 11:22:45.148924 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ae4e019-31b7-4826-a5ef-042faba6034d" path="/var/lib/kubelet/pods/7ae4e019-31b7-4826-a5ef-042faba6034d/volumes" Feb 20 11:22:56 crc kubenswrapper[4962]: I0220 11:22:56.441151 4962 generic.go:334] "Generic (PLEG): container finished" podID="fd1e5da5-b553-419b-b874-baa4d0b09f1d" containerID="68f8a74369dcc7ed292d418118d6a2654645c30c61a6d6f89b9a33287e7f6592" exitCode=0 Feb 20 11:22:56 crc kubenswrapper[4962]: I0220 11:22:56.441275 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4hjrx/must-gather-vfvhw" event={"ID":"fd1e5da5-b553-419b-b874-baa4d0b09f1d","Type":"ContainerDied","Data":"68f8a74369dcc7ed292d418118d6a2654645c30c61a6d6f89b9a33287e7f6592"} Feb 20 11:22:56 crc kubenswrapper[4962]: I0220 11:22:56.442539 4962 scope.go:117] "RemoveContainer" containerID="68f8a74369dcc7ed292d418118d6a2654645c30c61a6d6f89b9a33287e7f6592" Feb 20 11:22:56 crc kubenswrapper[4962]: I0220 11:22:56.742334 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-4hjrx_must-gather-vfvhw_fd1e5da5-b553-419b-b874-baa4d0b09f1d/gather/0.log" Feb 20 11:23:03 crc kubenswrapper[4962]: I0220 11:23:03.906545 4962 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-4hjrx/must-gather-vfvhw"] Feb 20 11:23:03 crc kubenswrapper[4962]: I0220 11:23:03.907641 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-4hjrx/must-gather-vfvhw" podUID="fd1e5da5-b553-419b-b874-baa4d0b09f1d" containerName="copy" 
containerID="cri-o://51533ca5277128daf61df4bcb8daa9914ee61cf20e1bd6102ffbad1536d290be" gracePeriod=2 Feb 20 11:23:03 crc kubenswrapper[4962]: I0220 11:23:03.915664 4962 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-4hjrx/must-gather-vfvhw"] Feb 20 11:23:04 crc kubenswrapper[4962]: I0220 11:23:04.363992 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-4hjrx_must-gather-vfvhw_fd1e5da5-b553-419b-b874-baa4d0b09f1d/copy/0.log" Feb 20 11:23:04 crc kubenswrapper[4962]: I0220 11:23:04.364656 4962 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4hjrx/must-gather-vfvhw" Feb 20 11:23:04 crc kubenswrapper[4962]: I0220 11:23:04.461063 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xc68\" (UniqueName: \"kubernetes.io/projected/fd1e5da5-b553-419b-b874-baa4d0b09f1d-kube-api-access-5xc68\") pod \"fd1e5da5-b553-419b-b874-baa4d0b09f1d\" (UID: \"fd1e5da5-b553-419b-b874-baa4d0b09f1d\") " Feb 20 11:23:04 crc kubenswrapper[4962]: I0220 11:23:04.461530 4962 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/fd1e5da5-b553-419b-b874-baa4d0b09f1d-must-gather-output\") pod \"fd1e5da5-b553-419b-b874-baa4d0b09f1d\" (UID: \"fd1e5da5-b553-419b-b874-baa4d0b09f1d\") " Feb 20 11:23:04 crc kubenswrapper[4962]: I0220 11:23:04.468020 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd1e5da5-b553-419b-b874-baa4d0b09f1d-kube-api-access-5xc68" (OuterVolumeSpecName: "kube-api-access-5xc68") pod "fd1e5da5-b553-419b-b874-baa4d0b09f1d" (UID: "fd1e5da5-b553-419b-b874-baa4d0b09f1d"). InnerVolumeSpecName "kube-api-access-5xc68". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 11:23:04 crc kubenswrapper[4962]: I0220 11:23:04.527632 4962 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-4hjrx_must-gather-vfvhw_fd1e5da5-b553-419b-b874-baa4d0b09f1d/copy/0.log" Feb 20 11:23:04 crc kubenswrapper[4962]: I0220 11:23:04.528111 4962 generic.go:334] "Generic (PLEG): container finished" podID="fd1e5da5-b553-419b-b874-baa4d0b09f1d" containerID="51533ca5277128daf61df4bcb8daa9914ee61cf20e1bd6102ffbad1536d290be" exitCode=143 Feb 20 11:23:04 crc kubenswrapper[4962]: I0220 11:23:04.528177 4962 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4hjrx/must-gather-vfvhw" Feb 20 11:23:04 crc kubenswrapper[4962]: I0220 11:23:04.528188 4962 scope.go:117] "RemoveContainer" containerID="51533ca5277128daf61df4bcb8daa9914ee61cf20e1bd6102ffbad1536d290be" Feb 20 11:23:04 crc kubenswrapper[4962]: I0220 11:23:04.562866 4962 scope.go:117] "RemoveContainer" containerID="68f8a74369dcc7ed292d418118d6a2654645c30c61a6d6f89b9a33287e7f6592" Feb 20 11:23:04 crc kubenswrapper[4962]: I0220 11:23:04.563471 4962 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xc68\" (UniqueName: \"kubernetes.io/projected/fd1e5da5-b553-419b-b874-baa4d0b09f1d-kube-api-access-5xc68\") on node \"crc\" DevicePath \"\"" Feb 20 11:23:04 crc kubenswrapper[4962]: I0220 11:23:04.595980 4962 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd1e5da5-b553-419b-b874-baa4d0b09f1d-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "fd1e5da5-b553-419b-b874-baa4d0b09f1d" (UID: "fd1e5da5-b553-419b-b874-baa4d0b09f1d"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 11:23:04 crc kubenswrapper[4962]: I0220 11:23:04.637887 4962 scope.go:117] "RemoveContainer" containerID="51533ca5277128daf61df4bcb8daa9914ee61cf20e1bd6102ffbad1536d290be" Feb 20 11:23:04 crc kubenswrapper[4962]: E0220 11:23:04.638649 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51533ca5277128daf61df4bcb8daa9914ee61cf20e1bd6102ffbad1536d290be\": container with ID starting with 51533ca5277128daf61df4bcb8daa9914ee61cf20e1bd6102ffbad1536d290be not found: ID does not exist" containerID="51533ca5277128daf61df4bcb8daa9914ee61cf20e1bd6102ffbad1536d290be" Feb 20 11:23:04 crc kubenswrapper[4962]: I0220 11:23:04.638703 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51533ca5277128daf61df4bcb8daa9914ee61cf20e1bd6102ffbad1536d290be"} err="failed to get container status \"51533ca5277128daf61df4bcb8daa9914ee61cf20e1bd6102ffbad1536d290be\": rpc error: code = NotFound desc = could not find container \"51533ca5277128daf61df4bcb8daa9914ee61cf20e1bd6102ffbad1536d290be\": container with ID starting with 51533ca5277128daf61df4bcb8daa9914ee61cf20e1bd6102ffbad1536d290be not found: ID does not exist" Feb 20 11:23:04 crc kubenswrapper[4962]: I0220 11:23:04.638734 4962 scope.go:117] "RemoveContainer" containerID="68f8a74369dcc7ed292d418118d6a2654645c30c61a6d6f89b9a33287e7f6592" Feb 20 11:23:04 crc kubenswrapper[4962]: E0220 11:23:04.639428 4962 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68f8a74369dcc7ed292d418118d6a2654645c30c61a6d6f89b9a33287e7f6592\": container with ID starting with 68f8a74369dcc7ed292d418118d6a2654645c30c61a6d6f89b9a33287e7f6592 not found: ID does not exist" containerID="68f8a74369dcc7ed292d418118d6a2654645c30c61a6d6f89b9a33287e7f6592" Feb 20 11:23:04 crc kubenswrapper[4962]: I0220 11:23:04.639460 4962 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68f8a74369dcc7ed292d418118d6a2654645c30c61a6d6f89b9a33287e7f6592"} err="failed to get container status \"68f8a74369dcc7ed292d418118d6a2654645c30c61a6d6f89b9a33287e7f6592\": rpc error: code = NotFound desc = could not find container \"68f8a74369dcc7ed292d418118d6a2654645c30c61a6d6f89b9a33287e7f6592\": container with ID starting with 
68f8a74369dcc7ed292d418118d6a2654645c30c61a6d6f89b9a33287e7f6592 not found: ID does not exist" Feb 20 11:23:04 crc kubenswrapper[4962]: I0220 11:23:04.664662 4962 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/fd1e5da5-b553-419b-b874-baa4d0b09f1d-must-gather-output\") on node \"crc\" DevicePath \"\"" Feb 20 11:23:05 crc kubenswrapper[4962]: I0220 11:23:05.147689 4962 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd1e5da5-b553-419b-b874-baa4d0b09f1d" path="/var/lib/kubelet/pods/fd1e5da5-b553-419b-b874-baa4d0b09f1d/volumes" Feb 20 11:23:17 crc kubenswrapper[4962]: I0220 11:23:17.908128 4962 scope.go:117] "RemoveContainer" containerID="8b9ab6691837647b0967e622ecfd0e62f3ce7907b1cef344ccbbdd1bcb192e5e" Feb 20 11:25:11 crc kubenswrapper[4962]: I0220 11:25:11.508031 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 11:25:11 crc kubenswrapper[4962]: I0220 11:25:11.508538 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 11:25:41 crc kubenswrapper[4962]: I0220 11:25:41.508578 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 11:25:41 crc kubenswrapper[4962]: I0220 11:25:41.509448 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 11:26:11 crc kubenswrapper[4962]: I0220 11:26:11.508900 4962 patch_prober.go:28] interesting pod/machine-config-daemon-m9d46 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 11:26:11 crc kubenswrapper[4962]: I0220 11:26:11.509689 4962 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 11:26:11 crc kubenswrapper[4962]: I0220 11:26:11.509760 4962 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" Feb 20 11:26:11 crc kubenswrapper[4962]: I0220 11:26:11.510832 4962 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d003c7c35366337d0b715af71a425c9e815e66375dbb460fb9c9b8c7941f2e26"} 
pod="openshift-machine-config-operator/machine-config-daemon-m9d46" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 11:26:11 crc kubenswrapper[4962]: I0220 11:26:11.510931 4962 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" podUID="751d5e0b-919c-4777-8475-ed7214f7647f" containerName="machine-config-daemon" containerID="cri-o://d003c7c35366337d0b715af71a425c9e815e66375dbb460fb9c9b8c7941f2e26" gracePeriod=600 Feb 20 11:26:12 crc kubenswrapper[4962]: I0220 11:26:12.259844 4962 generic.go:334] "Generic (PLEG): container finished" podID="751d5e0b-919c-4777-8475-ed7214f7647f" containerID="d003c7c35366337d0b715af71a425c9e815e66375dbb460fb9c9b8c7941f2e26" exitCode=0 Feb 20 11:26:12 crc kubenswrapper[4962]: I0220 11:26:12.259894 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" event={"ID":"751d5e0b-919c-4777-8475-ed7214f7647f","Type":"ContainerDied","Data":"d003c7c35366337d0b715af71a425c9e815e66375dbb460fb9c9b8c7941f2e26"} Feb 20 11:26:12 crc kubenswrapper[4962]: I0220 11:26:12.260481 4962 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-m9d46" event={"ID":"751d5e0b-919c-4777-8475-ed7214f7647f","Type":"ContainerStarted","Data":"0b314312dcb2f11e67f59432c45683ba13a50c0ae9ba6a5dd639de24db881b08"} Feb 20 11:26:12 crc kubenswrapper[4962]: I0220 11:26:12.260517 4962 scope.go:117] "RemoveContainer" containerID="11d1cd33fef2b28672ddeedcaa2789f4c1e5d261ff35b49d9ad217e63e8e6bae" var/home/core/zuul-output/logs/crc-cloud-workdir-crc-all-logs.tar.gz0000644000175000000000000000005515146042370024447 0ustar coreroot  Om77'(var/home/core/zuul-output/logs/crc-cloud/0000755000175000000000000000000015146042370017364 5ustar corerootvar/home/core/zuul-output/artifacts/0000755000175000017500000000000015146027162016511 5ustar corecorevar/home/core/zuul-output/docs/0000755000175000017500000000000015146027162015461 5ustar corecore